CN103946732A - Video display modification based on sensor input for a see-through near-to-eye display - Google Patents


Info

Publication number
CN103946732A
Authority
CN
China
Prior art keywords
light
image
display
sensor
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201280046955.XA
Other languages
Chinese (zh)
Other versions
CN103946732B (en)
Inventor
J. D. Haddick
R. F. Osterhout
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Publication of CN103946732A publication Critical patent/CN103946732A/en
Application granted granted Critical
Publication of CN103946732B publication Critical patent/CN103946732B/en
Legal status: Active


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0112Head-up displays characterised by optical features comprising device for generating colour display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility

Abstract

This disclosure concerns a near-field communication (NFC) device that includes a wrist-worn NFC-enabled electronics device, wherein the wrist-worn NFC-enabled electronics device includes a first communications link for communicating with a second NFC-enabled electronics device via NFC protocols, and a second communications link for communicating with an eyepiece via a medium-range communications protocol and receiving control commands. The wrist-worn NFC-enabled electronics device facilitates the transfer of data between the eyepiece and the second NFC-enabled electronics device. The eyepiece comprises optics enabling a see-through display on which the data is displayed.

Description

Video display modification based on sensor input for a see-through, near-to-eye display
Cross-reference to related applications
This application claims priority to the following U.S. provisional patent application, which is incorporated herein by reference in its entirety:
U.S. Provisional Application 61/539,269, filed September 26, 2011.
This application is a continuation-in-part of the following U.S. non-provisional patent applications, each of which is incorporated herein by reference in its entirety:
U.S. non-provisional application 13/591,187, filed August 21, 2012, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Applications 61/679,522; 61/679,558; 61/679,542; 61/679,578; 61/679,601; 61/679,541; 61/679,548; 61/679,550; 61/679,557; and 61/679,566, each filed August 3, 2012; U.S. Provisional Patent Application 61/644,078, filed May 8, 2012; U.S. Provisional Patent Application 61/670,457, filed July 11, 2012; and U.S. Provisional Patent Application 61/674,689, filed July 23, 2012.
U.S. non-provisional application 13/441,145, filed April 6, 2012, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/598,885, filed February 14, 2012; U.S. Provisional Patent Application 61/598,889, filed February 14, 2012; U.S. Provisional Patent Application 61/598,896, filed February 12, 2012; and U.S. Provisional Patent Application 61/604,917, filed February 29, 2012.
U.S. non-provisional application 13/429,413, filed March 25, 2012, which claims the benefit of U.S. Provisional Patent Application 61/584,029, filed January 6, 2012, which is incorporated herein by reference in its entirety.
U.S. non-provisional application 13/341,758, filed December 30, 2011, which claims the benefit of U.S. Provisional Patent Application 61/557,289, filed November 8, 2011, which is incorporated herein by reference in its entirety.
U.S. non-provisional application 13/232,930, filed September 14, 2011, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Application 61/382,578, filed September 14, 2010; U.S. Provisional Application 61/472,491, filed April 6, 2011; U.S. Provisional Application 61/483,400, filed May 6, 2011; U.S. Provisional Application 61/487,371, filed May 18, 2011; and U.S. Provisional Application 61/504,513, filed July 5, 2011.
U.S. non-provisional patent applications 13/037,324 and 13/037,335, both filed February 28, 2011, each of which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/308,973, filed February 28, 2010; U.S. Provisional Patent Application 61/373,791, filed August 13, 2010; U.S. Provisional Patent Application 61/382,578, filed September 14, 2010; U.S. Provisional Patent Application 61/410,983, filed November 8, 2010; U.S. Provisional Patent Application 61/429,445, filed January 3, 2011; and U.S. Provisional Patent Application 61/429,447, filed January 3, 2011.
Background
Field:
The present invention relates to an augmented reality eyepiece, associated control technologies, and applications, and in particular to software applications operating on the eyepiece.
The present invention also relates to using switchable mirrors, operated in a sequential pattern, to provide a thin display of an image from a waveguide.
Head-mounted displays with reflecting surfaces are known in the industry. A head-mounted display with a single angled partially reflecting beam-splitter plate is described in U.S. Patent 4,969,714. Although this approach provides excellent brightness and color uniformity over the displayed field of view, the optical system is relatively thick because of the angled beam-splitter plate.
Head-mounted displays with arrays of partially reflecting surfaces, which provide thinner optical systems, are described in U.S. Patents 6,829,095 and 7,724,441. Such a display is shown in Figure 124, where an array of partially reflecting surfaces 12408 is used to provide image light 12404 over the display field of view, allowing the user to view the displayed image together with a view of the environment in front of the user. The image light 12404 seen by the user is the combined light reflected from each of the multiple partially reflecting surfaces 12408. Light from the image source 12402 must pass through the multiple partially reflecting surfaces 12408, with a portion of the light reflected toward the user's eye to provide the image light 12404. To provide a uniform image over the display field of view, the reflectivity of the partially reflecting surfaces 12408 must be precisely controlled: the surface nearest the image source must have the lowest reflectivity, the surface farthest from the image source must have the highest reflectivity, and, in general, the reflectivity must increase approximately linearly with distance from the image source. This poses manufacturing and cost problems, because each partially reflecting surface 12408 has a reflectivity different from its neighbors, and each surface's reflectivity must be tightly controlled. Consequently, with an array of partially reflecting surfaces it is difficult to provide an image of uniform brightness and color over the entire display field of view.
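As a rough illustration of why the reflectivity tolerances are so demanding (this calculation is ours, not the patent's, and it assumes idealized lossless surfaces and a single pass of light): if each of N surfaces is to redirect an equal share of the source light toward the eye, the required reflectivities can be computed from a simple running budget of the transmitted light.

```python
def reflectivities(n):
    """Idealized reflectivity for each of n partially reflecting surfaces so
    that every surface redirects an equal share (1/n) of the source light
    toward the eye.  Assumes lossless surfaces and a single pass."""
    r, transmitted = [], 1.0
    for _ in range(n):
        rk = (1.0 / n) / transmitted   # reflect exactly 1/n of the original light
        r.append(rk)
        transmitted *= 1.0 - rk        # remainder continues to the next surface
    return r

# For five surfaces: [0.2, 0.25, 0.333..., 0.5, 1.0] -- monotonically
# increasing with distance from the image source, each value different from
# its neighbors, which is the manufacturing problem the text describes.
print(reflectivities(5))
```

Each surface in this sketch needs a distinct, tightly held reflectivity; a small error in any one surface shows up directly as a brightness band in the displayed image.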
Alternatively, as described in U.S. Patent 4,711,512, diffraction gratings are used to redirect image light into and out of a waveguide toward the display field of view. However, diffraction gratings are costly and introduce chromatic aberration.
Thus, there is a continuing need for head-mounted displays with relatively thin optical systems that also provide good brightness and color uniformity over the display field of view.
The present invention also relates to a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflecting surface to deflect illuminating light downward onto a reflective image source.
In a display with a reflective image source and a frontlight, as shown in Figure 133, illuminating light 13308 passes from an edge light source 13300 and is deflected by the frontlight 13304 to illuminate the reflective image source 13302. The illuminating light 13308 is then reflected by the reflective image source 13302, becoming image light 13310, which passes back through the frontlight 13304 and into the display optics. The frontlight 13304 thus deflects the illuminating light 13308 arriving from the edge light source 13300 while allowing the reflected image light 13310 to pass through undeflected, so that the image light 13310 can be delivered to the display optics, which may be diffusive when the display is a flat-screen display, or refractive or diffractive when the display is a near-to-eye display. In this embodiment, the display optics can include a diffuser.
For a reflective image source such as a liquid-crystal-on-silicon (LCOS) image source, the illuminating light is polarized, and the reflective image source includes a quarter-wave retarder film that changes the polarization state of the light as it is reflected from the image source. A polarizer is then included in the display optics so that the polarization changes imparted by the liquid crystal form an image as the image light passes through the display optics.
U.S. Patent 7,163,330 describes a series of frontlights that include grooves in the upper surface of the frontlight, so that light from the edge light source is deflected downward onto the reflective image source while the flat regions between the grooves allow the reflected image light to pass into the display optics. Figure 134 shows a diagram of a frontlight 13400 with grooves 13410 and flat regions 13408. Illuminating light 13402 from the edge light source 13300 reflects from the grooves 13410 and is deflected downward to illuminate the reflective image source 13302. Image light 13404 is reflected from the reflective image source 13302 and passes through the flat regions 13408 of the frontlight 13400. Both linear and curved grooves 13410 are described. However, for the grooves 13410 to deflect the illuminating light 13402 effectively, they must occupy a large fraction of the frontlight's area, which limits the area of the flat regions 13408; moreover, light scattered from the grooves degrades the image quality delivered to the display optics as the image light passes back through the frontlight. The frontlight 13400 is typically formed from a solid block of material and can therefore be relatively heavy.
U.S. Patent 7,545,571 provides a wearable display system that includes a reflective image source 13502 with a polarizing beam splitter 13512 used as a frontlight to deflect polarized illuminating light 13504, supplied by an edge light source 13500, onto the reflective image source 13502, as shown in Figure 135. The polarizing beam splitter 13512 is an angled plane within a solid block, with a separate curved reflector 13514 associated with the edge light source 13500. The curved reflector 13514 can be a total-internal-reflection block 13510 connected to the polarizing beam splitter 13512. Thus, the frontlight disclosed in this patent, with its solid polarizing-beam-splitter block and total-internal-reflection block, is bulky and relatively heavy. Figure 135 also shows the image light 13508.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while also being compact and lightweight.
The present invention also relates to optically flat surfaces made with optical films. More specifically, the invention provides methods for using optical films to manufacture optically flat beam splitters.
Optical films are available for a variety of purposes, including beam splitters, polarizing beam splitters, holographic reflectors, and mirrors. In imaging applications, and especially in reflective imaging applications, it is important that the optical film be very flat so as to preserve the wavefront of the image. Some optical films have a pressure-sensitive adhesive on one side, which allows the film to be attached to a substrate for structural support and helps keep the film flat. However, optical films attached to substrates in this way often have surfaces with small-scale undulations and pits known as orange peel, which prevents the surface from being optically flat and therefore degrades the reflected image.
U.S. Patent Application 2009/0052030 provides a method for manufacturing an optical film in which the film is a wire-grid polarizer. However, no technique is provided for making the film optically flat.
U.S. Patents 4,537,739 and 4,643,789 provide methods for attaching an emblem to a molded structure by using a tape to transport the emblem into the mold. However, these methods do not anticipate the special requirements of optical films.
U.S. Patent Application 2009/0261490 provides a method for manufacturing a simple optical article that includes an optical film and a molded part. The method is intended for producing curved surfaces, as it includes a limit on the ratio of the radius of curvature to the diameter to avoid wrinkles in the film caused by distortion of the film during molding. The special requirements for manufacturing optically flat surfaces with optical films are not addressed.
U.S. Patent 7,820,081 provides a method for laminating functional film layers onto a lens. The method bonds the functional film to the lens with a heat-curable adhesive. However, this process thermoforms the optical film while the lens is hot, so that the film, the adhesive, and the lens deform together during bonding. The method is therefore unsuitable for manufacturing optically flat surfaces.
Therefore, there remains a need for methods of using optical films to provide surfaces that include an optical film and have optical flatness.
Summary of the invention
In various embodiments, the eyepiece may include internal software applications running on an integrated multimedia computing facility, adapted for displaying 3D augmented reality (AR) content and for interaction with the eyepiece. 3D AR software applications may be provided in conjunction with mobile application development and distributed through application stores, or as standalone applications developed specifically for the eyepiece as the end-use platform and distributed through a dedicated 3D AR eyepiece store. Internal software applications may interface through the eyepiece with input and output facilities provided by the eyepiece's internal and external facilities, such as inputs initiated from the surrounding environment, sensor devices, user-action capture devices, internal processing facilities, internal multimedia processing facilities, other internal applications, cameras, sensors, microphones, through a transceiver, through a tactile interface, from external computing facilities, external applications, events and/or data feeds, external devices, third parties, and the like. Command and control modes operating in conjunction with the eyepiece may be initiated by sensed inputs from input devices, user actions, external-device interaction, the reception of events and/or data feeds, the execution of internal applications, the execution of external applications, and the like. In various embodiments, a series of steps may be provided for execution through the internal software applications, including a combination of at least two of the following: events and/or data feeds; sensed inputs and/or sensing devices; user-action capture inputs and/or outputs; user movements and/or actions for controlling and/or initiating commands; command and/or control modes and interfaces in which the inputs may be reflected; applications on the platform that can respond to the commands with inputs; communications and/or connections from the platform interface to external systems and/or devices; external devices; external applications; feedback to the user (such as regarding external devices or external applications); and the like.
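The chain described above — a sensed input or data-feed event is routed to an application that responds with a command — can be sketched as a minimal event dispatcher. This is an illustrative sketch only; the class, event names, and command tuples are our assumptions, not part of the patent.

```python
class CommandRouter:
    """Hypothetical router mapping sensed-input event types to handlers
    that issue eyepiece commands (illustrative, not from the patent)."""

    def __init__(self):
        self.handlers = {}  # event type -> list of handler callables

    def register(self, event_type, handler):
        self.handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event_type, payload):
        """Run every handler registered for the event; collect the commands."""
        return [h(payload) for h in self.handlers.get(event_type, [])]


router = CommandRouter()
# A voice input launches an application; a gesture selects a UI element.
router.register("voice", lambda text: ("open_app", text))
router.register("gesture", lambda name: ("select", name))

commands = router.dispatch("voice", "navigation")
# commands == [("open_app", "navigation")]
```

An event type with no registered handler simply yields no commands, which matches the idea that only applications prepared to respond to a given input act on it.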
The present invention also provides methods for a relatively thin optical system that delivers an image with improved brightness and color uniformity over the display field of view. The invention includes an integral array of narrow switchable mirrors over the display area to provide the display field of view, where the switchable mirrors are used in sequence to reflect portions of the light from the image source, thereby presenting sequential portions of the image to the user. By rapidly switching the narrow switchable mirrors from transparent to reflective in a repeating sequence, the user perceives each portion of the image as contributing to the entire image provided by the image source. Provided that each narrow switchable mirror is switched at 60 Hz or higher, the user does not perceive flicker in the image portions.
Various embodiments of the narrow switchable mirror array are provided. In one embodiment, the switchable mirrors are liquid-crystal switchable mirrors. In another embodiment, the switchable mirrors are movable prism elements that provide switchable total-internal-reflection mirrors with an air gap.
In an alternative embodiment, not all of the switchable mirrors are used sequentially; instead, a selected subset of the switchable mirrors is used, chosen based on the user's eye spacing.
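The sequencing described above can be sketched numerically. This is a minimal illustration under our own assumptions (one mirror reflective per time slot, equal dwell times); the function names are ours, not the patent's. It shows the aggregate switching rate implied by the 60 Hz per-mirror flicker threshold and the repeating one-at-a-time schedule.

```python
from itertools import cycle, islice

def required_switch_rate(n_mirrors, per_mirror_hz=60):
    """Aggregate switch events per second so that each of n_mirrors narrow
    switchable mirrors becomes reflective at >= per_mirror_hz, the
    flicker-free repetition rate cited in the text."""
    return n_mirrors * per_mirror_hz

def mirror_schedule(n_mirrors, n_slots):
    """First n_slots of the repeating sequence in which exactly one mirror
    is reflective per time slot while the others remain transparent."""
    return list(islice(cycle(range(n_mirrors)), n_slots))

# Eight mirror strips, each refreshed at 60 Hz, imply 480 switch events per
# second overall; the eye integrates the sequential strips into one image.
print(required_switch_rate(8))      # 480
print(mirror_schedule(3, 7))        # [0, 1, 2, 0, 1, 2, 0]
```

The alternative embodiment above would simply restrict `range(n_mirrors)` to the subset of mirrors selected for the user's eye spacing.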
The present invention also provides a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflecting surface to deflect illuminating light downward onto a reflective image source. The edge light source is polarized, and the wire-grid polarizer is oriented so that the illuminating light is reflected while the image light is transmitted through to the display optics. By using a flexible wire-grid polarizer film, the invention provides a partially reflecting surface that can be curved to focus the illuminating light onto the reflective image source, thereby improving efficiency and improving the uniformity of image brightness. The wire-grid polarizer also has very low light scattering as the image light passes through the frontlight on its way to the display optics, so image quality is preserved. In addition, because the partially reflecting surface is a wire-grid polarizer film, the majority of the frontlight consists of air, so the frontlight is light in weight.
The present invention also provides methods for manufacturing surfaces that are optically flat when optical films are used. In various embodiments of the invention, the optical film can comprise a beam splitter, a polarizing beam splitter, a wire-grid polarizer, a mirror, a partial mirror, or a holographic film. An advantage provided by the invention is that the surface of the optical film is optically flat, so that the wavefront of the light is preserved and improved image quality is provided.
In certain embodiments, the invention provides an image display system that includes an optically flat optical film. The optically flat optical film includes a substrate that holds the film optically flat within a display module housing having an image source and a viewing position, wherein the image provided by the image source is reflected from the optical film to the viewing position, and the substrate with the optical film is replaceable within the display module housing.
In other embodiments of the invention, the optical film is attached to a molded structure, so that the optical film is part of the display module housing.
In a prior-art display 18700 with a reflective image source 18720 and a solid beam-splitter cube frontlight 18718, as shown in Figure 187, light 18712 passes from a light source 18702 to a diffuser 18704, where it is made more uniform to provide illuminating light 18714. The illuminating light 18714 is redirected by a partially reflecting layer 18708 to illuminate the reflective image source 18720. The illuminating light 18714 is then reflected by the reflective image source 18720, becoming image light 18710, which passes back through the partially reflecting layer 18708 and into associated imaging optics (not shown) that present the image to the viewer. The solid beam-splitter cube 18718 thus redirects the illuminating light 18714 while allowing the reflected image light 18710 to pass through without being redirected, so that the image light can be delivered to the imaging optics, which may be diffusive when the display is a flat-screen display, or refractive or diffractive when the display is a projector or a near-to-eye display.
For a reflective image source such as a liquid-crystal-on-silicon (LCOS) image source, the illuminating light is polarized, and as the illuminating light is reflected from the image source, the reflective image source changes its polarization state according to the image content presented by the image source, thereby forming image light. An analyzer polarizer is then included so that the polarization changes imparted by the LCOS form an image as the image light passes through the imaging optics, and the image is presented to the viewer.
U.S. Patent 7,545,571 provides a wearable display system that includes a reflective image source with a polarizing beam splitter used as a frontlight to deflect polarized illuminating light, supplied by an edge light source, onto the reflective image source. The polarizing beam splitter is an angled plane within a solid block, with a separate curved reflector associated with the edge light source. The curved reflector can be a total-internal-reflection block connected to the polarizing beam splitter. Thus, the frontlight disclosed in this patent, with its solid polarizing-beam-splitter block and total-internal-reflection block, is bulky and relatively heavy.
U.S. Patent 6,195,136 discloses a series of frontlight illumination methods for reflective image sources, including a method of using a curved beam splitter to make the frontlight more compact. However, the curved beam splitter is positioned rather far from the image source to reduce the angle of the light that is reflected from the beam splitter onto the image source. Moreover, light is provided on only one side of the frontlight, so the beam splitter must be at least as large as the image source. As a result, measured along the optical axis, the overall size of the frontlight remains relatively large compared with the illuminated area of the image source.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while also being compact, efficient, and lightweight.
The present invention provides a compact, efficient, lightweight frontlight in a display assembly that includes a partially reflecting surface to redirect illuminating light from an edge light source onto a reflective image source, where the size of the display assembly, measured by the height of the diffuser region, is much smaller than the width of the illuminated reflective image source. In certain embodiments, the partially reflecting surface can be curved to concentrate or focus the light from the light source onto the reflective image source. The light source can be polarized, and a polarizing beam-splitter film can be used as the curved partially reflecting surface, so that the illuminating light is redirected while the reflected image light is transmitted through to the imaging optics. The polarizing beam-splitter film is lightweight and has very low light scattering as the image light passes through the frontlight on its way to the display optics, so image quality is preserved.
In other embodiments of the invention, light sources are arranged on opposite sides of the frontlight, so that light is provided to two opposite edges of the reflective image source. In this case, the partially reflecting surface consists of two surfaces, one of which deflects the illuminating light from one light source onto one half of the image source, while the other deflects light onto the other half of the image source. In this embodiment, the partially reflecting surfaces can be curved or flat.
In another embodiment of the present invention, the partially reflecting surface is a polarizing beam splitter and the light source is polarized, so that the light from the light source is first redirected by the polarizing beam splitter, then reflected by the reflective image source with a change of polarization, and then transmitted.
In another embodiment, the light from the light source is unpolarized, so the polarizing beam splitter redirects one polarization state to illuminate one half of the reflective image source while transmitting the other polarization state. The transmitted polarization state passes to the opposite side of the frontlight, where the light is recycled. Recycling of the transmitted polarization state can be accomplished by passing the light through a quarter-wave film and reflecting it from a mirror, so that it passes back through the quarter-wave film and thereby changes polarization state. After the transmitted and reflected light has changed polarization state, it is redirected by the polarizing beam splitter to illuminate the other half of the reflective image source. In an alternative embodiment, the lights on the two sides of the frontlight work in a complementary fashion, wherein the polarization state transmitted from one side becomes unpolarized upon interacting with the diffuser on the opposite side and is thereby recycled.
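The quarter-wave recycling step above can be checked with elementary Jones calculus. This is our own illustrative sketch, not the patent's analysis: it models a quarter-wave film with its fast axis at 45 degrees, treats the mirror as an identity operation (ignoring the coordinate flip and any overall phase), and shows that a double pass converts one linear polarization into the orthogonal one, so the beam splitter then redirects the returning light.

```python
def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    """Apply a 2x2 Jones matrix to a Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Jones matrix of a quarter-wave film with its fast axis at 45 degrees.
QWP45 = [[(1 + 1j) / 2, (1 - 1j) / 2],
         [(1 - 1j) / 2, (1 + 1j) / 2]]

# Double pass (film -> mirror -> film), mirror modeled as identity.
DOUBLE_PASS = matmul(QWP45, QWP45)

transmitted_state = [1 + 0j, 0 + 0j]   # polarization the beam splitter passed
returned_state = apply(DOUBLE_PASS, transmitted_state)
# returned_state is [0, 1]: the light comes back in the orthogonal
# polarization, so the polarizing beam splitter now redirects it onto the
# other half of the reflective image source.
```

The double-pass matrix works out to the swap matrix [[0, 1], [1, 0]], which is exactly the polarization flip the recycling scheme relies on.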
In yet another embodiment of the present invention, a method is provided for manufacturing a frontlight with a flexible partially reflecting film. The flexible film can be supported at its edges, suspended independently over the reflective image source, or sandwiched between two or more transparent solid members. The solid members can be shaped before being placed in contact with the flexible film. The solid members can hold the flexible film in a flat or curved geometry. In another embodiment, the flexible film can be supported at its edges and a solid member can then be cast in place, so that the flexible film is embedded in a transparent solid material.
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling the content for display to the user, and an integrated image source for introducing the content to the optical assembly; the processor is adapted to modify the content, wherein the modification is made in response to a sensor input. The content may be a video image. The modification may be at least one of: adjusting the brightness, adjusting the color saturation, adjusting the color balance, adjusting the hue, adjusting the video resolution, adjusting the transparency, adjusting the compression rate, adjusting the frames per second, isolating a portion of the video, stopping the video, pausing the video, or restarting the video. The sensor input may be derived from one of the following: a charge-coupled device, a black silicon sensor, an IR sensor, an acoustic sensor, an induction sensor, a motion sensor, an optical sensor, an opacity sensor, a proximity sensor, an inductance sensor, an eddy-current sensor, a passive infrared proximity sensor, a radar, a capacitance sensor, a capacitive displacement sensor, a Hall-effect sensor, a magnetic sensor, a GPS sensor, a thermal imaging sensor, a thermocouple, a thermistor, a photoelectric sensor, an ultrasonic sensor, an infrared laser sensor, an inertial motion sensor, a MEMS internal motion sensor, an ultrasonic 3D motion sensor, an accelerometer, an inclinometer, a force sensor, a piezoelectric sensor, a rotary encoder, a linear encoder, a chemical sensor, an ozone sensor, a smoke sensor, a heat sensor, a magnetometer, a carbon dioxide detector, a carbon monoxide detector, an oxygen sensor, a glucose sensor, a smoke detector, a metal detector, a rain sensor, an altimeter, a GPS, detection of being outdoors, detection of context, detection of activity, an object detector (e.g., for a billboard), a marker detector (e.g., for a geo-location marker for an advertisement), a laser rangefinder, a sonar, capacitance, optical response, a heart rate sensor, or an RF/micropower impulse radio (MIR) sensor. Playing of the content may be stopped in response to an accelerometer input indicating motion of the user's head. An audio sensor input may be generated by the speaking of at least one participant of a video conference. A visual sensor input may be a video image of at least one participant of the video conference or of a visual presentation. The modification may be making the video image at least one of more or less transparent in response to an indication of the user's motion from a sensor.
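As an illustrative sketch only (the patent does not specify an algorithm; the class, function, and threshold names below are invented for illustration), a processor might map an accelerometer's head-motion input to the content modifications listed above, pausing the video on strong motion and making it more transparent on moderate motion:

```python
# Hypothetical sketch (not the patent's implementation): choosing a content
# modification from a sensor input, per the embodiment described above.
from dataclasses import dataclass

@dataclass
class HeadMotionSample:
    magnitude_g: float  # accelerometer magnitude minus gravity, in g

def choose_modification(sample: HeadMotionSample,
                        pause_threshold_g: float = 0.5,
                        fade_threshold_g: float = 0.2) -> dict:
    """Return a display adjustment in response to head motion.

    Strong motion pauses the video; moderate motion makes it more
    transparent so the surrounding environment stays visible.
    """
    if sample.magnitude_g >= pause_threshold_g:
        return {"action": "pause"}
    if sample.magnitude_g >= fade_threshold_g:
        # Scale opacity down linearly between the two thresholds.
        span = pause_threshold_g - fade_threshold_g
        opacity = 1.0 - (sample.magnitude_g - fade_threshold_g) / span
        return {"action": "set_opacity", "opacity": round(opacity, 2)}
    return {"action": "play", "opacity": 1.0}

print(choose_modification(HeadMotionSample(0.1)))   # full playback
print(choose_modification(HeadMotionSample(0.35)))  # partially transparent
print(choose_modification(HeadMotionSample(0.8)))   # paused
```

The same dispatch structure could be extended to the other listed modifications (brightness, frame rate, isolation of a video region) keyed off other sensor inputs.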
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling the content for display to the user, and an integrated image source for introducing the content to the optical assembly, the processor being adapted to modify the content, wherein the modification is made in response to a sensor input; and the system also comprises an integrated video image capture facility that records an aspect of the surrounding environment and provides the content for display.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the embodiments and the drawings.
All documents mentioned above are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context.
Brief Description of the Drawings
The invention and the following detailed description of certain embodiments thereof may be understood with reference to the following figures:
Figure 1 depicts an illustrative embodiment of the optical arrangement.
Figure 2 depicts an RGB LED projector.
Figure 3 depicts the projector in use.
Figure 4 depicts an embodiment of the waveguide and correction lens disposed in a frame.
Figure 5 depicts a design of a waveguide eyepiece.
Figure 6 depicts an embodiment of the eyepiece with see-through lenses.
Figure 7 depicts an embodiment of the eyepiece with see-through lenses.
Figures 8A-C depict embodiments of the eyepiece arranged in a flip-up/flip-down configuration.
Figures 8D-E depict embodiments of snap-fit elements of the secondary optics.
Figure 8F depicts an embodiment of flip-up/flip-down electro-optics modules.
Figure 9 depicts an electrochromic layer of the eyepiece.
Figure 10 depicts the advantages of the eyepiece in real-time image enhancement, keystone correction, and virtual perspective correction.
Figure 11 depicts a chart of responsivity versus wavelength for three substrates.
Figure 12 illustrates the performance of a black silicon sensor.
Figure 13A depicts an existing night vision system, Figure 13B depicts a night vision system of the present invention, and Figure 13C illustrates the difference in responsivity between the two.
Figure 14 depicts the haptic interface of the eyepiece.
Figure 14A depicts a nodding motion in an embodiment in which the eyepiece is controlled by head movement.
Figure 15 depicts a finger ring for controlling the eyepiece.
Figure 15AA depicts a finger ring with an integrated camera for controlling the eyepiece, which in one embodiment may allow the user to provide their own video image as part of a video conference.
Figure 15A depicts a hand-held sensor in an embodiment of a virtual mouse.
Figure 15B depicts a facial actuation sensor mounted on the eyepiece.
Figure 15C depicts finger-point control of the eyepiece.
Figure 15D depicts finger-point control of the eyepiece.
Figure 15E depicts an example of eye-tracking control.
Figure 15F depicts hand-positioning control of the eyepiece.
Figure 16 depicts a location-based application mode of the eyepiece.
Figure 17 shows A) a flexible platform for uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) the difference in image quality compared to an image-intensified night vision system.
Figure 18 depicts an augmented-reality-enabled custom billboard.
Figure 19 depicts an augmented-reality-enabled custom advertisement.
Figure 20 depicts an augmented-reality-enabled custom artwork.
Figure 20A depicts a method for posting a message for delivery when a viewer arrives at a certain location.
Figure 21 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 22 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 22A depicts an example of eye glow with the eyepiece.
Figure 22B depicts a cross section of the eyepiece with light control elements for reducing eye glow.
Figure 23 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 24 depicts a locked position of a virtual keyboard.
Figure 24A depicts an embodiment of a virtual projected image on a part of the human body.
Figure 25 depicts a detailed view of the projector.
Figure 26 depicts a detailed view of the RGB LED module.
Figure 27 depicts a gaming network.
Figure 28 depicts a method for gaming with the augmented reality glasses.
Figure 29 depicts an exemplary electronic circuit diagram for the augmented reality eyepiece.
Figure 29A depicts control circuitry for eye-tracking control of an external device.
Figure 29B depicts a communication network among users of augmented reality eyepieces.
Figure 30 depicts partial image removal performed by the eyepiece.
Figure 31 depicts a flowchart of a method for identifying a person based on that person's speech as captured by the microphone of an augmented reality device.
Figure 32 depicts a typical camera for use in a video call or conference.
Figure 33 shows an embodiment of a block diagram of a video-call camera.
Figure 34 depicts embodiments of the eyepiece with optical or digital stabilization.
Figure 35 depicts an embodiment of a classic Cassegrain configuration.
Figure 36 depicts the configuration of a micro-Cassegrain telescoping folded-optic camera.
Figure 37 depicts the swipe process of a virtual keyboard.
Figure 38 depicts the target marker process of a virtual keyboard.
Figure 38A depicts an embodiment of a visual word translator.
Figure 39 illustrates glasses for biometric data capture according to an embodiment.
Figure 40 illustrates iris recognition using the biometric data capture glasses according to an embodiment.
Figure 41 depicts face and iris recognition according to an embodiment.
Figure 42 illustrates the use of dual omni-directional microphones according to an embodiment.
Figure 43 depicts directionality improvement with multiple microphones.
Figure 44 shows the use of adaptive arrays to steer the audio capture facility according to an embodiment.
Figure 45 shows a mosaic finger and palm enrollment system according to an example embodiment.
Figure 46 illustrates the traditional optical approach used by other fingerprint and palm print systems.
Figure 47 shows the approach used by the mosaic sensor according to an example embodiment.
Figure 48 depicts the device layout of a mosaic sensor according to an example embodiment.
Figure 49 illustrates the camera fields of view and the multiple cameras used in a mosaic sensor according to another embodiment.
Figure 50 shows a bio-phone and tactical computer according to an embodiment.
Figure 51 shows the bio-phone and tactical computer in use capturing latent fingerprints and palm prints according to an embodiment.
Figure 52 illustrates a typical DOMEX collection.
Figure 53 shows the relationship between the biometric images captured using the bio-phone and tactical computer and a biometric watch list, according to an embodiment.
Figure 54 shows a pocket bio-kit according to an embodiment.
Figure 55 shows the components of the pocket bio-kit according to an embodiment.
Figure 56 depicts a fingerprint, palm print, geo-location, and point-of-interest (POI) enrollment device according to an embodiment.
Figure 57 shows a system for multi-modal biometric collection, identification, geo-location, and POI enrollment according to an embodiment.
Figure 58 illustrates a forearm-mounted device for fingerprint, palm print, geo-location, and POI enrollment according to an embodiment.
Figure 59 shows a mobile folding biometric enrollment kit according to an embodiment.
Figure 60 is a high-level system diagram of a biometric enrollment kit according to an embodiment.
Figure 61 is a system diagram of a folding biometric enrollment device according to an embodiment.
Figure 62 shows thin-film fingerprint and palm print sensors according to an example embodiment.
Figure 63 shows a biometric collection device for finger and palm data collection and enrollment according to an example embodiment.
Figure 64 illustrates two-stage palm print capture according to an embodiment.
Figure 65 illustrates fingertip tap capture according to an embodiment.
Figure 66 illustrates slap and roll print capture according to an embodiment.
Figure 67 depicts a system for collecting contactless fingerprints, palm prints, or other biometric prints.
Figure 68 depicts a process for collecting contactless fingerprints, palm prints, or other biometric prints.
Figure 69 depicts an embodiment of a watch controller.
Figures 70A-D depict embodiments of the eyepiece, including charging capability and an integrated display.
Figure 71 depicts an embodiment of a ground-stake data system.
Figure 72 depicts a block diagram of a control mapping system including the eyepiece.
Figure 73 depicts a biometric flashlight.
Figure 74 depicts a helmet-worn version of the eyepiece.
Figure 75 depicts an embodiment of situational awareness glasses.
Figure 76A depicts an assembled 360° imager, and Figure 76B depicts a cutaway view of the 360° imager.
Figure 77 depicts an exploded view of a multi-view camera.
Figure 78 depicts a flight eye.
Figure 79 depicts an exploded top view of the eyepiece.
Figure 80 depicts an exploded electro-optic assembly.
Figure 81 depicts an exploded view of the electro-optic assembly along its axis.
Figure 82 depicts an embodiment of an optical display system utilizing a planar illuminator with a reflective display.
Figure 83 depicts a structural embodiment of planar illumination optics.
Figure 84 depicts an assembled embodiment of a planar illuminator and reflective display with a laser speckle suppression component.
Figure 85 depicts an embodiment of a planar illuminator with groove features for redirecting light.
Figure 86 depicts an embodiment of a planar illuminator with paired groove and 'anti-groove' features to reduce image aberrations.
Figure 87 depicts an embodiment of a planar illuminator made from a laminated structure.
Figure 88 depicts an embodiment of a planar illuminator with a wedge optic for redirecting light.
Figure 89 depicts a block diagram of an illumination module according to an embodiment of the invention.
Figure 90 depicts a block diagram of an optical frequency converter according to an embodiment of the invention.
Figure 91 depicts a block diagram of a laser illumination module according to an embodiment of the invention.
Figure 92 depicts a block diagram of a laser illumination system according to another embodiment of the invention.
Figure 93 depicts a block diagram of an imaging system according to an embodiment of the invention.
Figures 94A and B depict top and side views, respectively, of a lens with a photochromic element and a heating element.
Figure 95 depicts an embodiment of an LCoS frontlight design.
Figure 96 depicts an optically bonded prism with a polarizer.
Figure 97 depicts an optically bonded prism with a polarizer.
Figure 98 depicts multiple embodiments of LCoS frontlight designs.
Figure 99 depicts a wedge plus OBS superimposed on an LCoS.
Figure 100 depicts two versions of the wedge.
Figure 101 depicts a curved PBS film over an LCoS chip.
Figure 102A depicts an embodiment of the optical assembly.
Figure 102B depicts an embodiment of the optical assembly with an embedded camera.
Figure 103 depicts an embodiment of an image source.
Figure 104 depicts an embodiment of an image source.
Figure 105 depicts embodiments of an image source.
Figure 106 depicts a top-level block diagram illustrating software applications, tools, and a marketplace in conjunction with the functional and control aspects of the eyepiece in an embodiment of the invention.
Figure 107 depicts a functional block diagram of an eyepiece application development environment in an embodiment of the invention.
Figure 108 depicts a platform element development stack for software applications for the eyepiece in an embodiment of the invention.
Figure 109 is an illustration of a head-mounted display with see-through capability according to an embodiment of the invention.
Figure 110 is an illustration of a view of an unmarked scene as viewed through the head-mounted display depicted in Figure 109.
Figure 111 is an illustration of the view of the scene of Figure 110 with 2D markers overlaid.
Figure 112 is an illustration of the 3D markers of Figure 111 as displayed to a viewer's left eye.
Figure 113 is an illustration of the 3D markers of Figure 111 as displayed to a viewer's right eye.
Figure 114 is an illustration of the left and right 3D markers of Figure 111 superimposed to show the disparity between them.
Figure 115 is an illustration of the view of the scene of Figure 110 with 3D markers.
Figure 116 is an illustration of a captured stereo image of the scene of Figure 110.
Figure 117 is an illustration of the left and right stereo images of Figure 116 superimposed to show the disparity between the images.
Figure 118 is an illustration of the scene of Figure 110 with superimposed 3D markers.
Figure 119 is a flowchart of a depth-cue embodiment of a method of the invention for providing 3D markers.
Figure 120 is a flowchart of another depth-cue embodiment of a method of the invention for providing 3D markers.
Figure 121 is a flowchart of another depth-cue embodiment of a method of the invention for providing 3D markers.
Figure 122 is a flowchart of yet another depth-cue embodiment of a method of the invention for providing 3D markers.
Figure 123A depicts a processor providing display-sequential frames for image display by a display component.
Figure 123B depicts a display interface configured to eliminate the display driver.
Figure 124 is a schematic of a prior art waveguide with multiple partial reflectors;
Figure 125 is a schematic of a waveguide with multiple electrically switchable mirrors in a first position;
Figure 125A is an illustration of a waveguide assembly with electrical connections.
Figure 126 is a schematic of a waveguide with multiple electrically switchable mirrors in a second position;
Figure 127 is a schematic of a waveguide with multiple electrically switchable mirrors in a third position;
Figure 128 is a schematic of a waveguide with multiple mechanically switchable mirrors in a first position;
Figure 128A is a schematic of a waveguide assembly with micro-actuators and associated hardware;
Figure 129 is a schematic of a waveguide with multiple mechanically switchable mirrors in a second position;
Figure 130 is a schematic of a waveguide with multiple mechanically switchable mirrors in a third position;
Figures 131A and 131B are illustrations of a waveguide display with switchable mirrors on a user's face; and
Figures 132A-132C are illustrations of the display areas provided for users with different eye spacings.
Figure 133 is a schematic of a reflective image source with edge lights and a frontlight, showing the light rays passing through;
Figure 134 is a schematic of a prior art frontlight that includes grooves;
Figure 135 is a schematic of a prior art frontlight that is a solid block including a flat polarizing beam splitter and a curved reflector;
Figure 136 is a schematic of an embodiment of the invention with a single edge light and a curved wire-grid polarizer film;
Figure 137 is a schematic of an embodiment of the invention with two edge lights and a curved wire-grid polarizer film;
Figure 138 is a schematic of side frames that hold the flexible wire-grid polarizer film in the desired curved shape;
Figure 139 is a flowchart of a method of the invention.
Figure 140 is a schematic of a near-eye imaging system with a beam splitter;
Figure 141 is a schematic of an optical module for a near-eye imaging system;
Figure 142 is an illustration of a thin-film patterned optical sheet;
Figure 143 is an illustration of an insert-molded assembly housing with a built-in optical sheet;
Figure 144 is an illustration of compression molding of a laminated patterned optical sheet;
Figures 145A-C are illustrations of applying an optical film in a molded assembly housing.
Figure 146 depicts a schematic front perspective view of an AR eyepiece according to an embodiment of the disclosure (without its temples).
Figure 147 depicts a schematic rear perspective view of the AR eyepiece of Figure 146.
Figure 148 depicts a schematic rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 149 depicts a schematic rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 150 depicts a schematic perspective view of an assembly of the AR eyepiece of Figure 146 for supporting one of its projection screens.
Figure 151 depicts a schematic perspective view of the adjustment platform of the AR eyepiece of Figure 146.
Figure 152 depicts a schematic perspective view of the lateral adjustment mechanism of the AR eyepiece of Figure 146.
Figure 153 depicts a schematic perspective view of the tilt adjustment mechanism of the AR eyepiece of Figure 146.
Figure 154 is a chart showing the dark adaptation curve of the human eye.
Figure 155 is a chart showing the effect of gradually reduced illumination on the dark adaptation curve of the human eye.
Figure 156 is an illustration of a head-mounted display with see-through capability.
Figure 157 is a chart showing the relationship between display brightness and time when entering a dark environment.
Figure 158 is a flowchart of a dark adaptation method.
Figure 159 depicts a virtual keyboard presented in the user's field of view.
Figure 160 depicts an example of a display system with an optically flat reflective surface.
Figure 161 shows an illustration of a near-eye display module.
Figure 162 shows an illustration of the optics associated with this type of head-mounted display.
Figure 163 shows an illustration in which baffles have been added within the housing between the illuminating beam splitter and the lens.
Figure 164 shows an illustration of another embodiment of the invention in which a baffle has been added at the entrance surface of the lens.
Figure 165 shows an illustration of another embodiment of the invention in which a baffle has been added at the output of the lens.
Figure 166 shows an illustration of another embodiment of the invention in which baffles are attached to the housing between the lens and the imaging beam splitter.
Figure 167 shows an illustration of another embodiment of the invention in which an absorptive coating is applied to the sidewalls of the housing.
Figure 168 is an illustration of another source of stray light in a head-mounted display, where the stray light enters directly from the edge of the light source.
Figure 169 depicts stray light reflected from the edge of the lens or from any reflective surface of the housing.
Figure 170 shows an illustration of yet another embodiment of the invention in which baffles are provided adjacent to the light source.
Figure 171 depicts the use of a ridged absorptive coating, where a series of small ridges or steps act as a series of baffles to block or trim the marginal rays over the entire sidewall area of the housing.
Figure 172 illustrates another embodiment of a tape or sheet comprising a carrier and ridges that can be used to block reflected light.
Figure 173 depicts an exploded view of an embodiment of the glasses.
Figure 174 depicts the wiring layout and wire guides of the glasses.
Figure 175 depicts a magnified view of the wiring layout and wire guides of the glasses.
Figure 176A shows a cross-sectional view of the wiring layout and wire guides of the glasses.
Figure 176B shows a cross-sectional view of the wiring layout and wire guides of the glasses.
Figure 176C shows a full view of the wiring layout and wire guides of the glasses.
Figure 177 depicts a U-shaped attachment for securing the glasses.
Figure 178 depicts an embodiment of a cable tensioning system for securing the glasses to a user's head.
Figures 179A and 179B depict embodiments of a cable tensioning system in a curved configuration for securing the glasses to a user's head.
Figure 180 depicts an embodiment of a cable tensioning system for securing the glasses to a user's head.
Figure 181 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 182 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 183 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 184 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 185A depicts an embodiment of an optical train.
Figure 185B depicts sample ray traces of light through an embodiment of the optical train.
Figure 186 depicts an embodiment of an LCoS plus ASIC package.
Figure 187 is an illustration of a prior art frontlight using a single light source and a beam splitter block;
Figure 188 is an illustration of a prior art frontlight using a single light source and a reflective beam splitter layer;
Figure 189 is an illustration of a frontlight using a single light source, wherein a flat reflective beam splitter layer is placed at a reduced angle;
Figure 190 is an illustration of a frontlight using a single light source, wherein the reflective beam splitter layer is curved;
Figure 191 is an illustration of a frontlight using two light sources, wherein a folded reflective beam splitter film with flat surfaces is positioned in a transparent solid;
Figure 192 is an illustration of a frontlight using two light sources, wherein a folded, free-standing, unsupported reflective beam splitter film with flat surfaces is used;
Figure 193 is an illustration of a frontlight using two light sources, wherein a folded, free-standing, unsupported reflective beam splitter film with curved surfaces is used;
Figure 194 is an illustration of a frontlight using two light sources, wherein a folded reflective beam splitter film with curved surfaces is positioned in a transparent solid;
Figure 195 is an illustration of a frontlight using a single light source, the frontlight having an opposing mirror and quarter-wave film to recycle a portion of the polarized light, wherein a folded reflective beam splitter film with flat surfaces is positioned in a transparent solid;
Figure 196 is an illustration of a frontlight using a single light source, the frontlight having an opposing mirror and quarter-wave film to recycle a portion of the polarized light, wherein a folded, free-standing, unsupported reflective polarizing beam splitter film with flat surfaces is provided;
Figure 197 is an illustration of a frontlight using a single light source, the frontlight having an opposing mirror and quarter-wave film to recycle a portion of the polarized light, wherein a folded, free-standing, unsupported reflective polarizing beam splitter film with curved surfaces is provided;
Figure 198 is an illustration of a method of making the frontlight shown in Figure 197 but with the folded reflective beam splitter film with flat surfaces placed in a transparent solid, wherein top and bottom film holders are used to set and locate the reflective beam splitter film, and portions of the polarized light are recycled;
Figure 199 is an illustration of a frontlight made using the method shown in Figure 198, for use with two light sources and recycled portions of the polarized light;
Figure 200 is an illustration of a folded, free-standing, unsupported reflective beam splitter film supported at its edges in the first step of a method for casting a solid frontlight;
Figure 201 is an illustration of a hole for injecting transparent casting material while simultaneously venting gas in the method for casting a solid frontlight;
Figure 202 is an illustration of the casting showing the top of the cast solid frontlight;
Figure 203 is an illustration of flattening the top of the cast solid frontlight with a flat transparent sheet;
Figure 204 is a flowchart of a method for making a solid frontlight from components;
Figure 205 is a flowchart of a method for making a solid frontlight by casting; and
Figure 206 is a flowchart of a method for making a solid film holder using a multi-step molding process.
Figure 207 depicts an embodiment of a near-field communication watch.
Figure 208 depicts an embodiment of a near-field communication watch interfacing with an NFC-enabled point-of-sale device.
Figure 209 depicts an embodiment of a near-field communication watch docked with an NFC-enabled point-of-sale device and a user's smartphone.
Detailed Description
The present invention relates to eyepiece electro-optics. The eyepiece may include projection optics suitable for projecting an image onto a see-through or translucent lens, allowing the wearer of the eyepiece to view the surrounding environment as well as the displayed image. The projection optics, also referred to as a projector, may include an RGB LED module that uses field-sequential color. With field-sequential color, a single full-color image may be broken down into color fields based on the primary colors red, green, and blue, and imaged individually by an LCoS (liquid crystal on silicon) optical display 210. As each color field is imaged by the optical display 210, the corresponding LED color is turned on. When these color fields are displayed in rapid sequence, a full-color image may be seen. With field-sequential color illumination, the resulting projected image in the eyepiece can be adjusted for any chromatic aberrations, such as by shifting the red image relative to the blue and/or green images. The image may thereafter be reflected into a waveguide with a pair of freeform surfaces, where the image light undergoes total internal reflection (TIR) until it reaches the active viewing area of the lens, where the user sees the image. A processor, which may include a memory and an operating system, may control the LED light source and the optical display. The projector may also include or be optically coupled to a display coupling lens, a condenser lens, a polarizing beam splitter, and a field lens.
With reference to Figures 123A and 123B, a processor 12302 (e.g., a digital signal processor) may provide display-sequential frames 12324 for image display by a display component 12328 of the eyepiece 100 (e.g., an LCoS display component). In embodiments, the sequential frames 12324 may be generated with or without a display driver 12312 acting as an intermediary component between the processor 12302 and the display component 12328. For example, and with reference to Figure 123A, the processor 12302 may include a frame buffer 12304 and a display interface 12308 (e.g., a mobile industry processor interface (MIPI) with a display serial interface (DSI)). The display interface 12308 may provide per-pixel RGB data 12310 to the display driver 12312, the intermediary component between the processor 12302 and the display component 12328, where the display driver 12312 accepts the per-pixel RGB data 12310 and generates separate full-frame display data for red 12318, green 12320, and blue 12322, thereby providing display-sequential frames 12324 to the display component 12328. In addition, the display driver 12312 may provide timing signals to the display component 12328, such as to synchronize the transmission of the full frames 12318, 12320, 12322 as display-sequential frames 12324. In another example, and with reference to Figure 123B, the display interface 12330 may be configured to eliminate the display driver 12312 by providing full-frame display data for red 12334, green 12338, and blue 12340 directly to the display component 12328 as display-sequential frames 12324. In addition, timing signals 12332 may be provided to the display component directly from the display interface 12330. This configuration may provide significantly lower power consumption by removing the need for a display driver. Not only may this direct-to-panel arrangement remove the need for a driver, it may also simplify the overall logic of the configuration and remove the redundant memory otherwise required to generate the panel information, such as pixel information, from frame to frame.
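The driver's role described above, expanding a per-pixel RGB stream into timed per-color full frames, can be sketched in a few lines. This is a hedged illustration with assumed event names and data layout, not the patent's interface.

```python
# Sketch of the display-driver role (Figure 123A style): per-pixel RGB
# rows are expanded into three single-color full frames, each preceded
# by a timing marker so the display module can synchronize. In the
# driverless configuration (Figure 123B), the display interface would
# emit the same stream directly. All names here are assumptions.

def to_sequential_frames(rgb_rows):
    """Expand rows of (r, g, b) pixels into timed single-color full frames."""
    events = []
    for i, color in enumerate(("red", "green", "blue")):
        frame = [[px[i] for px in row] for row in rgb_rows]
        events.append(("sync", color))   # timing signal to the display module
        events.append(("frame", frame))  # full frame of one color plane
    return events
```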
With reference to Figure 186, in embodiments, to improve the yield of the LCoS-plus-ASIC package 18600, the ASIC may be mounted on a flexible printed circuit (FPC) 18604 with a stiffener on top. If the stiffener on top is as tall as the ASIC, it does not add thickness to the overall package. The FPC may be connected to a standard LCoS package (such as a glass-fiber-reinforced epoxy laminate (FR4) 18608) via a connector 18602, such as a zero insertion force (ZIF) connection or, for higher pin counts, a board-to-board connector. The ASIC, the stiffener, and the LCoS may be attached to the FPC with a pressure-sensitive adhesive.
With reference to Figure 1, an illustrative embodiment of an augmented reality eyepiece 100 may be described. It will be understood that embodiments of the eyepiece 100 may not include all of the elements depicted in Figure 1, while other embodiments may include additional or different elements. In embodiments, the optical elements may be embedded in the arm portions 122 of the frame 102 of the eyepiece. Images may be projected with a projector 108 onto at least one lens 104 mounted in an opening of the frame 102. One or more projectors 108, such as a pico projector, micro projector, femto projector, laser-based projector, or holographic projector, may be housed in an arm portion of the eyepiece frame 102. In embodiments, both lenses 104 are see-through or translucent, while in other embodiments only one lens 104 is translucent and the other is opaque or missing. In embodiments, more than one projector 108 may be included in the eyepiece 100.
In embodiments such as the one depicted in Figure 1, the eyepiece 100 may also include at least one articulating earbud 120, a radio transceiver 118, and a heat sink 114, the heat sink 114 absorbing heat from the LED light engine to keep it cool and allow it to operate at full brightness. Also present are one or more TI OMAP4 (Open Multimedia Applications Processor) processors 112 and a flex cable 110 with an RF antenna, all of which are described in further detail herein.
In an embodiment, and with reference to Figure 2, the projector 200 may be an RGB projector. The projector 200 may include a housing 202, a heat sink 204, and an RGB LED engine or module 206. The RGB LED engine 206 may include LEDs, dichroics, concentrators, and the like. A digital signal processor (DSP) (not shown) may convert the images or video stream into control signals, such as voltage drops/current modifications, pulse-width modulation (PWM) signals, and the like, to control the intensity, duration, and mixing of the LED light. For example, the DSP may control the duty cycle of each PWM signal to control the average current flowing through each LED generating multiple colors. A still-image coprocessor of the eyepiece may employ noise filtering, image/video stabilization, and face detection, and may perform image enhancement. An audio back-end processor of the eyepiece may employ buffering, SRC (sample-rate conversion), equalization, and the like.
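The PWM relationship described above can be illustrated with a small helper. This is a rough sketch with assumed names and values, not the patent's firmware: the average current through each LED scales linearly with its duty cycle, which is how the DSP mixes color intensities.

```python
# Sketch of the duty-cycle / average-current relationship the DSP uses
# to control LED intensity and color mixing. Names and values assumed.

def average_led_current(peak_current_ma, duty_cycle):
    """Average current (mA) through an LED driven by a PWM signal."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be in [0, 1]")
    return peak_current_ma * duty_cycle

def rgb_mix(peak_current_ma, duties):
    """Per-color average currents for a (red, green, blue) duty-cycle triple."""
    return {color: average_led_current(peak_current_ma, d)
            for color, d in zip(("red", "green", "blue"), duties)}
```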
The projector 200 may include an optical display 210, such as an LCoS display, and a number of components as shown. In embodiments, the projector 200 may be designed with a single-panel LCoS display 210; however, a three-panel display is also possible. In the single-panel embodiment, the display 210 is illuminated with red, blue, and green in sequence (field sequential color). In other embodiments, the projector 200 may make use of alternative optical display technologies, such as a backlit liquid crystal display (LCD), a front-lit LCD, a transflective LCD, an organic light-emitting diode (OLED) display, a field-emission display (FED), a ferroelectric LCoS (FLCoS), liquid-crystal-on-sapphire technology, a transparent liquid crystal micro-display, a quantum dot display, and the like.
In embodiments, the display may be a 3D display, an LCD, a thin-film-transistor (TFT) LCD, an LED display, an LCoS display, a ferroelectric liquid-crystal-on-silicon display, a CMOS display, an OLED, a QLED, an OLED array with CMOS-style sensor elements at the crossing points between OLED pixels, a transmissive LCoS display, a CRT display, a VGA display, an SXGA display, a QVGA display, a display with a video-based gaze tracker, a display with exit pupil expansion technology, an Asahi film display, a free-form optics display, an X-Y polynomial combiner display, a light-guide transfer display, an AMOLED display, and the like. In embodiments, the display may be a holographic display that allows the eyepiece to display images from an image source as holograms. In embodiments, the display may be a reflective liquid crystal micro-display. Such a display may include polarizing optics and may offer improved brightness compared with some OLED micro-displays. In embodiments, the display may be a free-form prism display. A free-form prism display may enable 3D stereoscopic imaging capability. In embodiments, the display may be similar or identical to those described by Canon and Olympus in United States Patents 6,384,983 and 6,181,475, respectively. In other embodiments, the display may include a video-based gaze tracker. In embodiments, a beam from an infrared light source may be split and expanded in an exit pupil expander (EPE) to produce collimated beams from the EPE toward the eyes. A miniature video camera may image the cornea, and the gaze direction of the eye may be computed by locating the pupil and the glints of the infrared beams. After user calibration, data from the gaze tracker may reflect the user's point of focus within the displayed image, which may be used as an input device. Such a device may be similar to those provided by the Nokia Research Center of Tampere, Finland. Additionally, in embodiments, the display may include an exit pupil expander, which enlarges the exit pupil and relays the image to a new location. As a result, only a thin transparent film may need to be placed in front of the user's eye, while the image source is placed elsewhere. In further embodiments, the display may be an off-axis optical display. In embodiments, such a display may not coincide with the mechanical center of the aperture. This avoids having the primary aperture obscured by secondary optics, instrument packages, and/or sensors, and allows instrument packages and/or sensors to be used at the focal point. For example, an active-matrix organic light-emitting diode (AMOLED) display may use a pixel design from Nouvoyance known as PenTile, which passes more light in several ways. First, the red, blue, and green sub-pixels are larger than the sub-pixels in a traditional display. Second, one sub-pixel in every group of four is clear. This means the backlight shines through more brightly while using less power. Fewer sub-pixels would ordinarily mean lower resolution, but a PenTile display tricks the eye into perceiving the same resolution while using about one-third fewer sub-pixels than an RGB-stripe panel with separate sub-pixels. PenTile displays also use image-processing algorithms to determine the brightness of a scene, automatically dimming the backlight for darker images.
To overcome the limitations of the prior art described previously, the invention provides an integral array of switchable mirrors in a waveguide, which may be used to provide a progressive scan of portions of the displayed image over the field of view. By rapidly switching the mirrors from reflective to transmissive in sequence, the image may be provided to the user without perceptible flicker. Because each switchable mirror is in the transmissive state more often than in the reflective state, the array of switchable mirrors appears transparent to the user while also presenting the displayed image to the user.
Waveguides and the delivery of light from an image source are known to those skilled in the art and will therefore not be discussed here. Exemplary discussions of waveguides and the transfer of light from an image source to a viewing area are provided in United States Patents 5,076,664 and 6,829,095. The invention includes methods and apparatus for redirecting image light in a waveguide to provide an image to a user, wherein the image light in the waveguide is provided solely from the image source.
Figure 125 shows a waveguide display device 12500 having an integral array of switchable mirrors 12508a-12508c that redirect light from an image source 12502, delivered through the waveguide 12510, to provide image light 12504 to a user. Three switchable mirrors 12508a-12508c are shown, but in the invention the array may include a different number of switchable mirrors. The switchable mirrors shown in Figure 125 are electrically switchable mirrors comprising liquid crystal switchable mirrors. A cover glass 12512 is provided to contain the liquid crystal material in the thin layers shown as the switchable mirrors 12508a-12508c. Figure 125 also shows power wires 12514 and 12518.
The waveguide 12510 and the integral array of switchable mirrors 12508a-12508c can be made of plastic or glass material, provided the material is suitably flat. Uniformity of thickness is not as important as it is in most liquid crystal devices, because the switchable mirrors have high reflectivity. The construction of switchable liquid crystal mirrors is described in United States Patent 6,999,649.
Figures 126 and 127 illustrate the sequential aspect of the invention: because only one of the switchable mirrors in the array is in the reflective state at any one time, the other switchable mirrors in the array are in the transmissive state. Figure 125 shows the first switchable mirror 12508a in the reflective state, so that light from the image source 12502 is redirected to become image light 12504 presenting a portion of the image to the user. The other switchable mirrors 12508b and 12508c are in the transmissive state. Figure 124 also shows the waveguide 12410.
In Figure 126, switchable mirrors 12508a and 12508c are in the transmissive state and switchable mirror 12508b is in the reflective state. This condition provides image light 12600 and the associated portion of the image to the user. Finally, in Figure 127, switchable mirrors 12508a and 12508b are in the transmissive state and switchable mirror 12508c is in the reflective state. This last condition provides image light 12700 and the associated portion of the image to the user. After this last condition, the sequence repeats as shown in Figure 124, followed by the condition shown in Figure 125 and then the condition shown in Figure 126, so as to provide a progressive scan of the image. This sequence is repeated continuously while the user views the displayed image. As a result, at any given time all of the light from the image source 12502 is redirected in sequence by a single switchable mirror. The image source can operate continuously while the switchable mirrors provide the progressive scan of image light 12504 over the field of view. If the image light is perceived as brighter, or as having a different color balance, for different switchable mirrors, the image source can be adjusted to compensate: the brightness or color balance of the image source can be modulated in synchronization with the switching sequence of the array of switchable mirrors. In another embodiment of the invention, the order in which the switchable mirrors are switched can be changed to provide an interlaced image to the user, such as a repeating order of 1, 3, 2, 4 for an array of four switchable mirrors.
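The switching schedules described above, one mirror reflective per step, the rest transmissive, and an optional interlaced order, can be sketched as follows. Names and conventions are assumptions for illustration only.

```python
# Sketch (assumed names) of the mirror switching schedules: at each step
# exactly one mirror in the array is reflective and all others are
# transmissive; reordering the schedule yields an interlaced scan.

def mirror_schedule(order, n_mirrors):
    """Yield one state dict (mirror index -> state) per step of the order."""
    for m in order:
        yield {i: ("reflective" if i == m else "transmissive")
               for i in range(1, n_mirrors + 1)}

progressive = [1, 2, 3]    # progressive scan for a three-mirror array
interlaced = [1, 3, 2, 4]  # interlaced scan for a four-mirror array
```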
Figure 128 shows another embodiment of the invention, in which an integral array of mechanically driven switchable mirrors is provided. In this case, the switchable mirrors in the waveguide display device 12800 comprise prisms 12804a-12804c that are alternately moved to provide either an air gap or optical contact at the respective surfaces 12810a-12810c. As shown in Figure 128, prism 12804a is moved down to provide an air gap, so that surface 12810a is a reflective surface operating by total internal reflection. At the same time, prisms 12804b and 12804c are forced up to provide optical contact at surfaces 12810b and 12810c, respectively, so that surfaces 12810b and 12810c are transmissive. This condition redirects light from the image source 12502 to become image light 12802 presenting a portion of the image to the user. In this embodiment, the switchable mirrors move from nearly 100% transmission with optical contact to nearly 100% reflection with total internal reflection. Figure 128 also shows power wires 12812, a base with a common ground connection 12814, and micro-actuators 12818a-c.
Figures 129 and 130 show the other conditions in the sequence for the mechanically driven switchable mirrors in the switchable mirror array. In Figure 129, prisms 12804a and 12804c are forced up to provide optical contact with surfaces 12810a and 12810c, respectively, thereby providing a transmissive state for the light from the image source 12502. At the same time, prism 12804b is moved down to create an air gap at surface 12810b, so that light from the image source 12502 is redirected to become image light 12900 presenting the associated portion of the image to the user. In the final step of the sequence, shown in Figure 130, prisms 12804a and 12804b are forced up to provide optical contact at surfaces 12810a and 12810b, respectively, so that light from the image source passes through to surface 12810c. Prism 12804c is moved down to provide an air gap at surface 12810c, so that surface 12810c becomes a reflective surface with total internal reflection, redirecting light from the image source 12502 to become image light 13000 and the associated portion of the image.
In the preceding discussion, the conditions for total internal reflection are based on the optical properties of the material of the waveguide 12808 and of air, as known to those skilled in the art. To obtain the 90-degree reflection shown in Figures 128-130, the refractive index of the waveguide 12808 must be greater than 1.42. To provide optical contact between the prisms 12804a-12804c and the surfaces 12810a-12810c, respectively, the surfaces of the prisms 12804a-12804c must match the surfaces 12810a-12810c to within 1.0 micron. Finally, for light from the image source 12502 to travel through the waveguide 12808 and the prisms 12804a-12804c without deflection at the interfaces, the refractive index of the prisms 12804a-12804c must match the refractive index of the waveguide 12808 to within approximately 0.1.
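The quoted index bound can be checked with a line of trigonometry (my derivation, not stated in the text): a 90-degree turn implies 45-degree incidence at the mirror surface, and total internal reflection at 45 degrees requires n ≥ 1/sin(45°) ≈ 1.414, i.e. roughly the 1.42 figure above.

```python
import math

# Minimum refractive index for TIR at the incidence angle implied by a
# given turn angle: the critical angle satisfies sin(theta_c) = 1/n, and
# TIR requires the incidence angle (half the turn) to reach theta_c.

def min_index_for_turn(turn_angle_deg=90.0):
    """Minimum refractive index for TIR producing the given turn angle."""
    incidence_deg = turn_angle_deg / 2.0  # a 90-degree turn means 45-degree incidence
    return 1.0 / math.sin(math.radians(incidence_deg))
```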
Figures 131a and 131b show illustrations of a waveguide assembly 13102 with a switchable mirror array as included in the invention. Figure 131a shows a side view of the waveguide assembly 13102 as worn on the head, in which the long axis of the switchable mirror array is oriented vertically so that the image light 13100 is directed into the user's eye. Figure 131b shows a top view of the waveguide assembly 13102 as worn on the head, in which the short axis of the switchable mirror array 13104 can be seen and the image light 13100 is provided to the user's eye 13110. In Figures 131a and 131b, the field of view provided in the image light 13100 can clearly be seen. In Figure 131b, the portions of the image as provided by the different switchable mirrors in the array can also be seen. Figure 131b also shows an embodiment of the waveguide assembly 13102 that includes an image source 13108, wherein the image source 13108 has an internal light source to provide light from a miniature display such as an LCoS display or an LCD display; the light is then transferred by the waveguide to the switchable mirrors, where it is redirected by the switchable mirrors to become the image light 13100 presented to the user's eye 13110.
To reduce the image flicker perceived by the user as the switchable mirrors are used in sequence to provide portions of the image, the switchable mirror sequence preferably operates at a frequency of 60 Hz or faster. In this case, each of the n switchable mirrors in the array is, during each cycle of the sequence, in the reflective state for (1/60) x 1/n second and then in the transmissive state for (1/60) x (n-1)/n second. As a result, each switchable mirror in the sequence is in the transmissive state for the greater part of each cycle compared with the reflective state, so the array of switchable mirrors is perceived by the user as relatively transparent.
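The per-cycle timing above reduces to a small helper (names assumed): with n mirrors at a 60 Hz sequence rate, each mirror is reflective for (1/60) x 1/n second and transmissive for the remainder of the cycle.

```python
# Per-mirror timing in one cycle of the switching sequence: the cycle is
# divided evenly among the n mirrors' reflective intervals, and each
# mirror is transmissive for the rest of the cycle.

def mirror_timing(n_mirrors, sequence_rate_hz=60.0):
    """Return (reflective_s, transmissive_s) per mirror per cycle."""
    cycle = 1.0 / sequence_rate_hz
    reflective = cycle / n_mirrors
    return reflective, cycle - reflective
```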
In another embodiment of the invention, the integral array of switchable mirrors has more switchable mirrors than are needed to cover the display area. The extra switchable mirrors are used to provide adjustment for different users with different eye spacings (also known as interpupillary distances). In this case, the switchable mirrors used to present the image to the user are adjacent to one another so that they present a continuous image area. Depending on the user's eye spacing, the switchable mirrors at the edges of the array may or may not be used. In the example shown in Figures 132A-132C, an array 13200 with seven switchable mirrors is provided, each mirror 3 mm wide. In use, five adjacent switchable mirrors provide a 15 mm wide display area (13202a-13202c), with ±3 mm of adjustment for eye spacing. In the narrow eye-spacing condition shown in Figure 132A, the five switchable mirrors toward the inner edge are used for display and the two outer switchable mirrors are not used. In the wide eye-spacing condition shown in Figure 132C, the five switchable mirrors toward the outer edge are used for display and the two inner switchable mirrors are not used. An intermediate condition is shown in Figure 132B, in which the five middle switchable mirrors are used and the outermost and innermost switchable mirrors are not used. In the invention, the term "not used" means that the switchable mirror remains in the transmissive state while the other switchable mirrors are used in the repeated sequence between transmissive and reflective states.
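The selection of the active block of mirrors can be sketched as below. The indexing and sign conventions (indices 0-6, negative offsets toward the nose) are assumptions of mine for illustration, not from the text.

```python
# Sketch of the interpupillary-distance adjustment: seven 3 mm mirrors
# (indices 0-6), five of which form the 15 mm display area; the offset
# shifts the active block in whole-mirror steps within +/- 3 mm. Mirrors
# outside the returned block remain transmissive ("not used").

def active_mirrors(ipd_offset_mm, total=7, used=5, mirror_width_mm=3.0):
    """Return the indices of the contiguous mirrors used for display."""
    shift = round(ipd_offset_mm / mirror_width_mm)  # whole mirrors to shift
    start = (total - used) // 2 + shift             # centered when offset is 0
    if not 0 <= start <= total - used:
        raise ValueError("offset outside the adjustment range")
    return list(range(start, start + used))
```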
Examples
In a first example, liquid crystal switchable mirrors with fast response, available from Kent Optronics of Hopewell Junction, New York, USA (http://www.kentoptronics.com/), are used. The waveguide is made of glass or plastic, with spaces between the layers in which the liquid crystal is contained, so that the liquid crystal layers are 5 microns thick. A cover glass contains the liquid crystal at the outer surface. The response time is 10 milliseconds, the reflectivity in the reflective state is 87%, and the transmission in the transmissive state is 87%. Three switchable mirrors can be driven in a sequence operating at 30 Hz. If the switchable mirrors are 5 mm wide, a 15 mm wide display area is provided, which equates to a 38-degree field of view as viewed with an 8 mm wide eyebox from an eye positioned 10 mm from the waveguide.
In a second example, a mechanically driven array of prisms made of glass or plastic with a refractive index of 1.53 is provided, and the waveguide is made of the same material with a refractive index of 1.53. The surfaces of the prisms are polished to provide a flatness of better than 1 micron, and piezoelectric micro-actuators are used to move the prisms approximately 10 microns to switch from the transmissive state to the reflective state. The waveguide is molded to provide a flatness of better than 1 micron on the surfaces that mate with the prisms. Five switchable mirrors can be driven by the piezoelectric actuators to operate in a sequence running at 100 Hz. Piezoelectric micro-actuators are available from Steiner & Martins Inc. of Miami, Florida (http://www.steminc.com/piezo/pZ_STAKPNViewPN.asp?PZ_SM_MODEL=SMPAK155510D10); these micro-actuators provide 10 microns of movement with over 200 pounds of force in a 5x5x10 mm package driven at 150 V. An array of five prisms, each 5 mm wide, is used to provide a 25 mm wide display area, which equates to a 72-degree field of view as viewed with an 8 mm wide eyebox from an eye positioned 10 mm from the waveguide. Alternatively, a 15 mm wide display area (38-degree field of view) can be provided with only three prisms at a time, with the ability to move the display area laterally ±5 mm to adjust for the different eye spacings of different users.
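One geometry consistent with the figures quoted in the first example (an assumption on my part, not stated in the text) is that the field of view is limited by the display-area width, the eyebox width, and the eye relief, giving FOV = 2·atan(((W − eyebox)/2)/relief), which reproduces the approximately 38-degree figure for the 15 mm display area.

```python
import math

# Assumed eyebox-limited field-of-view geometry: the full field is the
# angle subtended, from the eye-relief distance, by the display area
# minus the eyebox width. This reproduces ~38 degrees for the 15 mm
# display area (8 mm eyebox, 10 mm eye relief) of the first example.

def field_of_view_deg(display_width_mm, eyebox_mm, eye_relief_mm):
    """Full field of view, in degrees, visible across the whole eyebox."""
    half_extent = (display_width_mm - eyebox_mm) / 2.0
    return math.degrees(2.0 * math.atan(half_extent / eye_relief_mm))
```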
In embodiments, a waveguide display system may comprise an image source that provides image light from a displayed image, a waveguide that transfers the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to the display area, where the displayed image can be viewed by a user. In embodiments, the switchable mirrors may be electrically driven. In embodiments, the switchable mirrors may be mechanically driven. In other embodiments, micro-actuators may be used to mechanically drive the switchable mirrors. Further, the micro-actuators may be piezoelectric. The switchable mirrors may be switched between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In embodiments, a method of providing a displayed image from a waveguide may comprise providing image light from an image source to a waveguide, providing an integral array of switchable mirrors over a display area of the waveguide, and operating the switchable mirrors in a sequence between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In further embodiments, a waveguide display system with interpupillary distance adjustment may comprise an image source that provides image light from a displayed image, a waveguide that transfers the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to the display. Further, the array of switchable mirrors may have more mirrors than are needed to cover the display area, and the switchable mirrors at the edges of the array may be used to provide a display area that matches the user's eye spacing.
The eyepiece may be powered by any power source, such as battery power, solar power, line power, and the like. The power source may be integrated into the frame 102, or disposed external to the eyepiece 100 and in electrical communication with the powered elements of the eyepiece 100. For example, a solar collector may be placed on the frame 102, on a belt clip, and so forth. Battery charging may occur using a wall charger or a car charger, on a belt clip, in an eyepiece case, and the like.
The projector 200 may include an LED light engine 206, which may be mounted on the heat sink 204 and a holder 208 that secures the LED light engine with a friction fit, a hollow tapered light tunnel 220, a diffuser 212, and a condenser lens 214. The hollow tunnel 220 helps to homogenize the rapidly varying light from the RGB LED light engine. In one embodiment, the hollow light tunnel 220 includes a silver coating. The diffuser lens 212 further homogenizes and mixes the light before it is directed to the condenser lens 214. The light leaves the condenser lens 214 and then enters the polarizing beam splitter (PBS) 218. In the PBS, the LED light is propagated and split into polarization components before being refracted to the field lens 216 and the LCoS display 210. The LCoS display provides the image for the micro-projector. The image is then reflected from the LCoS display back through the polarizing beam splitter and then reflected through 90 degrees. Thus, the image leaves the micro-projector 200 at approximately the middle of the micro-projector, and the light is then led to the coupling lens 504, described below.
Figure 2 depicts an embodiment of the projector assembly and the other supporting elements described herein, but those skilled in the art will appreciate that other configurations and optical technologies may be employed. For example, instead of using reflective optics, a transmissive configuration, such as one with a sapphire substrate, may be used to realize the light path of the projector system, potentially changing and/or eliminating optical components such as the beam splitter, the redirecting mirror, and the like. The system may have a backlight arrangement, in which RGB LED triads may serve as the light source, directed so that the light passes through the display. As a result, the backlight and the display may either be mounted adjacent to the waveguide, or collimating/directing optics may be placed after the display so that the light enters the optics correctly. If there are no directing optics, the display may be mounted at the top, the side, or the like, of the waveguide. In one example, a small transparent display may be implemented with a silicon active backplane on a transparent substrate (e.g., sapphire), with transparent electrodes controlled by the silicon active backplane, liquid crystal material, polarizers, and the like. The function of the polarizers may be to correct the polarization of the light passing through the system to improve the contrast of the display. In another example, the system may utilize a spatial light modulator that applies some form of spatially varying modulation to the light path, such as a microchannel spatial light modulator in which diaphragm-mirror optical shutters are based on micro-electro-mechanical systems (MEMS). The system may also utilize other optical components, such as tunable filters (e.g., with deformable membrane actuators), high-angular-deflection micro-mirror systems, discrete phase optical elements, and the like.
In other embodiments, the eyepiece may utilize OLED displays, quantum dot displays, and the like, which may provide greater power efficiency, brighter displays, lower-cost components, and the like. In addition, display technologies such as OLED and quantum dot displays may provide flexible displays, thereby allowing greater packaging efficiency that may reduce the overall size of the eyepiece. For example, OLED and quantum dot display materials may be printed onto plastic substrates by stamping techniques, thereby producing flexible display components. An OLED (organic LED) display, for example, may be a flexible, low-power display that does not require a backlight. It can be curved, like a standard eyeglass lens. In one embodiment, the OLED display may be, or may provide, a transparent display. In embodiments, a high modulation transfer function may permit combinations of resolution levels and device sizes (e.g., frame thickness) that were previously unachievable.
With reference to Figure 82, the eyepiece may utilize a planar illumination facility 8208 associated with a reflective display 8210, wherein a light source 8202 is coupled 8204 to an edge of the planar illumination facility 8208, the planar side of the planar illumination facility 8208 illuminates the reflective display 8210, and the display 8210 provides imaging of the content to be presented to the wearer's eye 8222 through delivery optics 8212. In embodiments, the reflective display 8210 may be an LCD, an LCoS (liquid crystal on silicon) display, cholesteric liquid crystal, guest-host liquid crystal, polymer-dispersed liquid crystal, phase-retardation liquid crystal, or the like, or other liquid crystal technology known in the art. In other embodiments, the reflective display 8210 may be a bistable display, such as electrophoretic, electrofluidic, electrowetting, electrokinetic, cholesteric liquid crystal, and the like, or any other bistable display known in the art. The reflective display 8210 may also be a combination of LCD technologies and bistable display technologies. In embodiments, the coupling 8204 between the light source 8202 and the 'edge' of the planar illumination facility 8208 may be made through another surface of the planar illumination facility 8208 and then directed into the plane of the planar illumination facility 8208, such as initially through the top surface, the bottom surface, an angled surface, and the like. For example, light may enter the planar illumination facility from the top surface but encounter a 45-degree facet so that the light is bent into the direction of the plane. In alternative embodiments, this bending of the light's direction may be implemented with an optical coating.
In an example, the light source 8202 may be an RGB LED source (e.g., an LED array) directly coupled 8204 to the edge of the planar illumination facility. The light entering the edge of the planar illumination facility is then directed to the reflective display for imaging, as described herein. The light may enter the reflective display to be imaged and then be redirected through the planar illumination facility (such as by employing a reflective surface at the back side of the reflective display). The light may then enter the delivery optics 8212, which direct the image to the wearer's eye 8222, such as through a lens 8214, reflecting from a beam splitter 8218 to a mirrored surface 8220, and back through the beam splitter 8218 to the eye 8222. Although the delivery optics 8212 have been described in terms of elements 8214, 8218, and 8220, those skilled in the art will appreciate that the delivery optics 8212 may comprise any known delivery optics configuration, including configurations more or less complex than those described herein. For example, with a different focal length in the field lens 8214, the beam splitter 8218 could direct the image directly toward the eye as a curved image, thereby eliminating the curved mirror 8220 and enabling a simpler design and implementation. In embodiments, the light source 8202 may be an LED light source, a laser light source, a white light source, and the like, or any other light source known in the art. The optical coupling 8204 may be a direct coupling between the light source 8202 and the planar illumination facility 8208, or may be made through a coupling medium or mechanism, such as a waveguide, optical fiber, light pipe, lens, and the like. The planar illumination facility 8208 may receive the light and redirect it out of the planar side of its structure by means of interference gratings, optical imperfections, scattering features, reflective surfaces, refractive elements, and the like. The planar illumination facility 8208 may be the cover glass of the reflective display 8210, such as to reduce the combined thickness of the reflective display 8210 and the planar illumination facility 8208. The planar illumination facility 8208 may also include a diffuser located on the side nearest the delivery optics 8212, to expand the cone angle of the image light as it passes out of the planar illumination facility 8208 toward the delivery optics 8212. The delivery optics 8212 may comprise multiple optical elements, such as lenses, mirrors, beam splitters, and the like, or any other optical delivery element known in the art.
Figure 83 presents an embodiment of an optical system 8302 for the eyepiece 8300, in which a planar illumination facility 8310 and a reflective display 8308 mounted on a substrate 8304 are shown interfacing with delivery optics 8212 comprising an initial diverging lens 8312, a beam splitter 8314, and a spherical mirror 8318; the delivery optics 8212 present the image to the eyebox 8320, where the wearer's eye receives the image. In an example, the flat beam splitter 8314 may be a wire-grid polarizer, a metallic partially reflecting mirror coating, or the like, and the spherical mirror 8318 may carry a series of dielectric coatings to provide a partially reflecting mirror on the surface. In another embodiment, the coating on the spherical mirror 8318 may be a thin metallic coating providing a partial mirror.
In an embodiment of the optical system, Figure 84 shows the planar illuminator 8408 as part of a ferroelectric lightwave circuit (FLC) 8404. The FLC 8404 comprises a configuration in which a laser light source 8402 is coupled to the planar illuminator 8408 through waveguide wavelength converters 8420, 8422, where the planar illuminator 8408 uses grating technology to direct light entering at its edge out of its planar surface toward the reflective display 8410. Image light from the reflective display 8410 then passes back through the planar illuminator 8408 and is redirected to the delivery optics via the aperture 8412 in the supporting structure 8414. Because this embodiment utilizes a laser, the FLC also uses optical feedback to broaden the laser spectrum, as described in United States Patent 7,265,896, thereby reducing speckle from the laser. In this embodiment, the laser light source 8402 is an IR laser source whose beams are combined into RGB by the FLC, with back-reflection that causes the laser to mode-hop and broaden in bandwidth to provide speckle suppression. In this embodiment, speckle suppression is carried out in the waveguide 8420. Laser light from the laser light source 8402 is coupled to the planar illuminator 8408 through a multimode interference (MMI) combiner 8422. Each laser source port is positioned such that the light passing through the MMI combiner is superimposed onto a single output port of the planar illuminator 8408. The gratings of the planar illuminator 8408 redirect the light to produce uniform illumination of the reflective display. In embodiments, the grating elements may use an extremely fine pitch (e.g., produced interferometrically) to illuminate the reflective display, so that when image light passes back through the planar illuminator toward the delivery optics, very little of it is scattered by the gratings; that is, the light passes out in alignment, such that the gratings are nearly completely transparent to it. Note that the optical feedback used in this embodiment is due to the use of a laser light source; when LEDs are used, speckle suppression may not be needed, because LEDs already have sufficient bandwidth.
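For context on the speckle suppression discussed above, a standard rule of thumb (not taken from this document) is that averaging N independent speckle patterns, e.g., from N mutually incoherent spectral modes of a spectrally broadened laser, reduces speckle contrast by a factor of sqrt(N):

```python
import math

def speckle_contrast(n_independent_patterns: int) -> float:
    """Rule-of-thumb speckle contrast after averaging N independent
    speckle patterns (e.g., from N uncorrelated spectral modes)."""
    return 1.0 / math.sqrt(n_independent_patterns)

# A single-mode laser has full speckle contrast (1.0); broadening the
# spectrum to span, say, 16 independent modes cuts contrast to 0.25.
print(speckle_contrast(1))   # 1.0
print(speckle_contrast(16))  # 0.25
```

This is also consistent with the remark that LEDs, whose spectra are already broad, may not need additional speckle suppression.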
Figure 85 illustrates an embodiment of the optical system utilizing a planar illuminator 8502 that includes a configuration with optical imperfections, in this case a 'grooved' configuration. In this embodiment, the light source 8202 is directly coupled 8204 to the edge of the planar illuminator 8502. Light then passes through the planar illuminator 8502 and encounters small grooves 8504A-D in the illuminator material, such as grooves in a sheet of polymethyl methacrylate (PMMA). In embodiments, the grooves 8504A-D may vary in spacing as they progress away from the input port (e.g., becoming less and less 'aggressive' as they advance from 8504A to 8504D), may vary in height, may vary in pitch, and so on. The grooves 8504A-D then redirect the light toward the reflective display 8210 as an array of incoherent light sources, producing fans of light that propagate toward the reflective display 8210, where the reflective display 8210 is far enough from the grooves 8504A-D that the illumination patterns from the individual grooves overlap to provide uniform irradiation across the area of the reflective display 8210. In other embodiments, there may be an optimal groove spacing, where the number of grooves per pixel of the reflective display 8210 can be increased to make the light more incoherent (and the illumination fuller), but more grooves in turn produce interference within the image provided to the wearer, and thus lower contrast. Although this embodiment has been described with reference to grooves, other optical imperfections (such as dots) are also possible.
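A toy model (assumed Gaussian fans and arbitrary units, purely illustrative, not the patent's design data) of why placing the reflective display farther from the grooves makes the overlapping fans of light more uniform:

```python
import math

def irradiance(x, grooves, dist, half_angle_deg=20.0):
    """Irradiance at position x on the display plane, modeling each
    groove as an incoherent Gaussian fan whose width grows with the
    groove-to-display distance (all units arbitrary)."""
    w = dist * math.tan(math.radians(half_angle_deg))
    return sum(math.exp(-((x - g) / w) ** 2) for g in grooves)

def uniformity(grooves, dist, span=8.0, samples=81):
    """Min/max irradiance ratio over the central region of the display."""
    vals = [irradiance(-span / 2 + span * i / (samples - 1), grooves, dist)
            for i in range(samples)]
    return min(vals) / max(vals)

grooves = [-6, -3, 0, 3, 6]          # groove positions along the light guide
print(uniformity(grooves, dist=2))    # display too close: strong ripple
print(uniformity(grooves, dist=10))   # display far enough: fans overlap
```

Running this shows the min/max ratio climbing toward 1 as the distance grows, matching the text's requirement that the display be "far enough" from the grooves for the fans to overlap.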
In embodiments, and with reference to Figure 86, opposing protrusions 8604 (i.e., 'opposing grooves') may be applied into the grooves of the planar illuminator, such as in the 'snapped-together' protrusion assembly 8602, where the opposing protrusions 8604 are placed into the grooves 8504A-D such that an air gap remains between the sidewalls of each groove and the sidewalls of the opposing protrusion. The air gap provides an abrupt change in refractive index as perceived by light passing through the planar illuminator, which promotes reflection of the light at the groove sidewalls. Applying the opposing protrusions 8604 reduces the aberration and deflection of the image light caused by the grooves. That is, the image light reflected from the reflective display 8210 is refracted at the groove sidewalls and thereby changes direction in accordance with Snell's law. By providing opposing protrusions in the grooves, where the sidewall angle of each groove matches the sidewall angle of the opposing protrusion, the refraction of the image light is compensated and the image light is redirected toward the delivery optics 8214.
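A brief numeric sketch of the Snell's-law compensation described above (the PMMA index of 1.49 is an assumed typical value; function names are illustrative): crossing a PMMA/air gap and re-entering PMMA through an angle-matched sidewall leaves the ray direction unchanged, while steeper illumination rays are still reflected at the gap.

```python
import math

def refract(theta_deg, n1, n2):
    """Snell's law: refraction angle in degrees, or None for total
    internal reflection (n1 sin(theta1) = n2 sin(theta2))."""
    s = n1 / n2 * math.sin(math.radians(theta_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection at the air gap
    return math.degrees(math.asin(s))

N_PMMA = 1.49  # assumed typical refractive index of PMMA

# Image light: PMMA -> air -> PMMA through parallel, angle-matched
# sidewalls; the two refractions cancel, so the ray exits undeviated.
theta_air = refract(20.0, N_PMMA, 1.0)
theta_out = refract(theta_air, 1.0, N_PMMA)
print(round(theta_out, 6))  # 20.0

# Illumination light hitting a sidewall beyond the critical angle
# (~42.2 degrees for this index) is reflected toward the display.
print(refract(50.0, N_PMMA, 1.0))  # None -> reflected
```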
In embodiments, and with reference to Figure 87, the planar illuminator 8702 may be a laminate structure made up of multiple laminate layers 8704, where the laminate layers 8704 alternate between different refractive indices. For example, the planar illuminator 8702 may be cut from the laminated film along two diagonal planes 8708. In this way, the laminate structure 8702 replaces the grooved structures shown in Figures 85 and 86. For example, the laminated film may be made of similar materials (PMMA1 versus PMMA2, where the difference is the molecular weight of the PMMA). As long as each layer is reasonably thick, no interference effects arise, and the laminate acts like a transparent plastic sheet. In the configuration shown, a small percentage of the light from light source 8202 is redirected toward the reflective display by the diagonal laminations, where the pitch of the laminations is selected to avoid chromatic aberration.
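A small numeric check (index values are assumed for illustration) of why a stack of slightly mismatched PMMA grades redirects only "a small percentage" of the light per interface while remaining visually transparent:

```python
def fresnel_reflectance_normal(n1: float, n2: float) -> float:
    """Fresnel power reflectance at normal incidence between two media."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Two PMMA grades differing slightly in refractive index (assumed values)
# reflect only about 0.001% of the light per interface, so many diagonal
# laminations are needed to tap off useful illumination, and the stack
# still looks like a clear plastic sheet in see-through.
r = fresnel_reflectance_normal(1.49, 1.50)
print(f"{r:.2e}")  # 1.12e-05

# For comparison, a plastic/air interface reflects about 4%:
print(fresnel_reflectance_normal(1.0, 1.5))  # 0.04
```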
In an embodiment of the optical system, Figure 88 shows a planar illuminator 8802 utilizing a 'wedge' configuration. In this embodiment, the light source is directly coupled 8204 to the edge of the planar illuminator 8802. Light then passes through the planar illuminator 8802 and encounters the inclined surface of a first wedge 8804, where the light is redirected to the reflective display 8210 and then passes back through the first wedge 8804 and a second wedge 8812 of the illuminator 8802 and on to the delivery optics. In addition, multilayer coatings 8808, 8810 may be applied to the wedges to improve their light-delivery properties. In one example, the wedges may be made of PMMA, be 1/2 mm high by 10 mm wide, span the entire reflective display, have an angle of 1 to 1.5 degrees, and so on. In embodiments, the light may undergo multiple reflections within the wedge 8804 before passing out of the wedge 8804 to illuminate the reflective display 8210. If a highly reflective coating were applied to the wedge 8804 by coatings 8808 and 8810, the light could undergo multiple reflections within the wedge 8804 until it turned around and transferred back toward the light source 8202. However, by employing multilayer coatings 8808 and 8810 on the wedge 8804, such as coatings using SiO2, niobium pentoxide, or the like, the light can instead be directed out to illuminate the reflective display 8210. The coatings 8808 and 8810 can be designed to reflect light of a given wavelength over a broad range of angles while transmitting light within a certain angular range (e.g., angles beyond θ). In embodiments, this design allows light to reflect within the wedge until it reaches the transmission window for presentation to the reflective display 8210, where the coatings are configured to allow transmission. The angle of the wedge directs the light from the LED illumination system to illuminate the reflective image display uniformly, producing an image that is reflected back through the illumination system. By providing light from the light source 8202 that enters the wedge 8804 with a wide cone angle, different rays reach the transmission window at different positions along the length of the wedge 8804, thereby providing uniform irradiation across the surface of the reflective display 8210, so that the image provided to the wearer's eye has uniform luminance as determined by the image content.
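A simplified geometric sketch (an idealization with assumed angles, not the patent's design data): in a wedge light guide, each internal reflection off the inclined face steepens the ray's angle of incidence by twice the wedge angle, so rays launched at different angles within the input cone cross the transmission threshold at different points along the wedge, spreading the illumination.

```python
def bounces_until_escape(theta_deg, wedge_deg, escape_deg):
    """Count internal reflections before a guided ray's angle of
    incidence (from the surface normal) drops below the escape
    threshold. Each round trip in a wedge of angle `wedge_deg`
    reduces the incidence angle by 2 * wedge_deg."""
    bounces = 0
    while theta_deg > escape_deg:
        theta_deg -= 2.0 * wedge_deg
        bounces += 1
    return bounces

# Rays entering with a wide cone of angles escape after different
# numbers of bounces, i.e., at different positions along the wedge.
for theta0 in (80.0, 75.0, 70.0):
    print(theta0, bounces_until_escape(theta0, wedge_deg=1.5, escape_deg=65.0))
```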
In embodiments, a see-through optical system comprising a planar illuminator 8208 and a reflective display 8210 as described herein may be applied to any head-worn device known in the art, including the eyepiece described herein, but also helmets (e.g., military helmets, pilot helmets, bicycle helmets, motorcycle helmets, deep-sea helmets, space helmets, etc.), ski goggles, eyeglasses, diving masks, night-vision masks, gas masks, Hazmat helmets, virtual-reality helmets, simulators, and the like. In addition, the optical system may be incorporated with the optics and covers associated with the head-worn device in a variety of ways, including in addition to the optics and covers traditionally associated with the head-worn device, or by inserting the optical system into the head-worn device. For example, the optical system may be included in ski goggles as a separate unit, providing projected content to the user, where the optical system does not replace any component of the ski goggles, such as the goggles' see-through cover (e.g., the clear or tinted plastic cover that is exposed to the external environment and protects the user's eyes from wind and snow). Alternatively, the optical system may at least partially replace some of the optics traditionally associated with the head-worn device. For example, certain optical elements of the delivery optics 8212 may replace the outer lens of an eyeglasses application. In one example, the beam splitter, lens, or mirror of the delivery optics 8212 may replace the front lens of an eyeglasses application (e.g., sunglasses), thus eliminating the need for the front lens of the glasses, such as when the curved mirror 8220 is extended to cover the glasses, eliminating the need for a lens cover. In embodiments, the see-through optical system comprising the planar illuminator 8208 and the reflective display 8210 may be positioned in the head-worn device so as not to intrude on the function and aesthetics of the head-worn device. For example, in the case of eyeglasses, or more specifically in the case of the eyepiece, the optical system may be positioned near the top of the lens, such as in the top of the frame.
In embodiments, the optics assembly may be used in configurations such as head-mounted or helmet-mounted displays, and/or may also include monoculars, binoculars, holographic binoculars, helmet viewing windows, head-mounted displays with Mangin mirrors, integrated helmet display sighting systems, linked advanced head-mounted displays (AHMD), and multiple micro-display optics. In embodiments, the optics assembly may include a telephoto lens. Such a lens may be spectacle-mounted or otherwise mounted. Such an embodiment is useful for people with visual impairments. In embodiments, a wide-angle Keplerian telescope of the type developed by Eli Peli may be built into the lens. Such a design may use a folded optical path and mirror elements embedded in the lens to achieve higher magnification. This can allow the wearer to view magnified and unmagnified fields of view simultaneously in an eyeglasses form factor. In embodiments, the optics assembly may be used in a configuration with a helmet-mounted display such as the Q-Sight developed by BAE Systems of London. Such a configuration can provide a heads-up, eyes-out capability that delivers situational awareness. Further, embodiments may use any of the optics assemblies in the configurations described above.
The planar illuminator, also referred to as an illumination module, can provide light of multiple colors, including red-green-blue (RGB) light and/or white light. Light from the illumination module can be directed to a 3LCD system, a digital light processing (DLP) system, a liquid crystal on silicon (LCoS) system, or another micro-display or micro-projection system. The illumination module can provide a high-brightness, long-lifetime source of reduced-speckle or speckle-free light by using wavelength combination and nonlinear frequency conversion with nonlinear feedback to the source. The illumination modules described herein may be used in the optics assembly of the eyepiece 100.
One embodiment of the invention comprises a system that includes: a laser, LED, or other light source configured to produce a beam of a first wavelength; a planar lightwave circuit coupled to the laser and configured to guide the beam; and a waveguide optical frequency converter coupled to the planar lightwave circuit and configured to receive the beam of the first wavelength and convert it into an output beam of a second wavelength. The system can provide optically coupled feedback to the laser that depends nonlinearly on the power of the beam of the first wavelength.
Another embodiment of the invention comprises a system that includes: a substrate; a light source (such as a diode laser array or one or more LEDs) disposed on the substrate and configured to emit multiple beams of a first wavelength; a planar lightwave circuit disposed on the substrate, coupled to the light source, and configured to combine the multiple beams and produce a combined beam of the first wavelength; and a nonlinear optical element disposed on the substrate, coupled to the planar lightwave circuit, and configured to convert the combined beam of the first wavelength into a beam of a second wavelength using nonlinear frequency conversion. The system can provide optically coupled feedback to the diode laser array that depends nonlinearly on the power of the combined beam of the first wavelength.
Another embodiment of the invention comprises a system that includes: a light source (such as a semiconductor laser array or one or more LEDs) configured to produce multiple beams of a first wavelength; an arrayed waveguide grating coupled to the light source and configured to combine the multiple beams and output a combined beam of the first wavelength; and a quasi-phase-matched wavelength conversion waveguide coupled to the arrayed waveguide grating and configured to produce an output beam of a second wavelength from the combined beam of the first wavelength by second harmonic generation.
Power can be extracted in the wavelength conversion device and fed back to the source. The fed-back power has a nonlinear dependence on the input power supplied by the source to the wavelength conversion device. Nonlinear feedback can reduce the sensitivity of the output power of the wavelength conversion device to variations in the device's nonlinear coefficient, because the feedback power increases if the nonlinear coefficient decreases. Increased feedback tends to increase the power supplied to the wavelength conversion device, thereby mitigating the effect of the reduced nonlinear coefficient.
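As an illustrative model (an assumption for exposition, not stated in this document): if second harmonic generation converts pump power quadratically, a fractional change in the nonlinear coefficient can be offset by a half-as-large opposite fractional change in pump power, which is the kind of compensation the nonlinear feedback provides.

```latex
P_{2\omega} = \eta\, P_{\omega}^{2}
\qquad\Longrightarrow\qquad
\frac{dP_{2\omega}}{P_{2\omega}} \;=\; \frac{d\eta}{\eta} \;+\; 2\,\frac{dP_{\omega}}{P_{\omega}}
```

If a drop in the nonlinear coefficient \( \eta \) raises the feedback, and hence the pump power, such that \( dP_{\omega}/P_{\omega} \approx -\tfrac{1}{2}\, d\eta/\eta \), then \( dP_{2\omega} \approx 0 \) and the output is held approximately constant.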
With reference to Figures 109A and 109B, a processor 10902 (e.g., a digital signal processor) can provide display-sequential frames 10924 for image display by a display component 10928 (e.g., an LCoS display component) of the eyepiece 100. In embodiments, the sequential frames 10924 can be produced with or without a display driver 10912 acting as an intermediary component between the processor 10902 and the display component 10928. For example, with reference to Figure 109A, the processor 10902 may include a frame buffer 10904 and a display interface 10908 (e.g., a mobile industry processor interface (MIPI) display serial interface (DSI)). The display interface 10908 can provide per-pixel RGB data 10910 to the display driver 10912 serving as the intermediary component between the processor 10902 and the display component 10928, where the display driver 10912 accepts the per-pixel RGB data 10910 and generates separate full-frame display data for red 10918, separate full-frame display data for green 10920, and separate full-frame display data for blue 10922, thereby providing the display-sequential frames 10924 to the display component 10928. In addition, the display driver 10912 can provide timing signals to the display component 10928, such as for synchronizing the transmission of the full frames 10918, 10920, 10922 as the display-sequential frames 10924. In another example, with reference to Figure 109B, a display interface 10930 can be configured to provide full-frame red display data 10934, full-frame green display data 10938, and full-frame blue display data 10940 directly to the display component 10928 as the display-sequential frames 10924, eliminating the display driver 10912. In addition, timing signals 10932 can be provided directly from the display interface 10930 to the display component. This configuration can offer significantly lower power consumption by removing the need for a display driver. This direct panel interface not only removes the need for the driver, but can also simplify the overall logic of the configuration and eliminate the redundant memory otherwise required to generate panel information, such as pixel information and frame-to-frame changes.
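As a hedged sketch (not the patent's actual driver logic; names are illustrative), the display driver's job of turning per-pixel RGB data into the three sequential single-color full frames a field-sequential panel consumes can be illustrated as:

```python
def to_sequential_frames(rgb_frame):
    """Split one per-pixel RGB frame (rows of (r, g, b) tuples) into
    three full single-color frames, in red/green/blue display order."""
    red   = [[px[0] for px in row] for row in rgb_frame]
    green = [[px[1] for px in row] for row in rgb_frame]
    blue  = [[px[2] for px in row] for row in rgb_frame]
    return red, green, blue

# A 2x2 example frame:
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (10, 20, 30)]]
r, g, b = to_sequential_frames(frame)
print(r)  # [[255, 0], [0, 10]]
print(g)  # [[0, 255], [0, 20]]
print(b)  # [[0, 0], [255, 30]]
```

In the Figure 109B configuration, the processor's display interface would emit frames already in this per-color form, which is why the intermediary driver (and its frame-reorganizing memory) can be removed.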
Figure 89 is a block diagram of an illumination module according to an embodiment of the invention. The illumination module 8900 comprises light sources, a combiner, and an optical frequency converter. Light sources 8902, 8904 emit optical radiation 8910, 8914 toward input ports 8922, 8924 of a combiner 8906. The combiner 8906 has a combiner output port 8926, which emits combined radiation 8918. The combined radiation 8918 is received by an optical frequency converter 8908, which provides output optical radiation 8928. The optical frequency converter 8908 can also provide feedback radiation 8920 to the combiner output port 8926. The combiner 8906 splits the feedback radiation 8920 to provide source feedback radiation 8912 emitted from input port 8922 and source feedback radiation 8916 emitted from input port 8924. Source feedback radiation 8912 is received by light source 8902, and source feedback radiation 8916 is received by light source 8904. The optical radiation 8910 and source feedback radiation 8912 between light source 8902 and combiner 8906 can propagate in any combination of free space and/or guiding structures (e.g., optical fiber or any other optical waveguide). The optical radiation 8914, source feedback radiation 8916, combined radiation 8918, and feedback radiation 8920 can likewise propagate in any combination of free space and/or guiding structures.
Suitable light sources 8902 and 8904 include one or more LEDs or any optical emitter whose emission wavelength is influenced by optical feedback. Examples of sources include lasers, which may be semiconductor diode lasers. For example, light sources 8902 and 8904 may be elements of a semiconductor laser array. Sources other than lasers can also be employed (e.g., an optical frequency converter can serve as a source). Although two sources are shown in Figure 89, the invention can also be practiced with more than two sources. The combiner 8906 is depicted generally as a three-port device with ports 8922, 8924, and 8926. Although ports 8922 and 8924 are referred to as input ports and port 8926 is referred to as the combiner output port, these ports can be bidirectional, and all of them can receive as well as emit optical radiation, as described above.
The combiner 8906 may comprise a wavelength-dispersive element and optical elements defining the ports. Suitable wavelength-dispersive elements include arrayed waveguide gratings, reflective diffraction gratings, transmissive diffraction gratings, holographic optical elements, assemblies of wavelength-selective filters, and photonic bandgap structures. The combiner 8906 can thus be a wavelength combiner, in which the input ports have respective, non-overlapping input-port wavelength ranges that couple efficiently to the combiner output port.
Various nonlinear optical processes can be carried out in the optical frequency converter 8908, including but not limited to: sum frequency generation (SFG), second harmonic generation (SHG), difference frequency generation, parametric generation, parametric amplification, parametric oscillation, three-wave mixing, four-wave mixing, stimulated Raman scattering, stimulated Brillouin scattering, stimulated emission, acousto-optic frequency shifting, and/or electro-optic frequency shifting.
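For reference (standard textbook relations, not specific to this document), the photon-energy relations of the main parametric processes named above are:

```latex
\text{SHG:}\quad \omega_{\mathrm{out}} = 2\,\omega_{1}
\qquad
\text{SFG:}\quad \omega_{\mathrm{out}} = \omega_{1} + \omega_{2}
\qquad
\text{DFG:}\quad \omega_{\mathrm{out}} = \omega_{1} - \omega_{2}
```

These relations express energy conservation per converted photon; which process dominates in a given converter is set by the phase-matching condition discussed below.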
In general, the optical frequency converter 8908 accepts optical input in an input set of optical wavelengths and provides optical output in an output set of optical wavelengths, where the output set differs from the input set.
The optical frequency converter 8908 may comprise a nonlinear optical material, such as lithium niobate, lithium tantalate, potassium titanyl phosphate, potassium niobate, quartz, silicon, silicon oxynitride, gallium arsenide, lithium borate, and/or barium metaborate. Light in the optical frequency converter 8908 can interact in various structures, including bulk structures, waveguides, quantum well structures, quantum wire structures, quantum dot structures, photonic bandgap structures, and/or multi-component waveguiding structures.
Where the optical frequency converter 8908 provides a parametric nonlinear optical process, that nonlinear optical process is preferably phase-matched. Such phase matching can be birefringent phase matching or quasi-phase matching. Quasi-phase matching can include the methods disclosed in United States Patent 7,116,468 to Miller, the disclosure of which is incorporated herein by reference.
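As standard background (these relations are not derived in this document), first-order quasi-phase matching compensates the wave-vector mismatch of the interaction with a poling period \( \Lambda \); for second harmonic generation of a pump at wavelength \( \lambda_{\omega} \):

```latex
\Delta k \;=\; k_{2\omega} - 2\,k_{\omega} - \frac{2\pi}{\Lambda} \;=\; 0
\quad\Longrightarrow\quad
\Lambda \;=\; \frac{\lambda_{\omega}}{2\,\bigl(n_{2\omega} - n_{\omega}\bigr)}
```

Here \( n_{\omega} \) and \( n_{2\omega} \) are the effective indices at the pump and second-harmonic wavelengths, so the required poling period shrinks as the material dispersion grows.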
The optical frequency converter 8908 can also include various elements that improve its operation, such as wavelength-selective reflectors for wavelength-selective output coupling, wavelength-selective reflectors for wavelength-selective resonance, and/or wavelength-selective loss elements for controlling the spectral response of the converter.
In embodiments, multiple illumination modules as described in Figure 89 can be combined to form a composite illumination module.
One component of the illumination module can be a diffraction grating, or grating, as further described herein. The diffraction grating sheet can be less than 1 mm thick, yet still robust enough to be permanently bonded in place on, or to replace, the LCoS cover glass. One advantage of using a grating in the illumination module with a laser illumination source is that it increases efficiency and reduces power consumption. A grating can inherently produce less stray light, and because its wavelength band is narrower, it allows more options for filtering the eye glow with less reduction in see-through brightness.
Figure 90 depicts a block diagram of an optical frequency converter according to an embodiment of the invention. Figure 90 shows how the feedback radiation 8920 can be provided by an exemplary optical frequency converter 8908 that performs parametric frequency conversion. The combined radiation 8918 provides forward radiation 9002 propagating toward the right of Figure 90 within the optical frequency converter 8908, and parametric radiation 9004, also propagating toward the right of Figure 90, is generated within the optical frequency converter 8908 and exits the optical frequency converter 8908 as the output optical radiation 8928. Typically, as the interaction proceeds (in this example, as the radiation propagates toward the right), there is a net transfer of power from the forward radiation 9002 to the parametric radiation 9004. A reflector 9008, which may have a wavelength-dependent transmittance, can be placed within the optical frequency converter 8908 to reflect (or partially reflect) the forward radiation 9002 to provide backward radiation 9006, or can be placed outside the optical frequency converter 8908 after the end face 9010. The reflector 9008 can be a grating, an internal interface, a coated or uncoated end face, or any combination thereof. A preferred level of reflectivity for the reflector 9008 is greater than 90%. A reflector located at the input interface 9012 provides purely linear feedback (i.e., feedback uncorrelated with the conversion efficiency). A reflector located at the end face 9010 provides maximal nonlinear feedback, because the dependence of the forward power on the conversion efficiency is maximal at the output interface (for a phase-matched parametric interaction).
Figure 91 is a block diagram of a laser illumination module according to an embodiment of the invention. Although a laser is used in this embodiment, it is to be understood that other light sources, such as LEDs, can also be used. The laser illumination module 9100 comprises a diode laser array 9102, waveguides 9104 and 9106, star couplers 9108 and 9110, and an optical frequency converter 9114. The diode laser array 9102 has laser emitting elements coupled to waveguides 9104, and the waveguides 9104 lead to the input ports (such as ports 8922 and 8924 of Figure 89) of a planar waveguide star coupler 9108. The star coupler 9108 is coupled to another planar waveguide star coupler 9110 by waveguides 9106 of differing lengths. The combination of star couplers 9108 and 9110 and waveguides 9106 can constitute an arrayed waveguide grating and serve as the wavelength combiner (e.g., the combiner 8906 of Figure 89) that provides the combined radiation 8918 to waveguide 9112. Waveguide 9112 provides the combined radiation 8918 to the optical frequency converter 9114. Within the optical frequency converter 9114, an optional reflector 9116 provides back-reflection of the combined radiation 8918. As described above in connection with Figure 90, this back-reflection provides nonlinear feedback according to embodiments of the invention. One or more of the elements described with reference to Figure 91 can be fabricated on a common substrate using planar deposition processes and/or photolithographic methods, reducing cost, part count, and alignment requirements.
A second waveguide can be placed with its core in proximity to the waveguide core of the optical frequency converter 8908. As is known in the art, this arrangement of waveguides acts as a directional coupler, so that radiation in the second waveguide can provide additional radiation in the optical frequency converter 8908. Significant coupling can be avoided by providing radiation at a wavelength different from that of the forward radiation 9002, or the spurious radiation can be coupled into the optical frequency converter 8908 at a location where the forward radiation 9002 is depleted.
Although a standing-wave feedback configuration, in which the feedback power counter-propagates along the same path traveled by the input power, is useful, a traveling-wave feedback configuration can also be used. In a traveling-wave feedback configuration, the feedback re-enters the gain medium at a location different from the location where the input power exits.
Figure 92 is a block diagram of a composite laser illumination module according to another embodiment of the invention. The composite laser illumination module 9200 comprises one or more of the laser illumination modules 9100 described with reference to Figure 91. Although for simplicity Figure 92 shows a composite laser illumination module 9200 comprising three laser illumination modules 9100, the composite laser illumination module 9200 can comprise more or fewer laser illumination modules 9100. The diode laser array 9210 can comprise one or more diode laser arrays 9102, and can be a laser diode array and/or a semiconductor laser array configured to emit optical radiation in the infrared spectrum (i.e., wavelengths shorter than radio waves but longer than visible light).
Laser array output waveguides 9220 are coupled to the diode lasers in the diode laser array 9210 and direct the output of the diode laser array 9210 to star couplers 9108A-C. The laser array output waveguides 9220, the arrayed waveguide gratings 9230, and the optical frequency converters 9114A-C can be fabricated on a single substrate using planar lightwave circuits, and can comprise silicon oxynitride waveguides and/or lithium tantalate waveguides.
The arrayed waveguide gratings 9230 comprise star couplers 9108A-C, waveguides 9106A-C, and star couplers 9110A-C. Waveguides 9112A-C provide the combined radiation to optical frequency converters 9114A-C, respectively, and provide feedback radiation to star couplers 9110A-C.
The optical frequency converters 9114A-C can comprise nonlinear optical (NLO) elements, for example optical parametric oscillator elements and/or quasi-phase-matched optical elements.
The composite laser illumination module 9200 can produce output optical radiation of multiple wavelengths. These multiple wavelengths can be in the visible spectrum, with wavelengths shorter than infrared light but longer than ultraviolet light. For example, waveguide 9240A can provide output optical radiation between about 450 nm and about 470 nm; similarly, waveguide 9240B can provide output optical radiation between about 525 nm and about 545 nm, and waveguide 9240C can provide output optical radiation between about 615 nm and about 660 nm. These output optical radiation ranges can be selected to provide visible wavelengths pleasing to a human viewer (e.g., blue, green, and red wavelengths, respectively), and can be recombined to produce a white light output.
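As a hedged arithmetic sketch (the document does not state these pump values): if each output band were produced purely by second harmonic generation, the required infrared pump wavelength would simply be twice the output wavelength, since frequency doubling halves the wavelength.

```python
def shg_pump_nm(output_nm: float) -> float:
    """Pump wavelength needed to reach `output_nm` by second harmonic
    generation (frequency doubling halves the wavelength)."""
    return 2.0 * output_nm

# Visible bands quoted in the text and the IR pump bands SHG would imply:
for lo, hi, color in [(450, 470, "blue"), (525, 545, "green"), (615, 660, "red")]:
    print(f"{color}: pump {shg_pump_nm(lo):.0f}-{shg_pump_nm(hi):.0f} nm")
```

Note the document's own red/green example further involves an optical parametric oscillator stage (Figure 92 discussion), so the actual pump wavelengths need not follow this simple doubling relation.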
The waveguides 9240A-C can be fabricated on the same planar lightwave circuit as the laser array output waveguides 9220, the arrayed waveguide gratings 9230, and the optical frequency converters 9114A-C. In some embodiments, the output optical radiation provided by each of the waveguides 9240A-C can provide optical power in the range of about 1 watt to about 20 watts.
The optical frequency converter 9114 can comprise a quasi-phase-matched wavelength conversion waveguide configured to perform second harmonic generation (SHG) on the combined radiation of the first wavelength to generate radiation of a second wavelength. The quasi-phase-matched wavelength conversion waveguide can be configured so that the radiation of the second wavelength pumps an optical parametric oscillator integrated into the quasi-phase-matched wavelength conversion waveguide to produce radiation of a third wavelength, optionally different from the second wavelength. The quasi-phase-matched wavelength conversion waveguide can also produce feedback radiation that propagates through the arrayed waveguide grating 9230 via waveguide 9112 to the diode laser array 9210, so that each laser in the diode laser array 9210 can operate at a unique wavelength determined by its corresponding port on the arrayed waveguide grating.
For example, the composite laser illumination module 9200 can be configured to use a diode laser array 9210 operating at a nominal wavelength of about 830 nm to generate output optical radiation in the visible spectrum corresponding to any of the colors red, green, or blue.
The composite laser illumination module 9200 can optionally be configured to directly illuminate a spatial light modulator without intervening optics. In some embodiments, the composite laser illumination module 9200 can use a diode laser array 9210 operating at a single nominal first wavelength to simultaneously produce output optical radiation of multiple second wavelengths (such as wavelengths corresponding to red, green, or blue). Each different second wavelength can be produced by an instance of the laser illumination module 9100.
The composite laser illumination module 9200 can be configured to produce diffraction-limited white light, for example by combining the output optical radiation of the multiple second wavelengths into a single waveguide using wavelength-selective taps (not shown).
The diode laser array 9210, laser array output waveguides 9220, arrayed waveguide gratings 9230, waveguides 9112, optical frequency converters 9114, and frequency converter output waveguides 9240 can be fabricated on a common substrate using fabrication processes such as deposition or photolithography. As described with reference to Figure 92, a beam-forming element 9250 is coupled to the composite laser illumination module 9200 by the waveguides 9240A-C.
Beam forming element 9250 can be placed on the same substrate as composite laser illumination module 9200. The substrate can comprise, for example, a thermally conductive material, a semiconductor material, or a ceramic material. The substrate can comprise copper-tungsten, silicon, gallium arsenide, lithium tantalate, silicon oxynitride, and/or gallium nitride, and can be processed with semiconductor fabrication processes including coating, photolithography, etching, deposition, and implantation.
Some of these elements (such as diode laser array 9210, laser array output waveguides 9220, arrayed waveguide grating 9230, waveguide 9112, optical frequency converter 9114, waveguides 9240, beam forming element 9250, and various associated planar lightwave circuits) can be passively coupled and/or passively aligned, and in some embodiments are passively aligned at high packing density on a common substrate. Each of waveguides 9240A-C can be coupled to a different instance of beam forming element 9250, rather than to a single element as shown in the figure.
Beam forming element 9250 can be configured to shape the output optical radiation from waveguides 9240A-C into a diffraction-limited beam of substantially rectangular cross-section, and further to shape the output optical radiation from waveguides 9240A-C so that the substantially rectangular beam has a brightness uniformity higher than approximately 95%.
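As a hedged illustration of what a greater-than-95% brightness uniformity specification can mean (the metric definition below is an assumption, not stated in the text), one common convention is the ratio of minimum to maximum irradiance across the beam:

```python
def brightness_uniformity(samples):
    # min/max irradiance ratio across sampled points of the beam profile;
    # 1.0 would be a perfectly flat (top-hat) profile
    return min(samples) / max(samples)

flat_top = [0.97, 1.00, 0.99, 0.98, 1.00]       # hypothetical flat-top samples
gaussian_like = [0.40, 0.85, 1.00, 0.85, 0.40]  # hypothetical Gaussian samples

print(brightness_uniformity(flat_top))       # 0.97 -> meets a >0.95 spec
print(brightness_uniformity(gaussian_like))  # 0.40 -> does not
```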
Beam forming element 9250 can comprise an aspheric lens, such as a "top-hat" microlens, a holographic element, or a grating. In some embodiments, the diffraction-limited beam output by beam forming element 9250 produces substantially reduced speckle or is speckle-free. The beam output by beam forming element 9250 can provide optical power in the range between approximately 1 watt and approximately 20 watts, and can have a substantially flat phase front.
Figure 93 is a block diagram of an imaging system according to one embodiment of the invention. Imaging system 9300 comprises light engine 9310, light beam 9320, spatial light modulator 9330, modulated light beam 9340, and projection lens 9350. Light engine 9310 can be a composite illumination module, such as one comprising the multiple illumination modules 8900 described with reference to Figure 89, the composite laser illumination module 9200 described with reference to Figure 92, or a laser illumination system as described herein. Spatial light modulator 9330 can be a 3LCD system, a DLP system, an LCoS system, a transmissive liquid crystal display, a liquid-crystal-on-silicon array, a grating-based light valve, or another micro-display, micro-projection system, or reflective display.
Spatial light modulator 9330 can be configured to spatially modulate light beam 9320. Spatial light modulator 9330 can be coupled to electronic circuitry configured to cause spatial light modulator 9330 to modulate a video image (such as one that could be shown by a television or computer monitor) onto light beam 9320 to produce modulated light beam 9340. In some embodiments, operating by reflection, modulated light beam 9340 can exit the spatial light modulator from the same side on which light beam 9320 is received. In other embodiments, operating by transmission, modulated light beam 9340 can exit the spatial light modulator from the side opposite that on which light beam 9320 is received. Modulated light beam 9340 is optionally coupled into projection lens 9350. Projection lens 9350 is typically configured to project modulated light beam 9340 onto a display such as a video display screen.
A method of illuminating a video display can be carried out using a composite illumination module (such as one comprising multiple illumination modules 8900), composite laser illumination module 9100, laser illumination system 9200, or imaging system 9300. A diffraction-limited output beam is generated by the composite illumination module, composite laser illumination module 9100, laser illumination system 9200, or light engine 9310. The output beam is directed using a spatial light modulator (such as spatial light modulator 9330) and, optionally, projection lens 9350. The spatial light modulator can project an image onto a display such as a video display screen.
An illumination module can be configured to emit any number of wavelengths, including one, two, three, four, five, six, or more, with the wavelengths spaced apart by varying amounts and having equal or unequal power levels. An illumination module can be configured to emit a single wavelength per beam or multiple wavelengths per beam. An illumination module can also comprise additional components and functions, including polarization controllers, polarization rotators, power supplies, power circuits such as power FETs, electronic control circuits, thermal management systems, heat pipes, and safety interlocks. In some embodiments, an illumination module can be coupled to an optical fiber or light guide, such as one made of glass (e.g., BK7).
Some options for LCoS front-light designs include: 1) a wedge with a multi-layer coating (MLC). This concept uses the MLC to define the angles of specular reflection and transmission; 2) a wedge with a polarizing beam splitter coating. This concept works like a conventional PBS cube but at a much shallower angle. The coating can be a PBS coating or a wire-grid film; 3) PBS prism bars (similar to option #2, but with a seam down the center of the panel); 4) a wire-grid polarizer sheet beam splitter (similar to the PBS wedge, but a thin sheet only, so that it is mostly air rather than solid glass); and 5) a polarizing beam splitter (PBS) comprising a flexible film, such as the 3M polarizing beam splitter made by alternating layers of different plastics with customized refractive indices (so that the indices match in one in-plane direction but not the other). In the non-matching direction, the layers form a highly reflective quarter-wave stack; in the matching direction, the film acts as a transparent plastic plate. The film can be laminated between glass prisms to provide a high-performance wide-angle PBS for fast beams over the full visible range. An MLC wedge can be robust and can be glued firmly in place, with no air gap that could deflect with heat or cold. It can be used with a broadband LED light source. In various embodiments, for a complete module, the MLC wedge can replace the cover glass of the LCoS. The MLC wedge can be less than approximately 4 mm thick. In one embodiment, the MLC wedge can be 2 mm thick or thinner.
It should be appreciated that the invention provides for the deployment of front-light systems in all types of optical configurations described herein, which may include, but need not include, an augmented reality eyepiece. A front-light system can be used as a source of direct or indirect illumination in an optical system of any type, and is especially preferred for illuminating any one or more optical elements, optical surfaces, or optical sensors, most preferably those with alternative light-path configurations, such as an LCoS or liquid crystal display, and/or reflected light. In some embodiments, at least some of the light produced by the front-light system can be reflected so that, on its way to its final destination (e.g., the eye, an optical sensor, etc.), it passes back through part of the front-light system; in other embodiments, the light produced does not pass back through the front-light system on its way to its final destination. For example, the front-light system can illuminate optics such as an LCoS to obtain image light, and the image light can be directed back through components of the front-light system before being conditioned by one or more additional optical systems and finally received by the user's eye. Such other optical systems can be, or can include among their components: waveguides (which can be freeform waveguides), beam splitters, collimators, polarizers, mirrors, lenses, and diffraction gratings.
Figure 95 depicts an embodiment of an LCoS front-light design. In this embodiment, illumination from RGB LED 9508 passes through front light 9504, which can be a wedge, a PBS, etc. The light strikes polarizer 9510 and is emitted in its S state toward LCoS 9502, where it is reflected back as image light in its P state through asphere 9512. Inline polarizer 9514 can re-polarize the image light and/or impart a half-wave rotation to the S state. The image light then hits wire-grid polarizer 9520 and reflects toward curved (spherical) partial mirror 9524, passing through half-wave retarder 9522 on the way. The image light reflects from the mirror toward the user's eye 9518, passing again through half-wave retarder 9522 and wire-grid polarizer 9520. Various examples of front light 9504 will now be discussed.
In various embodiments, the optics assembly comprises partially reflective, partially transmissive optical elements that reflect a portion of the image light transmitted from the image source while transmitting scene light from the see-through view of the surrounding environment, so that a combined image made up of the reflected image light and the transmitted scene light is provided to the user's eye.
In a portable display system, it is important to provide a display that is bright, compact, and lightweight. Portable display systems include mobile phones, laptop computers, tablet computers, and head mounted displays.
The invention provides a compact, lightweight front light for a portable display system, in which a curved or otherwise non-planar wire-grid polarizer film acts as a partial reflector that deflects light from an edge light to efficiently illuminate a reflective image source. Wire-grid polarizers are known to provide highly efficient reflection of one polarization state while allowing the other polarization state to pass. Although glass-sheet wire-grid polarizers are known in the industry, and rigid wire-grid polarizers can be used in the present invention, in a preferred embodiment a flexible wire-grid polarizer film is used as the curved wire-grid polarizer. Suitable wire-grid polarizer film is available from Asahi-Kasei E-materials of Tokyo, Japan.
An edge light provides a compact form of illumination for a display, but because it is located at the edge of the image source, the light must be deflected 90 degrees to illuminate the image source. In one embodiment of the invention, a curved wire-grid polarizer film is used as a partially reflective surface, so that light provided by the edge light is deflected downward to illuminate the reflective image source. A polarizer is provided adjacent to the edge light so that polarized illumination light is provided to the curved wire-grid polarizer. The polarizer and the wire-grid polarizer are oriented so that light passing through the polarizer is reflected by the wire-grid polarizer. Because of the quarter-wave retarder film included in the reflective image source, the reflected image light has the opposite polarization state relative to the illumination light. As a result, the reflected image light continues through the wire-grid polarizer film to the display optics. By using a flexible wire-grid polarizer film as the partial reflector, the partially reflective surface can be curved in a lightweight structure in which the wire-grid polarizer plays the dual role of reflector for the illumination light and transparent member for the image light. An advantage of wire-grid polarizer film is that it can receive image light over a wide range of incident angles, so that the curve does not interfere with the image light passing through to the display optics. In addition, because the wire-grid polarizer film is thin (e.g., less than 200 microns), the curved shape does not significantly distort the image light as it passes through to the display optics. Finally, the wire-grid polarizer has a very low tendency to scatter light, so high image contrast is maintained.
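The polarization round trip described above can be sketched with a simplified Jones-style model. This is an editorial illustration with ideal elements and hypothetical function names; the quarter-wave film traversed twice (in and out of the image source) is modeled by its well-known half-wave equivalent, which exchanges the two linear field components:

```python
def wgp_reflect(E):
    # ideal wire-grid polarizer, wires along y: reflects the s (y) component
    return (0.0, E[1])

def wgp_transmit(E):
    # the same polarizer transmits the p (x) component
    return (E[0], 0.0)

def lcos_bright_pixel(E):
    # quarter-wave retarder traversed twice acts as a half-wave plate
    # at 45 degrees, exchanging the x and y field components
    return (E[1], E[0])

s_illumination = (0.0, 1.0)             # polarized edge light
down = wgp_reflect(s_illumination)      # deflected down onto the image source
image_light = lcos_bright_pixel(down)   # polarization flipped on reflection
out = wgp_transmit(image_light)
print(out)  # (1.0, 0.0): the image light continues through to the display optics
```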
Figure 136 shows a schematic diagram of an image source 13600 with the front light of the present invention. Edge light 13602 provides illumination light that passes through polarizer 13614, so that illumination light 13610 is polarized; polarizer 13614 can be an absorptive polarizer or a reflective polarizer. The polarizer is oriented so that the polarization state of illumination light 13610 is such that the light is reflected by curved wire-grid polarizer 13608, thereby deflecting illumination light 13610 downward toward reflective image source 13604. Accordingly, the transmission axis of polarizer 13614 is perpendicular to the transmission axis of wire-grid polarizer 13608. Although Figure 136 shows the image source 13600 of the front light in a horizontal orientation, those skilled in the art will recognize that other orientations are equally possible. As described, a reflective image source such as an LCoS image source typically includes a quarter-wave retarder film, so that the polarization state of the illumination light is changed upon reflection by the reflective image source and, as a result, the image light has a substantially opposite polarization state compared to the illumination light. As is known to those skilled in the art and described in U.S. Patent 4,398,805, this change in polarization state is fundamental to the operation of all liquid-crystal-based displays. For different portions of the image, the liquid crystal cells in reflective image source 13604 cause greater or lesser changes in polarization state, so that the reflected image light 13612 has mixed elliptical polarization states before passing through the curved wire-grid polarizer. After passing through curved wire-grid polarizer 13608 and any additional polarizers that may be included in the display optics, the polarization state of image light 13612 is determined by curved wire-grid polarizer 13608, and the image content carried in image light 13612 determines the local intensity of image light 13612 in the image displayed by the portable display system.
The flexible nature of the wire-grid polarizer film used in curved wire-grid polarizer 13608 allows it to be shaped so as to focus illumination light 13610 onto reflective image source 13604. The shape of the curve of the curved wire-grid polarizer is chosen to provide uniform illumination of the reflective image source. Figure 136 shows curved wire-grid polarizer 13608 with a parabolic shape, but radial curves, complex splines, or flat planes are also possible for deflecting illumination light 13610 uniformly onto reflective image source 13604, depending on the nature of edge light 13602. Experiments show that parabolic, radial, and complex spline shapes all provide more uniform illumination than a flat surface. However, in some very thin front-light image sources, a flat wire-grid polarizer film can be used effectively to provide a lightweight portable display system. The shape of the flexible wire-grid polarizer film can be maintained with side frames having grooves of suitable profile, as shown in Figure 138, to hold the wire-grid polarizer in position; Figure 138 shows a schematic diagram of front-light image source assembly 13800. Side frame 13802 is shown with curved groove 13804 for holding the flexible wire-grid polarizer film in the desired curved shape. Although only one side frame 13802 is shown in Figure 138, two side frames 13802 and other components of the front-light image source can be used to support the curved shape on either side. In any case, because the majority of the front-light image source of the invention consists of air and the wire-grid polarizer film is very thin, the weight is low compared with prior-art front-light systems.
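Why a parabolic section helps with uniformity can be shown with elementary ray geometry. The sketch below is an editorial aside, not the patent's method: a point-like edge source at the focus of the parabola y = x²/4f reflects into mutually parallel rays, which spread the light evenly rather than concentrating it:

```python
import math

def reflect(d, n):
    # specular reflection of direction d about unit normal n
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def parabola_reflection(x, f=1.0):
    # parabola y = x^2 / (4 f) with focus at (0, f)
    y = x * x / (4 * f)
    # unit direction of a ray leaving the focus toward the surface point (x, y)
    d = (x, y - f)
    mag = math.hypot(*d)
    d = (d[0] / mag, d[1] / mag)
    # unit surface normal of y - x^2/(4 f) = 0
    n = (-x / (2 * f), 1.0)
    mag = math.hypot(*n)
    n = (n[0] / mag, n[1] / mag)
    return reflect(d, n)

# rays from the focus all reflect into the same axial direction (0, 1)
for x in (0.5, 1.0, 2.0):
    dx, dy = parabola_reflection(x)
    assert abs(dx) < 1e-9 and abs(dy - 1.0) < 1e-9
```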
In another embodiment of the present invention, a front-light image source 13700 is provided that has two or more edge lights 13702 arranged along two or more edges of reflective image source 13604. A polarizer 13712 is provided adjacent to each edge light 13702 to polarize illumination light 13708. Illumination light 13708 is deflected by curved wire-grid polarizer 13704 to illuminate reflective image source 13604. The reflected image light 13710 then passes through curved wire-grid polarizer 13704 and into the display optics. An advantage of using two or more edge lights 13702 is that more light can be applied to reflective image source 13604, thereby providing a brighter image.
The edge lights can be fluorescent lights, incandescent lights, organic light emitting diodes, lasers, or electroluminescent lights. In a preferred embodiment of the invention, each edge light is an array of three or more light emitting diodes. To illuminate the reflective image source uniformly, the edge light should have a substantial cone angle; for example, the edge light can be a Lambertian source. In the case of a laser light source, the cone angle of the light needs to be expanded. By using an array of light sources or multiple edge lights, the distribution of light onto the reflective image source can be adjusted to provide more uniform illumination and, as a result, to make the brightness of the displayed image more uniform.
The image light provided by the front-light image source of the invention passes into the display optics of the portable display system. Depending on how the displayed image is used, various display optics are possible. For example, when the display is a flat-screen display, the display optics can be diffusive; alternatively, when the display is a near-to-eye display or head mounted display, the display optics can be refractive or diffractive.
Figure 139 is a flow chart of a method of the invention for a portable display system having a reflective image source. In step 13900, polarized illumination light is provided to one or more edges of the reflective image source. In step 13902, the curved wire-grid polarizer receives the illumination light and deflects it to illuminate the reflective image source, where the curve of the wire-grid polarizer is chosen to improve the uniformity of illumination over the area of the reflective image source. In step 13904, the reflective image source receives and reflects the illumination light while changing its polarization state in correspondence with the image being displayed. In step 13908, the image light then passes through the curved wire-grid polarizer and into the display optics. In step 13910, the image is displayed by the portable display system.
In various embodiments, a lightweight portable display system with a reflective LCD image source for displaying images can comprise one or more edge lights (providing polarized illumination light adjacent to one or more edges of the reflective LCD image source), a curved wire-grid polarizer partial reflector (which can receive the polarized illumination light and deflect it to illuminate the reflective LCD image source), and display optics (which receive the image light reflected from the reflective LCD image source and display the image). In addition, the one or more edge lights can comprise light emitting diodes. In various embodiments, the wire-grid polarizer can be a flexible film, and the flexible film can be held in a curved shape by side frames. In various embodiments, the curved wire-grid polarizer of the display system can be parabolic, radial, or a complex spline. In addition, the reflective LCD image source of the display system can be an LCoS. In various embodiments, the display optics of the display system can comprise a diffuser, and the display system can be a flat-screen display. In various embodiments, the display optics of the display system can comprise refractive or diffractive elements, and the display system can be a near-to-eye display or head mounted display.
In various embodiments, a method for providing an image on a lightweight portable display system with a reflective LCD image source can comprise: providing polarized illumination light to one or more edges of the reflective LCD image source; receiving the illumination light with a curved wire-grid polarizer and deflecting it to illuminate the reflective LCD image source; reflecting the illumination light while changing its polarization state in correspondence with the image to be displayed by the reflective LCD image source, thereby providing image light; passing the image light through the curved wire-grid polarizer; receiving the image light with display optics; and displaying the image. In various embodiments of the method, the curved shape of the curved wire-grid polarizer can be chosen to improve the uniformity of illumination of the reflective LCD image source. In addition, the one or more edge lights can comprise light emitting diodes. In various embodiments, the wire-grid polarizer can be a flexible film. In addition, the flexible film can be held in a curved shape by side frames. In various embodiments of the invention, the curved wire-grid polarizer can be parabolic, radial, or a complex spline. In addition, in various embodiments of the above method, the reflective LCD image source can be an LCoS. In various embodiments, the display optics can comprise a diffuser, and the display system can be a flat-screen display. In various embodiments of the above method, the display optics can comprise refractive or diffractive elements, and the display system can be a near-to-eye display or head mounted display.
Figure 96 depicts an embodiment of a front light 9504 comprising optically bonded prisms with a polarizer. The prism appears as two cuboids with a substantially transparent interface 9602 between them. Each prism is bisected diagonally, and a polarizing film 9604 is placed along the bisecting interface. The triangles formed at the lower positions of the bisected cuboids can optionally be made as a single piece 9608. The prisms can be made of BK-7 or equivalent. In this embodiment, the cuboids have square ends measuring 2 mm by 2 mm, and the length of each cuboid is 10 mm. In an alternative embodiment, the bisection comprises a 50% mirror surface 9704, and the interface between the two cuboids comprises a polarizer 9702 that transmits the P state.
Figure 98 depicts three versions of an LCoS front-light design. Figure 98A depicts a wedge with a multi-layer coating (MLC). This concept uses the MLC to define the angles of specular reflection and transmission. In this embodiment, image light in either the P or S polarization state is observed by the user's eye. Figure 98B depicts a PBS with a polarizer coating. Here, only S-polarized image light is emitted to the user's eye. Figure 98C depicts a right-angle prism from which much of the prism material has been removed, so that the image light passes through air and is emitted as S-polarized light.
Figure 99 depicts a wedge plus PBS with a polarizing film 9902 stacked on LCoS 9904.
Figure 100 depicts two embodiments of the prism, in which light enters along the short end (A) or along the long end (B). In Figure 100A, a wedge is formed by offsetting the bisection of the cuboid so that the bisecting interface forms an angle of at least 8.6 degrees. In this embodiment, the offset bisection yields, on the side where RGB LED 10002 emits light, one part 0.5 mm high and another part 1.5 mm high. A polarizing film 10004 is placed along the bisection. In Figure 100B, a wedge is formed by offsetting the bisection of the cuboid so that the bisecting interface forms an angle of at least 14.3 degrees. In this embodiment, the offset bisection yields, on the side where RGB LED 10008 emits light, one part 0.5 mm high and another part 1.5 mm high. A polarizing film 10010 is placed along the bisection.
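The quoted wedge angles are consistent with simple right-triangle arithmetic. The mapping below is our reading, not geometry stated in the patent: for instance, a 1.5 mm rise over the 10 mm prism length gives an angle near the 8.6 degrees quoted for Figure 100A, and a 0.5 mm rise over the 2 mm end width gives an angle near the 14.3 degrees quoted for Figure 100B:

```python
import math

def wedge_angle_deg(rise_mm, run_mm):
    # angle of a bisecting plane that climbs rise_mm over a run of run_mm
    return math.degrees(math.atan(rise_mm / run_mm))

print(round(wedge_angle_deg(1.5, 10.0), 2))  # ~8.53 degrees (quoted: 8.6)
print(round(wedge_angle_deg(0.5, 2.0), 2))   # ~14.04 degrees (quoted: 14.3)
```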
Figure 101 depicts a curved PBS film 10104 illuminated by RGB LEDs 10102 and placed over LCoS chip 10108. PBS film 10104 reflects the RGB light from LED array 10102 onto the surface of the LCoS chip, while light reflected from the imager chip passes through unimpeded to reach the optics assembly and ultimately the user's eye. Films used in this system include Asahi film, which has a triacetate cellulose or cellulose acetate (TAC) substrate. In various embodiments, the film can have 100 nm UV-embossed ridges with a calendered coating built on the ridges, which can be angled with respect to the incident angle of the light. Asahi film can come in rolls 20 cm wide by 30 meters long, and exhibits brightness-enhancement-film (BEF) properties when used in LCD illumination. Asahi film can support wavelengths from visible light to IR, and remains stable up to 100 °C.
In another embodiment, Figures 21 and 22 depict an alternative arrangement of the waveguide and projector in exploded view. In this arrangement, the projector is placed just behind the hinge of the eyepiece arm and is oriented vertically, so that the RGB LED signals initially travel vertically until their direction is changed by a reflecting prism to enter the waveguide lens. The vertically arranged projection engine can have the PBS 218 at the center, the RGB LED array at the bottom, a hollow tapered tunnel with film diffusers to mix the colors, and a collection and condenser lens in the optics. The PBS can have a pre-polarizer on its entrance face. The pre-polarizer can be aligned to transmit light of a certain polarization (such as p-polarized light) and to reflect (or absorb) light of the opposite polarization (such as s-polarized light). The polarized light can then pass through the PBS to field lens 216. The purpose of field lens 216 can be to produce near-telecentric illumination of the LCoS panel. The LCoS display can be truly reflective, reflecting colors in the correct temporal sequence so that the image is displayed correctly. Light can reflect from the LCoS panel and, for bright areas of the image, can be rotated to s polarization. The light then passes back through field lens 216, can be reflected at the internal interface of the PBS, and leaves the projector toward the coupling lens. The hollow tapered tunnel 220 can replace the homogenizing lenslet array of other embodiments. By orienting the projector vertically and placing the PBS at the center, space is saved, and the projector can be placed in the hinge space with almost no moment arm hanging from the waveguide.
Light reflected or scattered from the image source or the associated optics of the eyepiece can pass outward into the environment. This escaping light is perceived by outside observers as "eye glow" or "night glow," in which portions of the lenses or the surrounding eyepiece region appear to glow when viewed in a relatively dark environment. In some cases of eye glow, as shown in Figure 22A, the displayed image can be seen as observable image 2202A in the display area when viewed externally by an outside observer. It is preferable to reduce eye glow, both to preserve the privacy of the image being viewed and to make the user less conspicuous when using the eyepiece in a relatively dark environment. Various methods and apparatus can reduce eye glow with light control elements, such as by employing a partially reflective mirror in the optics associated with the image light source, by employing polarizing optics, etc. For example, the light entering the waveguide can be polarized, such as s-polarized. The light control element can comprise a linear polarizer, where the linear polarizer is oriented relative to the linearly polarized image light so that the second portion of the linearly polarized image light is blocked by the partially reflective mirror arrangement and eye glow is reduced. In various embodiments, eye glow can be minimized or eliminated by attaching to the waveguide or frame (such as the snap-on optics described herein) a lens that is polarized oppositely to the light reflected from the user's eye (such as, in this example, p-polarized).
In various embodiments, the light control element can comprise a second quarter-wave film and a linear polarizer. The second quarter-wave film converts the second portion of the circularly polarized image light into linearly polarized image light whose polarization state is blocked by the linear polarizer in the light control element, so that eye glow is reduced. For example, when the light control element comprises a linear polarizer and a quarter-wave film, unpolarized scene light arriving from the external environment in front of the user is converted into linearly polarized light, and 50% of the light is blocked. The first portion of the scene light that passes through the linear polarizer is linearly polarized, and this light is converted into circularly polarized light by the quarter-wave film. The third portion of the scene light that is reflected from the partially reflective mirror has the reversed circular polarization state, and this light is then converted into linearly polarized light by the second quarter-wave film. The linear polarizer then blocks the reflected third portion of the scene light, thereby reducing the escaping light and reducing eye glow. Figure 22B shows an example of a see-through display assembly in an eyeglasses frame with a light control element. Eyeglasses cross-section 2200B shows the components of the see-through display assembly in eyeglasses frame 2202B. The light control element covers the entire see-through view seen by the user. In the field of view 2214B of the user's eye, supporting members 2204B and 2208B are shown supporting partially reflective mirror 2210B and beam splitter layer 2212B, respectively. Supporting members 2204B and 2208B and light control element 2218B are connected to eyeglasses frame 2202B. Other components, such as fold mirror 2220B and first quarter-wave film 2222B, are also connected to supporting members 2204B and 2208B so that the combined assembly is structurally rigid.
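The quarter-wave-plus-polarizer light trap described above is the classic circular-polarizer isolator. The intensity bookkeeping below is an editorial sketch assuming ideal lossless elements, with the mirror round trip through the quarter-wave film modeled by its half-wave equivalent:

```python
def polarizer_x(E):
    # linear polarizer with transmission axis along x
    return (E[0], 0.0)

def qw_mirror_qw(E):
    # quarter-wave film -> partial mirror -> quarter-wave film again:
    # equivalent to a half-wave plate at 45 deg, swapping x and y components
    return (E[1], E[0])

def intensity(E):
    return E[0] ** 2 + E[1] ** 2

# unpolarized scene light modeled as an equal mix of x- and y-polarized fields
scene = [(1.0, 0.0), (0.0, 1.0)]
entering = [polarizer_x(E) for E in scene]       # half the intensity survives
reflected = [qw_mirror_qw(E) for E in entering]  # portion bounced off the mirror
escaping = [polarizer_x(E) for E in reflected]   # blocked on the way back out

print(sum(intensity(E) for E in entering) / len(scene))  # 0.5 (50% blocked)
print(sum(intensity(E) for E in escaping))               # 0.0 -> no eye glow
```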
Stray light in compact optics such as head mounted displays typically comes from scattering off the sidewalls of the housing or other structures, where light strikes the surface at a steep angle. Such stray light produces bright areas of scattered light around the displayed image.
There are two approaches to reducing such stray light. One is to darken or roughen the sidewalls or other structures to reduce the proportion of light reflected. However, although this does increase the absorptivity of the surface, light scattered from the surface can still be noticed. The other approach is to provide baffles to block or trim the stray light. Blocking or trimming the light scattered from surfaces greatly reduces the effect of this stray light. Using both approaches in a head mounted display is beneficial for reducing stray light, because the bright areas around the displayed image are eliminated and the contrast of the displayed image is increased.
U.S. Patent 5,949,583 provides a visor over the top of a head mounted display to block stray light entering from above. However, this does not address the need to control stray light originating inside the head-mounted display system.
U.S. Patent 6,369,952 provides two masks to block light from around the edges of the liquid crystal display image source in a head mounted display. The first mask is located on the input side of the liquid crystal image source, adjacent to the backlight, and the second mask is located on the output side of the liquid crystal display. Because both masks are located close to the liquid crystal display, "the first mask 222 and the second mask 224 have apertures or windows 232, 234, respectively, which are substantially equal and congruent to the active area of the LCD" (column 15, lines 15-19). By locating the masks near the image source, the masks have very little effect on light emitted from the image source within the large cone angles near the center of the image source's active area. This large-cone-angle light can reflect from the sidewalls of the housing in various ways, forming stray light in the form of bright areas and causing reduced contrast.
Thus, there remains a need for methods of reducing stray light from its various sources inside a head mounted display.
Figure 160 shows an example of a display system with an optically flat reflective surface, in which the optically flat reflective surface is a beam splitter comprised of an optical film on a substrate, and the display system is a near-to-eye display 16002. In this example, the image source 16012 includes a projection system (not shown) that provides image light using an optical layout with a folded optical axis 16018 contained within the near-to-eye display 16002. The optics along the optical axis 16018 can include lenses that focus the image light to provide a focused image from the image source 16012 to a user's eye 16004. The beam splitter 16008 folds the optical axis 16018 from the image source 16012 to a spherical or aspherical reflector 16010. The beam splitter 16008 can be a partial mirror or a polarizing beam splitter. The beam splitter 16008 in the near-to-eye display 16002 is oriented at an angle so that at least a portion of the image light from the image source 16012 is redirected toward the reflector 16010. From the reflector 16010, at least another portion of the image light is reflected back toward the user's eye 16004. Another portion of the reflected image light passes through the beam splitter 16008 and is focused at the user's eye 16004. The reflector 16010 can be a mirror or a partial mirror. In the case where the reflector 16010 is a partial mirror, scene light from the scene in front of the near-to-eye display 16002 can combine with the image light, thereby presenting combined image light 16020, comprised of the image light along axis 16018 and the scene light 16014, to the user's eye 16004. The combined image light 16020 presents to the user's eye 16004 a combined image comprised of the scene with an overlaid image from the image source.
Figure 161 shows an illustration of a near-to-eye display module 200. The module 200 is comprised of a reflector 16104, an image source module 16108 and a beam splitter 16102. The module can be open at the sides, with attachments between at least some of the connecting edges of the reflector 16104, the image source module 16108 and the beam splitter 16102. Alternatively, the module 200 can be closed at the sides by sidewalls to provide a closed module that prevents dust, dirt and water from contacting the interior surfaces of the module 200. The reflector 16104, image source module 16108 and beam splitter 16102 can be manufactured separately and then attached together, or at least some of them can be manufactured together in connected subassemblies. In the module 200, optical films can be used on the beam splitter 16102 or the reflector 16104. In Figure 161, the beam splitter 16102 is shown as a flat surface while the reflector 16104 is shown as a spherical surface. In the near-to-eye display module 200, the reflector 16104 and the beam splitter 16102 are both used to provide the image to the user's eye, as shown in Figure 160, so it is important that the surfaces be optically flat or optically uniform.
Given that the image source 16108 includes a projection system with a light source that provides light with a large cone angle, the image light will also have a large cone angle. As a result, the image light interacts with the sidewalls of the module 200, and this interaction can produce reflected and scattered light in the form of bright areas that are seen by the user surrounding the displayed image. These bright areas are very distracting to the user, since they can appear as a halo around the displayed image. In addition, the scattered light can degrade the contrast of the image by randomly adding low-level light over the displayed image.
Figure 162 shows an illustration of the optics associated with this type of head mounted display 16200. In the optics, a light source 16204 provides light with a large cone angle that includes a central ray 16202 and edge rays 16224. The light source 16204 can provide polarized light. Light passes from the light source 16204 to an illumination beam splitter 16210, which reflects a portion of the light toward a reflective image source 16208, which can be an LCOS display. A first portion of the light is reflected by the image source 16208 while its polarization state is changed in correspondence with the displayed image content. A second portion of the light then passes through the illumination beam splitter 16210 and then through one or more lenses 16212 that expand the cone angle of the light. A third portion of the light is reflected at an angle by an imaging beam splitter 16220 toward a spherical (or aspherical) partial mirror 16214. The partial mirror 16214 reflects a fourth portion of the light while causing the light to converge so that the image is focused at the user's eye 16228. After the fourth portion of the light is reflected by the partial mirror 16214, a fifth portion of the light passes through the imaging beam splitter 16220 and on to the user's eye 16228, where a magnified version of the image displayed by the image source 16208 is provided to the user's eye 16228. In a see-through head mounted display, light 16218 from the environment (or scene light) passes through the partial mirror 16214 and the imaging beam splitter 16220 to provide a see-through view of the environment. The user is then provided with a combined image comprised of the displayed image from the image source and the see-through view of the environment.
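The chain of partial reflections described above also shows why so much of the source light ends up inside the housing as potential stray light. The following sketch tallies the relative throughput of the Figure 162 train; every efficiency value is an illustrative assumption, since the patent does not specify reflectance or transmittance figures.

```python
# Illustrative light-budget sketch for the Figure 162 optical train.
# All per-stage efficiencies below are assumptions for illustration only.

def relative_throughput(stages):
    """Multiply per-stage efficiencies to get overall relative throughput."""
    total = 1.0
    for _name, eff in stages:
        total *= eff
    return total

stages = [
    ("illumination beam splitter 16210 (reflect toward LCOS)", 0.5),
    ("reflective image source 16208 (bright-state reflectance)", 0.7),
    ("illumination beam splitter 16210 (transmit)",            0.5),
    ("lens group 16212 (transmission)",                        0.95),
    ("imaging beam splitter 16220 (reflect toward mirror)",    0.5),
    ("partial mirror 16214 (reflect toward eye)",              0.5),
    ("imaging beam splitter 16220 (transmit toward eye)",      0.5),
]

t = relative_throughput(stages)
print(f"relative throughput to eye: {t:.4f}")  # roughly 2% with these assumptions
```

Under these assumed 50/50 splits, only about 2% of the source light reaches the eye as image light; nearly everything else is lost into the housing, which is precisely the light the baffles and absorbing coatings of the following embodiments must intercept.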
The central ray 16202 passes along the optical axis of the optics of the head mounted display, through the center of the optics. The optics include: the illumination beam splitter 16210, the image source 16208, the lens 16212, the imaging beam splitter 16220 and the partial mirror 16214. The edge rays 16224 travel along the sides of the housing 16222, where the light can interact with the sidewalls of the housing 16222 and the edge rays 16224 can be reflected or scattered by the sidewalls, as shown in Figure 162. This light reflected or scattered from the edge rays 16224 is visible to the user as bright areas surrounding the displayed image or as reduced contrast in the image. The invention provides various methods for reducing this reflected and scattered light, by blocking or trimming the light reflected or scattered from the sidewalls, so as to reduce the bright areas.
Figure 163 shows an illustration of a first embodiment of the invention, in which a baffle 16302 is added to the inside of the housing 16222, between the illumination beam splitter 16210 and the lens 16212. The baffle 16302 blocks or trims the edge rays 16224 before the edge rays 16224 enter the lens 16212. The baffle 16302 can be made of any material that is opaque, so that the edge rays 16224 are blocked or trimmed. In a preferred embodiment, the baffle 16302 is made of a black material with a matte finish, so that incident light is absorbed by the baffle. The baffle 16302 can be made of a sheet material with a hole, positioned within the housing 16222, or the baffle 16302 can be made as part of the housing 16222. Because the baffle 16302 is placed at a distance from the image source 16208 and the image light diverges, the aperture formed by the baffle 16302 is larger than the active area of the image source 16208, so that the image provided by the image source 16208 is not trimmed at the edges by the baffle; as a result, the entire image provided by the image source 16208 can be seen by the user's eye, as shown in Figure 163. In addition, the baffle preferably has a thin cross-section (as shown in Figure 163) or sharp edges, so that light cannot scatter from the edges of the baffle.
Figure 164 shows an illustration of another embodiment of the invention, in which a baffle 16402 is added at the entrance face of the lens. The baffle 16402 can be made as part of the housing 16222, or the baffle 16402 can be a mask applied to the lens 16212. In either case, the baffle 16402 should be opaque, and preferably black with a non-glossy finish, so as to block and absorb incident light.
Figure 165 shows an illustration of an embodiment of the invention that is similar to the embodiment shown in Figure 164, except that the baffle is located on the exit side of the lens 16212. In this embodiment, a baffle 16502 is provided to block or trim the edge rays 16224 after the edge rays 16224 pass through the lens 16212.
Figure 166 shows an illustration of another embodiment of the invention, in which a baffle 16602 is attached to the housing 16222 between the lens 16212 and the imaging beam splitter 16220. The baffle 16602 can be part of the housing 16222, or the baffle 16602 can be a separate structure positioned inside the housing 16222. The baffle 16602 blocks or trims the edge rays 16224 so that bright areas are not provided surrounding the displayed image at the user's eye 16228.
Figure 167 shows an illustration of another embodiment of the invention, in which an absorbing coating 16702 is applied to the sidewalls of the housing 16222 to reduce the reflection and scattering of the incident light and the edge rays 16224. The absorbing coating 16702 can be combined with the baffles 16302, 16402, 16502 or 16602.
Figure 168 shows an illustration of another source of stray light in a head mounted display, in which stray light 16802 comes directly from the edge of the light source 16204. This stray light 16802 can be particularly bright, because it comes directly from the light source 16204 without first reflecting from the illumination beam splitter 16210 and then reflecting from the image source 16208. Figure 169 shows an illustration of yet another source of stray light 16902 from the light source 16204, in which the stray light reflects from the surface of the image source 16208, where its polarization state is changed, and the stray light 16902 can then pass through the illumination beam splitter 16210 at a relatively steep angle. This stray light 16902 can then reflect from any reflective surface in the housing or from the edges of the lens 16212, as shown in Figure 169. Figure 170 shows an illustration of a further embodiment of the invention, in which a baffle 17002 is provided adjacent to the light source 16204. The baffle 17002 is opaque and extends outward from the light source 16204, so that the stray light 16802 and 16902 is blocked or trimmed immediately after the light source 16204, thereby preventing the stray light from reaching the user's eye 16228.
In another embodiment, the baffles or coatings shown in Figures 163-167 and 169-170 are combined to further reduce the stray light in the head mounted display, thereby reducing the bright areas surrounding the displayed image or increasing the contrast in the displayed image. Multiple baffles can be used between the light source 16204 and the imaging beam splitter 16220. In addition, as shown in Figure 171, an absorbing coating 17102 with ridges can be used, in which a series of small ridges or steps act as a series of baffles to block or trim the edge rays over the entire sidewall area of the housing 16222. The ridges 17102 can be made as part of the housing 16222, or can be attached to the inner sidewalls of the housing 16222 as a separate layer.
Figure 172 shows an illustration of another embodiment, a tape or sheet 17210 that includes a carrier 17212 and ridges 17214 and that can be used to block reflected light as shown in Figure 171. The ridges 17214 are tilted at a shallow angle on one side and tilted steeply on the other side, so that incident light arriving from the steeply tilted side is blocked. The ridges 17214 can be solid ridges with a triangular cross-section that has one sharp edge, as shown in Figure 172, or they can be thin tilted tabs attached at one edge, or they can be tilted fibers attached at one end, so that one surface is angled relative to the sidewall and incident light is blocked. An advantage of the tape or sheet 17210 is that the ridges 17214 can be relatively thin and the ridges can cover a large area of the housing 16222. Another advantage of the tape or sheet 17210 is that the ridges 17214 can be easier to manufacture than the ridges shown in Figure 171, which may be difficult to mold as part of the housing.
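The ridged coatings of Figures 171 and 172 work as a series of miniature baffles: a ray that bounces off the wall between two ridges at a shallow grazing angle is caught by the next ridge before it can continue skimming forward along the sidewall. A minimal geometric sketch, using hypothetical ridge dimensions (the patent gives no sizes), estimates the largest grazing angle at which a single wall bounce is still intercepted:

```python
import math

def max_caught_grazing_angle_deg(ridge_height_mm, ridge_pitch_mm):
    """Largest grazing angle (measured from the sidewall surface) at which
    a ray reflecting from the wall between two ridges still strikes the
    next ridge. Rays at or below this angle cannot skim along the wall."""
    return math.degrees(math.atan(ridge_height_mm / ridge_pitch_mm))

# Hypothetical dimensions: ridges 0.2 mm tall, spaced 0.5 mm apart.
print(max_caught_grazing_angle_deg(0.2, 0.5))
```

As expected, taller or more closely spaced ridges intercept rays at steeper grazing angles, at the cost of occluding a thicker layer next to the sidewall.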
In all of the embodiments, the baffles form apertures, and the sizes of the apertures correspond to the baffles' distances along the optical axis from the image source, so that the image light can diverge along the optical axis, thereby providing an untrimmed view of the image source 16208 to the user's eye 16228.
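This aperture-sizing rule can be sketched with simple cone geometry: an aperture must span the active area of the image source plus the spread of the image cone on both sides at the baffle's distance. The numbers below (active-area width, cone half-angle) are hypothetical, chosen only to illustrate the rule.

```python
import math

def min_aperture_width(active_area_width_mm, distance_mm, cone_half_angle_deg):
    """Smallest baffle aperture (along one axis) that passes the full image
    cone: the active-area width plus the cone's spread on each side."""
    spread = distance_mm * math.tan(math.radians(cone_half_angle_deg))
    return active_area_width_mm + 2.0 * spread

# Hypothetical numbers: 10 mm active area, 15-degree cone half-angle.
for d in (5.0, 10.0, 20.0):
    w = min_aperture_width(10.0, d, 15.0)
    print(f"baffle at {d:4.1f} mm -> aperture >= {w:.2f} mm")
```

The farther a baffle sits from the image source, the larger its aperture must be, which is why the apertures in the embodiments above grow with distance along the optical axis while still trimming the edge rays outside the image cone.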
In an embodiment, an absorbing polarizer in the optical assembly is used to reduce stray light. The absorbing polarizer can include an antireflective coating. The absorbing polarizer can be placed after the focusing lens of the optical assembly, to reduce the light passing to the optically flat film of the optical assembly. The light from the image source can be polarized to increase contrast.
In an embodiment, antireflective coatings in the optical assembly can be used to reduce stray light. An antireflective coating can be placed on a polarizer of the optical assembly or on a retarder film of the optical assembly. The retarder film can be a quarter-wave film or a half-wave film. An antireflective coating can be placed on the outer surface of the partially reflective mirror. The light from the image source can be polarized to increase contrast.
With reference to Figure 102A, the image source 10228 directs image light onto the beam splitter layer of the optical assembly. Figure 103 depicts a magnified view of the image source 10228. In this particular embodiment, the image source 10228 is shown to include a light source (LED bar 10302) that directs light through a diffuser 10304 and a pre-polarizer 10308 to a curved wire-grid polarizer 10310, where the light is reflected toward an LCoS display 10312. The image light from the LCoS then passes back through the curved wire-grid polarizer 10310 and a half-wave film 10312 to the beam splitter layer of the optical assembly 10200. In embodiments, the optical assembly, including the optical components 10204, 10210, 10212 and 10230, can be provided as a sealed optical assembly, such as detachable (e.g., snapping on and off), replaceable, and the like, and the image source 10228 can be provided as a sealed assembly within the eyepiece frame. This can allow the sealed optical assembly to be waterproof, dustproof, replaceable, customizable, and the like. For instance, a given sealed optical assembly can be fitted with corrective optics for one person and replaced with a second sealed optical assembly for another person with different corrective optics requirements (e.g., a different prescription). In embodiments, there may be applications in which input does not need to be received through the eyepiece by both eyes. In this case, the person can simply detach one side and use only the other side for projection of content. In this way, the user has an unobstructed light path for the eye where the assembly has been removed, and the eyepiece conserves battery life by running only half the system, and so on.
The optical assembly can be considered in terms of which portions are sealed and which are separable, such as being comprised of the image generation facility 10228 and the display optics facility 10204, 10210, 10212 and 10230, as shown in Figure 102A. In another illustration, Figure 147 shows an embodiment configuration of the eyepiece in which the display optics are shown as 'projection screens' 14608a and 14608b. Figure 102A also shows a portion of the eyepiece electronics and projection system 14602, where this portion of the projection system can be referred to as the image generation facility. The image generation facility and the display optics facility can be sealed subassemblies, such as where the optics are not subject to intrusion by contaminants from the surrounding environment. In addition, the display optics can be removable, such as for replacement, for removal to allow the user an unobstructed view, for accommodating non-destructive forced removal (e.g., where the display optics are knocked off the body of the eyepiece without damage), and the like. In embodiments, the present invention can comprise an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly (through which the user views the surrounding environment and displayed content) and an integrated image source (adapted for introducing the content to the optical assembly), where the optical assembly includes an image generation facility mounted in the frame of the eyepiece and a display optics facility positioned in front of the user's eye that is removable from the frame of the eyepiece, and where the image generation facility is sealed within the frame to reduce contamination from the surrounding environment. In embodiments, the seal can be a sealed optical window. As described herein, the eyepiece can also include a processing facility, a power management facility, a removal sensor, a battery, and the like, where the power management facility can detect removal of the display optics facility through a removal indication from the removal sensor, and selectively reduce the power of components of the eyepiece to reduce the battery power consumed. For example, the component with reduced power can be the image source, such as by reducing the brightness of the image source, shutting off power to the image source, and the like, where the power management facility can monitor for reattachment of the display optics facility and restore the power usage of the image source to its working level from before the removal. The display optics facility can be removable in a breakaway manner, so that if the display optics facility is knocked unintentionally, it can be removed without damaging the eyepiece. The display optics facility can be removable by way of a connection mechanism, such as a magnet, a bolt, a rail, a snap-on connector, and the like. The display optics facility can provide vision correction for users requiring corrective eyeglasses, where the display optics facility is replaced for the purpose of changing the vision correction prescription of the eyepiece. The eyepiece can have two separate removable optical assemblies, one for each eye, where one of the separate optical assemblies can be removed to allow monocular use of the remaining separate optical assembly. For example, monocular use can be for gun sighting, where the side of the eyepiece from which the display optics facility has been removed is used for gun sighting, thereby allowing the user an unobstructed sight path for gun aiming while retaining for the other eye the facilities that the eyepiece provides. The display optics facility can be removable to allow exchange between display optics facilities suited for indoor use and display optics facilities suited for outdoor use. For example, indoor versus outdoor use may call for different filters, fields of view, contrast, shading, and the like. The display optics facility can be adapted to accept additional elements, such as optical elements, mechanical elements, adjustment elements, and the like. For example, an optical element can be inserted to adjust for a user's optical prescription. The display optics facility can also be replaced to change the field of view provided, such as by replacing a display optics facility with a first field of view with a display optics facility with a second field of view.
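The removal-sensor power-management behavior described above can be sketched as a small state machine: on a removal indication, save the image source's drive level and shut it off; on reattachment, restore the saved level. This is a minimal sketch of the described behavior only; the class and attribute names are hypothetical, not from the patent.

```python
class EyepiecePowerManager:
    """Minimal sketch: cut image-source power on a removal indication
    from the removal sensor; restore the pre-removal working level on
    reattachment. Names are hypothetical, not from the patent."""

    def __init__(self, working_brightness=0.8):
        self.brightness = working_brightness  # current image-source drive level
        self._saved = None                    # level saved across removal

    def on_removal_detected(self):
        if self._saved is None:
            self._saved = self.brightness
            self.brightness = 0.0             # shut off image source to save battery

    def on_reattach_detected(self):
        if self._saved is not None:
            self.brightness = self._saved     # restore pre-removal working level
            self._saved = None

pm = EyepiecePowerManager()
pm.on_removal_detected()
print(pm.brightness)   # 0.0 while the display optics facility is removed
pm.on_reattach_detected()
print(pm.brightness)   # 0.8, the level from before removal
```

A dimming variant would set a reduced non-zero brightness in `on_removal_detected` instead of shutting the source off entirely, matching the "reduce the brightness" alternative in the text.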
With reference to Figure 104, the LED provides unpolarized light. The diffuser scatters the light from the LED and homogenizes it. An absorbing pre-polarizer converts the light to S polarization. The S polarized light is then reflected toward the LCOS by the curved wire-grid polarizer. The LCOS reflects the S polarized light and, depending on the local image content, converts it to P polarized light. The P polarized light passes through the curved wire-grid polarizer, becoming P polarized image light. The half-wave film converts the P polarized light to S polarized light.
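The polarization bookkeeping of this illumination chain can be checked with elementary Jones calculus. The sketch below models the LCoS bright state as an ideal half-wave retardance at 45 degrees and the wire grid as an ideal P-transmitting polarizer; the basis convention (S = [1, 0], P = [0, 1]) and the ideal-component assumption are ours, not the patent's.

```python
import numpy as np

# Jones-vector sketch of the Figure 104 polarization chain.
# Convention (an assumption): S = [1, 0], P = [0, 1].
S = np.array([1.0, 0.0])

def half_wave(theta_deg):
    """Jones matrix of an ideal half-wave retarder, fast axis at theta."""
    t = np.radians(theta_deg)
    c, s = np.cos(2 * t), np.sin(2 * t)
    return np.array([[c, s], [s, -c]])

pol_P_transmit = np.array([[0.0, 0.0], [0.0, 1.0]])  # wire grid: transmits P only

# LCoS bright state modeled as half-wave retardance at 45 deg (S -> P).
after_lcos = half_wave(45) @ S
after_grid = pol_P_transmit @ after_lcos   # P image light passes the wire grid
image_out  = half_wave(45) @ after_grid    # half-wave film converts P -> S

print(np.round(after_lcos, 6))  # [0. 1.] -> P polarized off the bright-state LCoS
print(np.round(image_out, 6))   # [1. 0.] -> S polarized image light output
```

In the dark state the LCoS would leave the S polarization unchanged, so that light is reflected (not transmitted) by the wire grid and never reaches the beam splitter layer, which is how image contrast arises in this scheme.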
Referring again to Figure 102A, the beam splitter layer 10204 is a polarizing beam splitter, or the image source provides polarized image light 10208 and the beam splitter layer 10204 is a polarizing beam splitter, so that the reflected image light 10208 is linearly polarized; this embodiment and the associated polarization control are shown in Figure 102A. For the case where the image source provides linearly polarized image light and the beam splitter layer 10204 is a polarizing beam splitter, the polarization state of the image light is aligned with the polarizing beam splitter so that the image light 10208 is reflected by the polarizing beam splitter. Figure 102A shows the reflected image light as having the S polarization state. In the case where the beam splitter layer 10204 is a polarizing beam splitter, a first quarter-wave film 10210 is located between the beam splitter layer 10204 and the partially reflective mirror 10212. The first quarter-wave film 10210 converts the linearly polarized image light to circularly polarized image light (shown in Figure 102A as S being converted to CR). The first reflected portion of the image light 10208 is then also circularly polarized, with the circular polarization state reversed (shown as CL in Figure 102A), so that after passing back through the quarter-wave film the polarization state of the first reflected portion of the image light 10208 is rotated relative to the polarization state of the image light 10208 provided by the image source (shown as S), becoming P polarized; as a result, the first reflected portion of the image light 10208 passes through the polarizing beam splitter without reflection loss. When the beam splitter layer 10204 is a polarizing beam splitter and the see-through display assembly 10200 includes the first quarter-wave film 10210, the light control element 10230 includes a second quarter-wave film 10218 and a linear polarizer 10220. In embodiments, the light control element 10230 includes a controllable darkening layer 10214. A second portion of the circularly polarized image light 10208 is converted by the second quarter-wave film 10218 to linearly polarized image light (shown as CR being converted to S), which has a polarization state that is blocked by the linear polarizer 10220 in the light control element 10230, so that eye glow is reduced.
When the light control element 10230 includes the linear polarizer 10220 and the quarter-wave film 10218, unpolarized incoming scene light 10222 from the external environment in front of the user is converted to linearly polarized light (shown as the P polarization state in Figure 102A), and 50% of the light is blocked. A first portion of the scene light 10222 that passes through the linear polarizer 10220 is linearly polarized light, which is converted to circularly polarized light by the quarter-wave film (shown in Figure 102A as P being converted to CL). A third portion of the scene light that reflects from the partially reflective mirror 10212 has a reversed circular polarization state (shown in Figure 102A as being converted from CL to CR), and this light is then converted to linearly polarized light by the second quarter-wave film 10218 (shown in Figure 102A as CR being converted to S polarization). The linear polarizer 10220 then blocks the third, reflected portion of the scene light, thereby reducing escaping light and reducing eye glow.
As shown in Figure 102A, the first reflected portion of the image light 10208 and the second transmitted portion of the scene light have the same circular polarization state (shown as CL), so that they combine and are converted to linearly polarized light (shown as P) by the first quarter-wave film 10210, and when the beam splitter layer 10204 is a polarizing beam splitter the linearly polarized light passes through the beam splitter. The linearly polarized combined light 10224 then provides a combined image to the user's eye 10202 located behind the see-through display assembly 10200, where the combined image is comprised of the displayed image from the image source overlaid onto the see-through view of the external environment in front of the user.
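The round trip through the first quarter-wave film and the partial mirror is the crux of this polarization scheme: a double pass through a quarter-wave retarder acts as a half-wave retarder, rotating S to P so the returning light transmits through the polarizing beam splitter instead of reflecting back toward the image source. A Jones-matrix check, modeling the partial mirror as an ideal identity reflection (a simplification we assume for the sketch):

```python
import numpy as np

# Image-light round trip in Figure 102A: S reflects from polarizing beam
# splitter 10204, passes quarter-wave film 10210 (-> circular), reflects
# from partial mirror 10212, passes the film again (-> P), then exits
# through the beam splitter. Ideal-mirror model is a simplification.
S = np.array([1.0, 0.0])

def quarter_wave(theta_deg):
    """Ideal quarter-wave retarder, fast axis at theta (Jones matrix)."""
    t = np.radians(theta_deg)
    R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
    return R @ np.diag([1.0, 1.0j]) @ R.T

Q = quarter_wave(45)
circ = Q @ S                         # circular between film 10210 and mirror 10212
round_trip = Q @ np.eye(2) @ Q @ S   # double pass = half-wave effect

print(np.round(np.abs(circ) ** 2, 6))        # [0.5 0.5] -> circularly polarized
print(np.round(np.abs(round_trip) ** 2, 6))  # [0. 1.]   -> P, transmits through PBS
```

The same double-pass rotation is what makes the light control element work on the scene-light side: scene light reflected by the partial mirror comes back in the orthogonal linear state and is absorbed by the linear polarizer 10220, while the singly-transmitted portions combine at the eye.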
The beam splitter layer 10204 includes an optically flat film, such as the Asahi TAC films described herein. The beam splitter layer 10204 can be placed at an angle in front of the user's eye, so that the beam splitter layer reflects the transmitted image light and transmits a corresponding portion of the scene light from the see-through view of the surrounding environment, so that a combined image comprised of portions of the image light and the transmitted scene light is provided to the user's eye. The optically flat film can be a polarizer, such as a wire-grid polarizer. The optically flat film can be laminated to a transparent substrate. The optically flat film can be molded onto one of the optical surfaces of the eyepiece, such as by gluing or pressing it onto the surface, such as on the beam splitter 10202. The optically flat film can be set at less than 40 degrees from vertical. The curved polarizing film can have a ratio of light source height to illuminated area width of less than 1:1. The peak of the curved film is lower than the length of the narrowest axis of the display. In embodiments, once the optical film is positioned on the beam splitter, additional optics, such as corrective optics, prescription optics, and the like, can be added to the surface, such as in order to keep the film flat by sandwiching it in between.
The present invention also provides methods for providing optically flat surfaces with optical films. Optical films are a convenient way to form an optical structure with optical characteristics very different from those of the rest of the structure of an imaging device. To provide their function to the imaging device, optical films need to be attached to the optics. When an optical film is used in a reflective manner, it is critical that the reflective surface be optically flat, or the wavefront of the light reflected from the surface will not be preserved and the image quality will be degraded. An optically flat surface can be defined as a surface that is uniform to within 5 wavelengths of light per inch of surface, relative to either a flat surface or the desired optical curve, as measured at the wavelengths of light used by the imaging device.
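The "5 wavelengths per inch" criterion can be turned into a concrete surface-error budget once a design wavelength is chosen. The sketch below uses 550 nm as a mid-visible wavelength; that choice is our assumption, since the text defines the tolerance only in wavelengths.

```python
# Flatness criterion above -- uniform within 5 wavelengths per inch --
# converted into microns for a given design wavelength.

def flatness_budget_um(wavelength_nm=550.0, waves_per_inch=5.0, inches=1.0):
    """Allowed peak deviation from the ideal flat (or curved) surface,
    in microns, over the given span of surface."""
    return waves_per_inch * inches * wavelength_nm / 1000.0

print(flatness_budget_um())       # 2.75 um per inch at 550 nm
print(flatness_budget_um(450.0))  # 2.25 um per inch: tighter in the blue
```

Because the tolerance is specified in wavelengths, a display operating at shorter wavelengths implicitly demands a flatter film, which motivates the tensioning, insert-molding and lamination processes of the following embodiments.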
Optically flat surfaces that include optical films can be included in display systems as described in the present invention, where the display systems include: projectors, projection televisions, near-to-eye displays, head mounted displays, see-through displays, and the like.
Figure 140 shows an example of a display system with an optically flat reflective surface, in which the optically flat reflective surface is a beam splitter comprised of an optical film on a substrate, and the display system is a near-to-eye display 14000. In this example, the image source 14010 includes a projection system (not shown) that provides image light using an optical layout with a folded optical axis 14014 contained within the near-to-eye display 14000. The optics along the optical axis 14014 can include lenses that focus the image light to provide a focused image from the image source 14010 to a user's eye 14002. The beam splitter 14004 folds the optical axis 14014 from the image source 14010 to a spherical or aspherical reflector 14008. The beam splitter 14004 can be a partial mirror or a polarizing beam splitter layer. The beam splitter 14004 in the near-to-eye display 14000 is oriented at an angle so that at least a portion of the image light from the image source 14010 is redirected toward the reflector 14008. From the reflector 14008, at least another portion of the image light is reflected back toward the user's eye 14002. Another portion of the reflected image light passes back through the beam splitter 14004 and is focused at the user's eye 14002. The reflector 14008 can be a mirror or a partial mirror. In the case where the reflector 14008 is a partial mirror, scene light from the scene in front of the near-to-eye display 14000 can combine with the image light, thereby presenting combined image light 14018, formed from the image light along axis 14014 and the scene light along axis 14012, to the user's eye 14002. The combined image light 14018 presents to the user's eye a combined image comprised of the scene with an overlaid image from the image source.
Figure 141 shows an illustration of a near-to-eye display module 14100. The module 14100 is comprised of a reflector 14104, an image source module 14108 and a beam splitter 14102. The module can be open at the sides, with attachments between at least some of the connecting edges of the reflector 14104, the image source module 14108 and the beam splitter 14102. Alternatively, the module 14100 can be closed at the sides by sidewalls to provide a closed module that prevents dust, dirt and water from contacting the interior surfaces of the module 14100. The reflector 14104, image source module 14108 and beam splitter 14102 can be manufactured separately and then attached together, or at least some of them can be manufactured together in connected subassemblies. In the module 14100, optical films can be used on the beam splitter 14102 or the reflector. In Figure 141, the beam splitter 14102 is shown as a flat surface while the reflector 14104 is shown as a spherical surface. In the near-to-eye display module 14100, the reflector 14104 and the beam splitter 14102 are both used to provide the image to the user's eye, as shown in Figure 140, so it is important that the surfaces be optically flat or optically uniform.
Figure 142 shows a schematic drawing of an embodiment of the invention, a tensioned film assembly 14200. The tensioned film assembly 14200 includes a frame 14202 comprised of an upper frame member 14202a and a lower frame member 14202b. The optical film 14204 is held between the frame members 14202a and 14202b with an adhesive or fasteners. To improve the flatness of the optical film 14204, the optical film 14204 can be stretched in one or more directions while the adhesive is applied and the frame members 14202a and 14202b are bonded to the optical film 14204. After the optical film 14204 has been bonded to the frame 14202, the edges of the optical film can be trimmed to provide a flat surface out to the outer edge of the frame 14202.
In some embodiments of the invention, the optical film 14204 is a folded film comprised of a series of optically flat surfaces, and the interfaces of the frame members 14202a and 14202b have a matching folded shape. The folded film is then stretched along the fold direction and bonded into place, so that the frame members 14202a and 14202b hold the optical film 14204 in the folded shape, and each surface of the series of optically flat surfaces is held in its proper position.
In all cases, after the frame members 14202a and 14202b have been bonded to the optical film 14204, the resulting tensioned film assembly 14200 is a rigid assembly that can be placed into optics such as the near-to-eye display module 14100 to form the beam splitter 14102. In this embodiment, the tensioned film assembly 14200 is a replaceable beam splitter 14102 assembly in the near-to-eye display module 14100. The sidewalls of the near-to-eye display module 14100 can have grooves that the frame 14202 snaps into, or alternatively flat surfaces connecting the sidewalls can be provided, and the frame 14202 can be placed on top of the flat surfaces.
Figure 143 is an illustration of an insert-molded assembly 14300 that includes an optical film 14302. In this embodiment, the optical film 14302 is placed into a mold, and viscous plastic material is injected into the mold through the mold gate 14308, so that the plastic fills the mold cavity and forms a molded structure 14304 that adjoins the optical film 14302 and is located behind the optical film 14302. When the plastic material has hardened in the mold, the mold is opened along the parting line 14310 and the insert-molded assembly 14300 is removed from the mold. The optical film 14302 is thereby embedded in and attached to the insert-molded assembly 14300. To improve the optical flatness of the optical film 14302 in the insert-molded assembly 14300, the inner surface of the mold against which the optical film 14302 is placed is an optically flat surface. In this way, the viscous plastic material forces the optical film 14302 against the optically flat surface of the mold during the molding process. This process can be used to provide optically flat surfaces, as described above, that are flat or have a desired optical curve. In another embodiment, the optical film 14302 can be provided with an adhesive layer or tie layer to increase the adhesion between the optical film 14302 and the molded structure 14304.
In yet another embodiment, optical film 14302 is placed into the mold with a protective film between the mold surface and optical film 14302. The protective film can be attached to optical film 14302 or to the mold. The protective film can be smoother or flatter than the mold surface, so as to provide a smoother or flatter surface against which optical film 14302 is molded. The protective film can be any suitable material, such as plastic or metal.
Figure 144 shows an illustration of a lamination process for making a laminate with optical film 14400. In this embodiment, upper and lower platens 14408a and 14408b are used to laminate optical film 14400 to substrate 14404. An adhesive 14402 can optionally be used to bond substrate 14404 to optical film 14400. In addition, one or more of platens 14408a and 14408b, or substrate 14404, can be heated to provide a higher degree of adhesion between substrate 14404 and optical film 14400. Heating one or more of the substrate or platens 14408a and 14408b can also be used to soften substrate 14404, thereby providing more uniform pressure behind optical film 14400 and improving the flatness or smoothness of optical film 14400 in the laminate. The laminate with optical film 14400 of this embodiment can be used as a replaceable beam splitter in near-to-eye optics module 14100, as described above for thin film assembly 14200.
Figures 145A-C show illustrations of a process for applying optical film 14500 to an optically flat surface of a molded structure 14502. In this embodiment, optical film 14500 is applied by rubber applicator 14508 to optically flat surface 14504 of molded structure 14502. An adhesive layer can be applied either to optically flat surface 14504 of molded structure 14502 or to the bottom surface of optical film 14500, to bond optical film 14500 to molded structure 14502. Rubber applicator 14508 can be a relatively soft and resilient material with a curved surface, so that the central portion of optical film 14500 is forced into contact with optically flat surface 14504 of molded structure 14502 first. As rubber applicator 14508 is pushed further down, the contact area between optical film 14500 and optically flat surface 14504 of molded structure 14502 increases, as shown in Figures 145A, 145B and 145C. This progressive application provides a very uniform application of pressure, which allows air at the interface to be expelled during the application process. The progressive application process, together with optically flat surface 14504 of molded structure 14502, provides an optically flat optical film 14500 attached to the inner surface of molded structure 14502, as shown in Figure 145C. The adhesive layer used to bond optical film 14500 to molded structure 14502 can be attached either to optical film 14500 or to optically flat surface 14504 inside molded structure 14502. Those skilled in the art will appreciate that this application process can similarly be used to apply an optical film to an outer surface of a molded structure. In addition, the optically flat surface can be a flat surface, a surface with a desired optical curve, or a series of optically flat surfaces, with the rubber applicator shaped to provide progressive application of pressure as the optical film is applied.
In various embodiments, an image display system can include an optically flat optical film, a display module housing, a substrate that holds the optical film optically flat, an image source, and a viewing position, wherein the image provided by the image source is reflected from the optical film to the viewing position. In various embodiments, the optical film of the image display system can be molded into the display module. In various embodiments, the optical film can be applied to the display module. Further, in various embodiments, the optical film of the display system can be a wire-grid polarizer, a mirror, a partial mirror, a holographic film, and the like. In various embodiments, the image display system can be a near-to-eye display. In various embodiments, the optical film is molded into the display module, or the optical film can be held against an optically flat surface while it is molded into the display module. In various embodiments, the optical film of the image display system can have an optical flatness of five wavelengths per inch.
In one embodiment, an image display system comprising an optically flat optical film can include a substrate that holds the optical film optically flat, a display module housing, an image source, and a viewing position, wherein the image provided by the image source is reflected from the optical film to the viewing position, and the substrate with the optical film is replaceable within the display module housing. In such embodiments, the substrate of the image display system can be a frame, and the optical film can be held under tension by the frame; the substrate can be a panel molded behind the film; and/or the substrate can be a laminate. In addition, the optical film of the image display system can be a beam splitter, a polarization beam splitter, a wire-grid polarizer, a mirror, a partial mirror, a holographic film, and the like. Further, the image display system can be a near-to-eye display. In various embodiments, the optical film of the image display system can be held against an optically flat surface while the panel is molded behind the optical film. Further, in various embodiments, the optical film can be held against an optically flat surface while the panel is laminated to the optical film. In various embodiments, the optical film of the image display system can have an optical flatness of five wavelengths per inch.
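As a rough illustration of the flatness specification above, the stated "five wavelengths per inch" can be converted into an absolute surface-height deviation. The sketch below is not from the patent; the 550 nm design wavelength is an assumption chosen as a typical mid-visible value.

```python
# Hypothetical sketch (not from the patent): converting the stated
# "optical flatness of five wavelengths per inch" into an absolute
# surface-height deviation, assuming green light at 550 nm.
WAVELENGTH_NM = 550.0  # assumed design wavelength (not specified in the text)
INCH_MM = 25.4

def flatness_um(waves_per_inch: float, wavelength_nm: float = WAVELENGTH_NM) -> float:
    """Peak surface deviation in micrometers over one inch of film."""
    return waves_per_inch * wavelength_nm / 1000.0

dev = flatness_um(5)
print(f"5 waves/inch at {WAVELENGTH_NM:.0f} nm ~ {dev:.2f} um over {INCH_MM} mm")
# i.e. roughly 2.75 um of allowed surface deviation per inch at this wavelength
```

At a different reference wavelength (for example 632.8 nm HeNe, common in flatness metrology) the same five-wave specification corresponds to a proportionally larger deviation.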
In one embodiment, the assemblies in Figure 102A collectively form an electro-optical module. The optical axis associated with the display can be tilted 10 degrees or more from vertical. This tilt refers to a forward lean of the top of the optical module. This allows the beam splitter angle to be reduced, and reducing the beam splitter angle makes the optical module thinner. The ratio of the height of the curved polarizing film to the width of the reflective image display is less than 1:1. The curve of the polarizing film determines the width of the illuminated area on the reflective display, and the tilt of the curved area determines the location of the illuminated area on the reflective display. The curved polarizing film reflects illuminating light of a first polarization state onto the reflective display, the reflective display changes the polarization of the illuminating light to form image light, and the curved polarizing film transmits the reflected image light. The curved polarizing film includes a portion over the light source that is parallel to the reflective display. The height of the image source can be at least 80% of the width of the active display area, at least 3.5mm, or less than 4mm.
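The link between beam splitter angle and module thickness described above can be sketched with simple trigonometry. This is a first-order approximation of my own, not a formula from the patent: it assumes the film spans the full display width and takes the film's vertical extent as a proxy for module height.

```python
# Hypothetical geometry sketch (not from the patent): a beam splitter
# film spanning a display of width w at angle theta to the display
# plane has a vertical extent of roughly w * tan(theta).  Reducing the
# angle below 45 degrees therefore makes the optical module thinner,
# as the paragraph above describes.
import math

def module_height(display_width_mm: float, splitter_angle_deg: float) -> float:
    """Approximate film height above the display for a given splitter angle."""
    return display_width_mm * math.tan(math.radians(splitter_angle_deg))

w = 10.0  # assumed display width in mm (illustrative only)
print(f"45 deg: {module_height(w, 45):.2f} mm")  # classic prior-art case
print(f"35 deg: {module_height(w, 35):.2f} mm")  # tilted, thinner module
```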
In portable display systems, it is important to provide a display that is bright, compact and lightweight. Portable display systems include mobile phones, laptop computers, tablet computers, near-to-eye displays and head mounted displays.
The invention provides a compact and lightweight frontlight for portable display systems, in which the frontlight is formed by a partially reflective film that redirects light from an edge light to illuminate a reflective image source. The partially reflective film can be a partial mirror beam splitter film or a polarization beam splitter film. The polarization beam splitter film can be a multilayer dielectric film or a wire-grid polarizer film. Polarization beam splitter films are known to provide highly efficient reflection of one polarization state while allowing the other polarization state to pass through. Multilayer dielectric films are available under the name DBEF from 3M Company of Minneapolis, Minnesota. Wire-grid polarizer films are available under the name WGF from Asahi-Kasei E-Materials Corporation of Tokyo, Japan.
An edge light provides a compact light source for a display, but because it is located at the edge of the image source, the light must be redirected 90 degrees to illuminate the image source. When the image source is a reflective image source, such as a liquid crystal on silicon (LCOS) image source, the illuminating light must be polarized. The polarized light is reflected by the surface of the image source, and the polarization state of the light changes in accordance with the image content being displayed. The reflected light then passes back through the frontlight.
Figure 187 shows an illustration of a prior art display assembly 18700 with a solid beam splitter block 18718 serving as a frontlight. The display assembly includes a frontlight, one or more light sources and an image source. In display assembly 18700, one or more light sources 18702 are included to provide light, shown as light rays 18712. The light sources can be LEDs, fluorescent lights, OLEDs, incandescent lights or solid state lights. Light 18712 passes through diffuser 18704, which scatters the light to provide more uniform illumination. If polarized diffuse light is needed, the diffuser includes a linear polarizer. Diffused light 18714 passes through solid beam splitter block 18718 toward partially reflective layer 18708, where it is partially reflected toward reflective image source 18720. Diffused light 18714 is then reflected by reflective image source 18720, thereby forming image light 18710, which is transmitted by partially reflective layer 18708. Image light 18710 can then be directed into associated imaging optics (not shown) to present the image to the viewer. However, as can be seen in Figure 187, the height of the light-emitting area of diffuser 18704 is the same as the width of the illuminated reflective image source 18720. Partially reflective layer 18708 is placed at a 45 degree angle so as to provide image light 18710 that travels straight, or vertically, into the associated imaging optics. As a result, the frontlight shown in Figure 187 is relatively large.
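The size constraint described above follows directly from the 45-degree geometry. The sketch below is an illustrative ray mapping of my own, not taken from the patent:

```python
# Hypothetical sketch (not from the patent): with the flat beam
# splitter of Figure 187 at 45 degrees, a horizontal ray leaving the
# diffuser at height y strikes the film at horizontal distance y and
# is reflected straight down onto the image source.  The diffuser
# height must therefore equal the width of the illuminated image
# source, which is why this prior-art frontlight is relatively large.

def landing_position_45deg(ray_height_mm: float) -> float:
    """Horizontal landing point on the image source for a 45-degree film."""
    return ray_height_mm  # y = x along the 45-degree diagonal

heights = [1.0, 4.0, 8.0]  # illustrative ray heights across an 8 mm diffuser
print([landing_position_45deg(h) for h in heights])
# rays spanning an 8 mm diffuser illuminate an 8 mm wide strip
```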
In imaging systems in general, maintaining a high-quality wavefront from the image source is important for providing similarly good resolution and contrast. Therefore, as is known to those skilled in the art, image light 18710 must travel orthogonally from reflective image source 18720 with a uniform wavefront into the associated imaging optics, so that a high quality image can be provided to the viewer. Consequently, diffused light 18714 must be redirected by partially reflective film 18708 so as to be orthogonal to reflective image source 18720, so that it is reflected vertically (as illustrated in Figures 187-198) and directed into the associated imaging optics.
Figure 188 shows another prior art display assembly 18802, which includes a partially reflective film 18804 that is supported at its edges and is otherwise unsupported over reflective image source 18720. This display assembly works in a manner similar to the display assembly shown in Figure 187, except that display assembly 18802 is lighter than display assembly 18700 because there is no solid beam splitter block 18718. As can be seen in Figure 188, the height of diffuser 18704 is again the same as the width of reflective image source 18720, so as to provide image light 18808 that travels vertically into the associated imaging optics after being reflected by reflective image source 18720.
Figure 189 illustrates what can happen to the light in display assembly 18902 if partially reflective film 18804 is positioned at an angle of less than 45 degrees. In this case, portions of reflective image source 18720 are not uniformly illuminated. Light that would illuminate the portion of the reflective image source farthest from the diffuser either does not travel straight into the associated imaging optics (as in the case of light 18904), or does so only after first reflecting from the surface of the reflective image source (as in the case of light 18908), which can change the polarization state; and if the partially reflective film is a polarization beam splitter film (also known as a reflective polarizer film), the light then passes through the film. Therefore, when the associated imaging optics can only use image light that travels straight from reflective image source 18720, positioning partially reflective film 18804 at an angle of less than 45 degrees reduces the illuminated area on reflective image source 18720, and correspondingly dark portions of the image result.
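The misdirection of light by a flat film tilted below 45 degrees can be quantified with the law of reflection. The sketch below is an illustrative calculation of my own, not from the patent:

```python
# Hypothetical sketch (not from the patent): a horizontal ray striking
# a flat partially reflective film tilted at angle theta to the display
# plane is turned through 2*theta, leaving it (90 - 2*theta) degrees
# away from the display normal.  Only at 45 degrees does the ray travel
# straight down; at smaller angles it misses the imaging optics'
# acceptance direction, producing the dark image regions described above.

def deviation_from_normal_deg(film_angle_deg: float) -> float:
    """Angle between the reflected ray and the display normal, in degrees."""
    return 90.0 - 2.0 * film_angle_deg

for theta in (45, 40, 35):
    print(theta, deviation_from_normal_deg(theta))
# 45 -> 0 (straight down), 40 -> 10 deg off, 35 -> 20 deg off
```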
In the embodiment of the invention shown in Figure 190, a curved partially reflective surface 19004 is provided that redirects the diffused light 19010 provided from light source 18702 downward to illuminate reflective image source 18720. Curved partially reflective surface 19004 can be a polarization beam splitter film, which is thin and flexible. In this case, diffuser 18704 includes a linear polarizer, so that light 18712 is diffused and then linearly polarized, and diffused light 19010 is therefore polarized. The linear polarizer in diffuser 18704 and polarization beam splitter film 19004 are oriented so that the linearly polarized light is reflected by the polarization beam splitter film. In this way, when reflective image source 18720 changes the polarization of diffused light 19010, the polarization of the reflected image light 19008 is the opposite polarization state from that of diffused light 19010. The reflected image light 19008 then continues through partially reflective film 19004 to the display optics. By using a flexible polarization beam splitter film as partially reflective surface 19004, partially reflective surface 19004 can be curved and lightweight. The polarization beam splitter film serves the dual role of reflector for the diffused light 19010 that illuminates reflective image source 18720 and transparent member for the reflected image light 19008. As is known to those skilled in the art, an advantage provided by polarization beam splitter films is that they can receive light over a wide range of incident angles, so that the curve does not interfere with the light passing through to the film. In addition, because the polarization beam splitter film is thin (for example, less than 200 microns), the curved shape does not significantly distort the image light when image light 19008 passes through the film to the display optics. Finally, polarization beam splitter films have a very low tendency to scatter light, so high image contrast is maintained.
The flexible nature of polarization beam splitter films allows them to be formed into curved shapes that redirect the light from the diffuser and focus it onto the reflective image source. The shape of the curve of the polarization beam splitter film can be selected based on the light distribution provided by the diffuser, so as to provide uniform illumination of the reflective image source. Figure 190 shows curved partially reflective film 19004 with a parabolic shape, but depending on the nature of light source 18702 and the effectiveness of diffuser 18704, radiused curves, complex splines, relatively flat curves, planes or segmented planes may also be used to uniformly redirect diffused light 19010 and focus it onto reflective image source 18720. Experiments have shown that curved surfaces in partially reflective surface 19004 tend to concentrate diffused light 19010 toward the center of reflective image source 18720, so that curved surfaces are best used when diffuser 18704 provides a light distribution that is brighter at the edges. Conversely, experiments have shown that relatively flat surfaces in partially reflective surface 19004 are best used when diffuser 18704 provides a light distribution that is brighter in the center. When partially reflective surface 19004 is made of a flexible film, the shape of the partially reflective surface can be maintained with side frames that have grooves of suitable contour to hold the flexible film in place, the film otherwise being unsupported as shown in Figure 190. Two side frames, together with the other components, are used to support the curved shape on either side of display assembly 19002. Because the greater part of display assembly 19002 is made up of air and partially reflective surface 19004 is a film, it is light in weight compared to the prior art display assembly 18700 shown in Figure 187. In addition, as can be seen in Figure 190, the width of the illuminated reflective image source 18720 is greater than the height of diffuser 18704, so that display assembly 19002 is more compact than the prior art display assembly shown in Figure 188.
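The parabolic curve's tendency to concentrate light toward the center, noted above, is the classical focusing property of a parabola: rays arriving parallel to its axis reflect through the focus. The small ray trace below is an illustrative check of my own (not from the patent), with an arbitrarily chosen focal length:

```python
# Hypothetical ray-trace sketch (not from the patent): a parabolic
# reflector sends rays that arrive parallel to its axis through its
# focus, which is how the curved film of Figure 190 can concentrate
# edge-bright diffuser light toward the center of the image source.
# Parabola: x = y**2 / (4*f), axis along x, focus at (f, 0).

def axis_crossing(y: float, f: float) -> float:
    """Trace a ray travelling in -x at height y; return where the
    reflected ray crosses the parabola's axis (y = 0)."""
    x0 = y * y / (4.0 * f)                  # hit point on the parabola
    n2 = 1.0 + (y / (2.0 * f)) ** 2         # |normal|^2, normal ~ (1, -y/2f)
    rx = -1.0 + 2.0 / n2                    # reflected direction (law of reflection)
    ry = -y / (f * n2)
    t = -y / ry                             # parameter where ray reaches y = 0
    return x0 + rx * t

f = 5.0  # assumed focal length (illustrative only)
print([round(axis_crossing(y, f), 6) for y in (1.0, 2.0, 4.0)])
# every parallel ray crosses the axis at x = f (the focus)
```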
Figure 191 shows another embodiment of the invention, in which two light sources 19104 are used in display assembly 19102 and two relatively flat partially reflective surfaces are placed back to back. The two-sided frontlight arrangement shown in Figure 191 is provided with a solid film holder 19120, so that display assembly 19102 resembles two of the display assemblies 18700 shown in Figure 187 arranged back to back. In Figure 191, light rays are shown for only one side, but the components and light of the other side are symmetric with those of the side shown. Within solid film holder 19120 is a partially reflective film 19110 that extends continuously between the two sides. Solid film holder 19120 is also continuous between the two sides, so that image light 19112 is not interrupted or deflected at the joint line between the two sides of display assembly 19102. Solid film holder 19120 together with partially reflective film 19110 provides a constant optical thickness, so the image light is not deflected or distorted. Image light 19112 with continuous image quality can therefore be provided while illuminating with the two light sources 19104 simultaneously. Each light source 19104 provides light 19114 to a diffuser 19108, which scatters the light 19114 to provide diffused light 19118 that illuminates one half of reflective image source 18720. Solid film holder 19120 holds partially reflective film 19110 in the desired shape. Most importantly, relative to the illuminated width of reflective image source 18720, the height of diffuser 19108 is reduced to half that of the prior art diffuser 18704 shown for display assembly 18700 in Figure 187.
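The halving of diffuser height claimed above can be sketched as simple arithmetic; this is my own illustrative restatement, not a formula from the patent:

```python
# Hypothetical sketch (not from the patent): splitting the frontlight
# into two back-to-back halves, each fed by its own edge light, halves
# the diffuser height needed for a given image-source width compared
# with the single flat 45-degree film of Figure 187.

def diffuser_height(image_width_mm: float, n_sides: int) -> float:
    """Required diffuser height with flat 45-degree films, one per side."""
    return image_width_mm / n_sides

w = 8.0  # assumed image-source width in mm (illustrative only)
print(diffuser_height(w, 1), diffuser_height(w, 2))  # 8.0 vs 4.0 mm
```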
Figure 192 shows an illustration of a display assembly 19202 with two light sources 19104 and a partially reflective film 19204 that is supported only at its edges and is otherwise unsupported. In Figure 192, light rays are shown for only one side, but the components and light of the other side are symmetric with those of the side shown. The function of each component of display assembly 19202 is the same as that of the corresponding component shown in Figure 191, but display assembly 19202 has a weight advantage over display assembly 19102 because the greater part of display assembly 19202 is made up of air.
Figure 193 shows a display assembly 19302 with two light sources 19104 and a partially reflective film 19308 that is supported only at its edges and is otherwise unsupported, so as to provide two curved surfaces. In Figure 193, light rays are shown for only one side, but the components and light of the other side are symmetric with those of the side shown. Partially reflective film 19308 is continuous across both sides, with a similar curve on each side. The curves are selected to reflect the diffused light 19312 provided by the diffusers and focus it onto reflective image source 18720. Diffused light 19312 is reflected by reflective image source 18720, thereby forming image light 19310. The height of diffusers 19304 is less than half that of the prior art diffuser 18704 shown in Figure 187, so that the frontlight and display assembly 19302 are very compact.
Figure 194 shows an illustration of a display assembly 19402 with the continuous partially reflective film 19308 held in a solid film holder 19404; display assembly 19402 is otherwise similar to display assembly 19302 shown in Figure 193. In Figure 194, light rays are shown for only one side, but the components and light of the other side are symmetric with those of the side shown. Solid film holder 19404 is used on either side of partially reflective film 19308 to hold the film in the specified curves on both sides and to protect partially reflective film 19308. The two sides of solid film holder 19404 are joined by a relatively thin section in the middle of the bottom of solid film holder 19404, to further avoid a joint line that would disrupt the image light 19310 passing through the center of the image.
In a preferred embodiment of the invention, the partially reflective films in the display assemblies shown in Figures 191-194 are polarization beam splitter films. In these embodiments, the diffusers include linear polarizers, so that the diffused light is polarized. The linear polarizers are aligned with the polarization beam splitter film so that the diffused light has the polarization state that is reflected by the polarization beam splitter film. The polarization beam splitter film also acts as an analyzer for the image light. The advantage of using polarized diffuse light with a polarization beam splitter film in the frontlight is that stray light within the display assembly is reduced, because all of the polarized diffused light is reflected by the polarization beam splitter film toward the reflective image source, where the polarized diffused light is converted into image light. If the diffused light were not polarized, the polarization state of the diffuse light that is not reflected would be transmitted through the polarization beam splitter film, and if this light were not controlled, it would contribute scattered light to the image light, which would reduce the contrast of the image presented to the viewer.
Figure 195 shows an illustration of a display assembly 19502 with a single light source 19104 on one side and polarization control to efficiently illuminate reflective image source 18720 from both sides. In this case, light source 19104 provides unpolarized light 19114, and diffused light 19508 is unpolarized. The partially reflective film is a polarization beam splitter film 19504 in a solid film holder 19514. Polarization beam splitter film 19504 reflects one polarization state of the diffused light (shown as light 19510) while transmitting the other polarization state (shown as light 19518). Polarization beam splitter film 19504 is folded and continuous, so that the light 19518 with the other polarization state passes through both sides of the folded polarization beam splitter film 19504. This light 19518 then passes through a quarter-wave retarder film 19524, which changes the polarization state from linear to circular. The circularly polarized light is then reflected by mirror 19528 and passes back through quarter-wave retarder film 19524, which changes the polarization state from circular back to linear but in the other polarization state (shown as light 19520), so that light 19520 is then reflected by polarization beam splitter film 19504 toward reflective image source 18720. Reflective image source 18720 is therefore illuminated on both sides with light of the same polarization state, provided by the single light source 19104 in display assembly 19502. Because diffused light 19508 is unpolarized and both polarization states (19510, 19518) are used to illuminate reflective image source 18720, substantially all of the light provided by the light source is converted into image light (19512, 19522). Image light (19512, 19522) is provided directly to the associated imaging optics. Again, the height of diffuser 19108 is half that of the diffuser 18704 shown in Figure 187, thereby providing a compact and efficient frontlight and display assembly.
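The retarder-and-mirror recycling step described above can be checked with Jones calculus: a double pass through a quarter-wave retarder with its fast axis at 45 degrees acts as a half-wave plate and flips the linear polarization state. The sketch below is an illustrative model of my own (the mirror is idealized as an identity), not taken from the patent:

```python
# Hypothetical Jones-calculus sketch (not from the patent): light
# transmitted by the beam splitter (one linear state) passes a
# quarter-wave retarder, reflects from the mirror, and passes the
# retarder again.  The double pass flips the linear state, so the
# returning light is now reflected toward the image source.
import math

SQ2 = 1.0 / math.sqrt(2.0)
# Quarter-wave retarder, fast axis at 45 degrees (up to a global phase)
QWP45 = [[SQ2, -1j * SQ2], [-1j * SQ2, SQ2]]

def apply(m, v):
    """Multiply a 2x2 Jones matrix by a 2-component Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

horizontal = [1.0 + 0j, 0j]                    # transmitted linear state
out = apply(QWP45, apply(QWP45, horizontal))   # retarder -> mirror -> retarder

powers = [abs(c) ** 2 for c in out]
print([round(p, 6) for p in powers])           # all power now in the vertical state
```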
Figure 196 shows a display assembly 19602 with a geometric arrangement similar to that shown in Figure 195, but with polarization beam splitter film 19604 unsupported except at its edges, to reduce the weight of the frontlight while still providing a low diffuser height relative to the width of the illuminated reflective image source.
Figure 197 shows a further embodiment of the invention, comprising a display assembly 19702 with two light sources 19704 and 19708 and a folded polarization beam splitter film 19714, wherein both sides of folded polarization beam splitter film 19714 are curved. The light 19718, 19720 from light sources 19704, 19708 is unpolarized, and diffusers 19710, 19712 do not include polarizers, so diffused light 19722, 19724 is also unpolarized. The curved and angled sides of polarization beam splitter film 19714 redirect one polarization state of the diffused light (shown as light 19728, 19730) toward reflective image source 18720, while also converging light 19728, 19730 onto the imaging area of reflective image source 18720. In this display assembly, the two light sources 19704, 19708 and the folded polarization beam splitter film 19714 work in a complementary fashion, because polarization beam splitter film 19714 is continuous. Thus, on each side of display assembly 19702, unpolarized diffused light 19722, 19724 is provided, the first polarization state (typically S) is redirected by polarization beam splitter film 19714 toward reflective image source 18720, and the light 19740, 19738 with the other polarization state (typically P) is transmitted by polarization beam splitter film 19714. The transmitted light 19740, 19738 with the other polarization state passes through both sides of the folded polarization beam splitter film 19714, so that it reaches the diffuser 19712, 19710 on the opposite side, respectively. When light 19740, 19738 strikes the diffuser 19712, 19710 on the opposite side, the light is reflected by the diffuser, and in the process the light becomes unpolarized. Reflectors can be added to light sources 19704, 19708 and the surrounding areas to increase the reflection of light 19740, 19738. This diffusely reflected unpolarized light then mixes with the diffused light 19722, 19724 provided by light sources 19704, 19708 on the respective side and passes back toward polarization beam splitter film 19714, where the light 19730, 19728 with the first polarization state is reflected toward the reflective image source and the light 19738, 19740 with the other polarization state is transmitted, and this process repeats continuously. Therefore, in this embodiment of the invention, the light of the other polarization state is continuously recycled, thereby increasing the efficiency of display assembly 19702, because both polarization states of the light 19718, 19720 provided by the two light sources 19704, 19708 are used to illuminate reflective image source 18720. The diffuse re-reflection of the recycled light also increases the uniformity of the illumination provided to reflective image source 18720. Image light (19732, 19734) can be provided directly to the associated imaging optics.
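The efficiency gain of the recycling loop described above can be modelled as a geometric series: on each pass, half of the unpolarized light is used and the other half is depolarized and tries again. The sketch below is an illustrative model of my own (the 10% per-cycle loss is an assumed figure, not from the patent):

```python
# Hypothetical sketch (not from the patent): the recycling loop of
# Figure 197 as a geometric series.  On each pass half of the
# unpolarized light (the S state) is sent to the image source; the
# other half is transmitted, depolarized at the opposite diffuser with
# some assumed loss, and tries again.  With loss the usable fraction
# falls short of 1 but stays well above the 50% of a non-recycling
# polarized frontlight.

def recycled_fraction(passes: int, loss_per_cycle: float) -> float:
    used, remaining = 0.0, 1.0
    for _ in range(passes):
        used += 0.5 * remaining                    # S state reflected to image source
        remaining *= 0.5 * (1.0 - loss_per_cycle)  # P state recycled, minus loss
    return used

print(round(recycled_fraction(1, 0.1), 4))   # single pass: 50% used, no recycling yet
print(round(recycled_fraction(20, 0.1), 4))  # converges near 0.5 / (1 - 0.45) ~ 91%
```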
A method similar to that described above for Figure 197 can be used in another embodiment in which the display assembly has flat surfaces on each side of the folded polarization beam splitter film. In this embodiment, because each side of the reflective polarizer film is flat, the illumination uniformity of the light from the edge lights, as provided by the diffusers, is maintained.
In another embodiment of the display assembly shown in Figure 197, a solid film holder can be used while the light of the other polarization state is recycled to increase efficiency. In this embodiment, each side of the folded polarization beam splitter film can be flat or curved.
Figure 198 shows an illustration of a method for making the frontlight 19902 shown in Figure 199, which has a folded reflective beam splitter film 19808 and two light sources, one on each side. In Figure 198, the two light sources are not shown, because they can be part of another assembly step or can be located in the surrounding module. A flowchart of the assembly method is provided in Figure 204. In this method, in step 20402, top 19810 and bottom 19812 film holders are provided. Top and bottom film holders 19810, 19812 can be made of any transparent material by diamond turning, injection molding, compression molding or grinding. The combination of material and manufacturing technique is selected to provide top 19810 and bottom 19812 film holders with low birefringence. Suitable low birefringence materials for film holders 19810, 19812 include glass or plastics such as Zeonex F52R from Zeon Chemicals, APL5514 from Mitsui, or OKP4 from Osaka Gas. In the top and bottom film holders, the surfaces that contact folded polarization beam splitter film 19808 are matched so as to hold film 19808 in place in the desired shape and at the desired angle without introducing significant air gaps, so that the image light is barely deflected by frontlight 19902. In step 20404, bottom film holder 19812 is attached to reflective image source 18720, either by an adhesive or by a surrounding structure that holds bottom film holder 19812 in relation to reflective image source 18720 (either in contact with it or at a specified distance). In step 20408, the polarization beam splitter film is folded. Then, in step 20410, folded polarization beam splitter film 19808 is placed onto lower film holder 19812 and upper film holder 19810 is placed on top, thereby forcing polarization beam splitter film 19808 to conform to the matching surfaces of the top 19810 and bottom 19812 film holders. In an alternative embodiment of the method of the invention, adhesive is applied to the surfaces of the top 19810 or bottom 19812 film holders, so that polarization beam splitter film 19808 adheres to the top 19810 or bottom 19812 film holders. In step 20412, diffusers 19802, 19804 are attached to the sides of the bottom 19812 film holder. A schematic representation of the assembled frontlight 19902 is shown in Figure 199. The frontlights shown in Figures 191, 194 and 195 can be made by similar methods. The order of assembly can be changed within the scope of the invention.
In an alternative embodiment of the above method, film holders 19810, 19812 and folded polarization beam splitter film 19808 are assembled together before film holders 19810, 19812 are attached to diffusers 19802, 19804, reflective image source 18720 or any other part. Steps 20402, 20408 and 20410 are then carried out in order, to make a solid film holder with a folded polarization beam splitter film 19808 inside, similar to those shown in Figures 191, 194 and 195. Reflective image source 18720 and diffusers 19802, 19804 are attached later (steps 20404, 20412).
Various methods can be used to hold the reflective beam splitter film in place between the top and bottom film holders. The film can be adhered in place to the top or bottom film holder. The top or bottom film holder can be adhered to part of the surrounding structure (not shown) or to the associated imaging optics (not shown). When the reflective beam splitter film is a polarization beam splitter film with a wire-grid polarizer, the performance of the wire-grid polarizer may be degraded if adhesive is used on the wire-grid side. In this case, the polarization beam splitter film can be adhered to the top or bottom film holder on the side opposite the wire-grid structure (depending on which of the top or bottom film holders adjoins the wire-grid structure). The adhesive used to adhere the polarization beam splitter film to a film holder must be transparent and of low birefringence. Examples of suitable adhesives include UV curable adhesives and pressure sensitive adhesives.
Figures 200-203 show a series of illustrations of another method for manufacturing a frontlight with dual side lights, and Figure 205 is a flow chart listing the steps of the method. In this method, the top and bottom film retainers can be cast in place around the folded reflective beam splitter film. In step 20502, the polarizing beam splitter film 20008 is folded. In step 20504, the folded polarizing beam splitter film 20008 is inserted into side frames that have grooves or mating features to hold the film 20008 in the shape required for the frontlight (see the curved shape shown in Figure 200). In step 20508, the side frames are then attached to the reflective image source 18720. In step 20510, diffusers 20002, 20004 are attached to either side of the side frames. At this point, the folded polarizing beam splitter film 20008 is surrounded on its sides by the side frames and diffusers 20002, 20004, and on its bottom by the reflective image source 18720. Figure 200 shows an illustration of the reflective image source 18720 with the attached diffusers 20002, 20004 and the otherwise unsupported reflective beam splitter film 20008, which is supported at its edges so that the required shape is imparted to it.
Figure 201 shows holes in the side frames or surrounding structure that are used to introduce transparent casting material under the folded reflective beam splitter film. As shown, the larger hole 20102, near the reflective image source 18720, is used to introduce the transparent casting material, and the smaller hole 20104 allows air to escape from under the folded reflective beam splitter film 20008. In this method, the folded reflective beam splitter film 20008 forms a closed cavity over the reflective image source 18720, bounded by the diffusers 20002, 20004 and the side frames or surrounding structure. As the transparent casting resin slowly fills through hole 20102, air from the closed cavity is expelled through the smaller hole 20104. When the cavity is full, portions of the transparent casting material exit through hole 20104, thereby preventing pressure from building up under the reflective beam splitter film 20008, which could distort the shape of the film. Holes 20102 and 20104 can then be plugged to keep the transparent casting material from leaking out.
In step 20512, transparent liquid casting material 20202 is poured over the top of the polarizing beam splitter film 20008, as shown in Figure 202. In step 20514, a transparent top sheet or plate 20302 is then applied to give the material 20202 a flat top surface, as shown in Figure 203. When the transparent casting material is applied to the flat sheet of transparent material, care must be taken to prevent air from being trapped under the sheet. Stops are provided on the surrounding structure so that the flat sheet of transparent material is held parallel to the reflective image source.
The transparent liquid casting material can be any transparent liquid casting material, such as an epoxy, acrylic or urethane. The same transparent liquid casting material should be used for the top and bottom film retainers, so that the image light is exposed to a solid block of uniform optical thickness and is not deflected at the surfaces of the folded polarizing beam splitter film. The transparent casting material can be cured after casting by allowing a set time, by exposure to UV, or by exposure to heat. Curing of the transparent casting material can be carried out in a single step or in multiple steps: the bottom portion shown in Figure 201 can be cured before the top is cast as shown in Figure 202, or the entire cast frontlight can be cured after the step shown in Figure 203.
An advantage of the method shown in Figures 200-203 is that intimate contact is obtained between the transparent casting material and the reflective beam splitter film, so light can pass unimpeded through the various portions of the frontlight. This casting process can also be used with a solid top or bottom film retainer, so that only the top or only the bottom film retainer is cast. Although Figures 200-203 show the manufacture of a frontlight with curved surfaces, the method can also be used to manufacture frontlights with flat surfaces.
In another embodiment, one of film retainer is manufactured to solid members, and another film retainer casts in appropriate location with folding polarization beam apparatus film.Folding polarization beam apparatus film can be adhered to solid members before another film retainer is cast in to correct position.In this way, the film retainer of casting will have and the surperficial close contact of polarization beam apparatus film.Should there is the refractive index identical with the film retainer of casting for the material of solid film retainer, to avoid the making deflection of image light in the time that image light self-reflection image source passes to the image optics device being associated.Suitably the example of the material of coupling is the APEC2000 from Bayer company, and this material has 1.56 refractive index, and can be used to the injection moulding from the EpoxAcast690 of Smooth-On company, and EpoxAcast690 has 1.565 refractive index and can cast.
In yet another embodiment of the invention, a solid film retainer is manufactured using the multistep molding process shown in the flow chart of Figure 206. In step 20602, the bottom film retainer is molded; suitable molding techniques include injection molding, compression molding and casting. In step 20604, the polarizing beam splitter film is folded. In step 20608, the folded polarizing beam splitter film is placed onto the molded bottom film retainer, which is then placed as an insert into the mold for the top film retainer. In step 20610, the top film retainer is then molded over the folded polarizing beam splitter film and the bottom film retainer. The net result is a solid film retainer with the folded polarizing beam splitter film inside, as shown in Figures 191, 194 and 195. The advantage of this multistep molding technique is that the folded polarizing beam splitter film is forced to conform to the surface of the bottom film retainer, and the top and bottom film retainers are in intimate contact with the folded polarizing beam splitter film. In a preferred embodiment, the refractive indices of the top and bottom film retainers are identical to within 0.03. In another preferred embodiment, the glass transition temperature of the material used for the bottom film retainer is higher than that of the material used for the top film retainer, or the bottom film retainer material is crosslinked, so that the bottom film retainer does not deform when the top film retainer is molded over the folded polarizing beam splitter film and bottom film retainer. One example of a suitable combination of injection-moldable materials is a pair of cyclic olefin materials, such as Zeonex E48R from Zeon Chemicals, with a Tg of 139 C and a refractive index of 1.53, and Topas 6017 from Topas Advanced Polymers, with a Tg of 177 C and a refractive index of 1.53.
It will be appreciated that some embodiments of the AR eyepiece of the present invention permit high-modulation-transfer-function combinations of resolution level and device dimensions (such as frame thickness) that were not previously achievable. For example, in certain embodiments, the virtual image pixel resolution presented to the user can be in the range of approximately 28 to 46 pixels per degree.
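The pixels-per-degree figure above is simply the display's horizontal pixel count divided by the horizontal field of view of the virtual image. A minimal sketch (the 1280-pixel panel width is borrowed from the LCoS resolution mentioned later in this document; the field-of-view values are illustrative assumptions, not figures from the patent):

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Angular pixel density of a virtual image, in pixels per degree."""
    return horizontal_pixels / horizontal_fov_deg

# A 1280-pixel-wide panel spread over several hypothetical fields of view:
for fov in (28.0, 32.0, 46.0):
    print(f"{fov:4.1f} deg FOV -> {pixels_per_degree(1280, fov):.1f} px/deg")
```

Note that for a fixed panel, a narrower field of view yields a higher angular resolution, which is the trade-off the 28-46 px/deg range reflects.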
With reference to Figures 105A to C, the curve of the wire-grid polarizer controls the direction of the image light, and the width of the curve controls the width of the image light. The curve allows a narrow light source to be used, because when light strikes the curve it is scattered, and the folded/reflected light then illuminates the image display uniformly. The image light passing back through the wire-grid polarizer is not disturbed. The curve therefore also allows miniaturization of the optics assembly.
In Figures 21-22, an augmented reality eyepiece 2100 includes a frame 2102 and left and right temple arms or temple parts 2104. Protective lenses 2106, such as ballistic lenses, are mounted on the front of the frame 2102 to protect the user's eyes or, if they are prescription lenses, to correct the user's view of the surrounding environment. The front portion of the frame can also be used to mount a camera or image sensor 2130 and one or more microphones 2132. Not visible in Figure 21, waveguides are mounted in the frame 2102 behind the protective lenses 2106, one on each side of the center or adjustable nose bridge 2138. The front cover 2106 may be interchangeable, so that tints or prescriptions can be changed easily for the particular user of the augmented reality device. In one embodiment, each lens is quickly interchangeable, allowing a different prescription for each eye. In one embodiment, the lenses are quickly interchangeable with snap-fits, as discussed elsewhere herein. Certain embodiments may have only a projector and waveguide combination on one side of the eyepiece, while the other side may be fitted with a conventional lens, a reading lens, a prescription lens, or the like. The left and right temple arms 2104 each have a projector or micro-projector 2114 or other image source mounted vertically atop a spring-loaded hinge 2128, for easier assembly and vibration/shock protection. Each temple part also includes a temple housing 2116 for mounting the associated electronics of the eyepiece, and each may also include an elastomeric head-grip pad 2120 for better retention on the user. Each temple part also includes an extending wrap-around earbud 2112 and an orifice 2126 for mounting a headstrap 2142.
As noted, the temple housing 2116 contains the electronics associated with the augmented reality eyepiece. The electronics may include several circuit boards as shown, such as a microprocessor and radio board 2122, a communications system-on-a-chip (SOC) board 2124, and an open multimedia applications processor (OMAP) board 2140. The communications system-on-a-chip (SOC) may include electronics for one or more communications capabilities, including a wireless local area network (WLAN), Bluetooth™ communications, frequency modulation (FM) radio, a global positioning system (GPS), a 3-axis accelerometer, one or more gyroscopes, and the like. In addition, the right temple part may include an optical trackpad (not shown) on the outside of the temple part, for user control of the eyepiece and of one or more applications.
In one embodiment, a digital signal processor (DSP) may be programmed and/or configured to receive video feed information and to configure the video feed to drive whatever type of image source is being used with the optical display. The DSP may include a bus or other communication mechanism for communicating information, and an internal processor coupled with the bus for processing the information. The DSP may include a memory coupled to the bus for storing information and the instructions to be executed, such as random access memory (RAM) or other dynamic storage (e.g., dynamic RAM (DRAM), static RAM (SRAM) and synchronous DRAM (SDRAM)). The DSP can include a non-volatile memory coupled to the bus for storing static information and instructions for the internal processor, such as read-only memory (ROM) or other static storage (e.g., programmable ROM (PROM), erasable PROM (EPROM) and electrically erasable PROM (EEPROM)). The DSP may include special-purpose logic devices (e.g., application-specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs) and field-programmable gate arrays (FPGAs)).
The DSP may include at least one computer-readable medium or memory for holding programmed instructions and for containing the data structures, tables, records or other data needed to drive the optical display. Examples of computer-readable media suitable for the present invention are compact discs, hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, Flash EPROM), DRAM, SRAM, SDRAM or any other magnetic medium, compact discs (e.g., CD-ROM) or any other optical medium, punch cards, paper tape or any other physical medium with patterns of holes, a carrier wave (described below), or any other medium from which a computer can read. Various forms of computer-readable media may be involved in carrying out one or more sequences of one or more instructions for the optical display. The DSP may also include a communication interface to provide a data communication coupling to a network link connected to, for example, a local area network (LAN), or to another communications network such as the Internet. Wireless links may also be implemented. In any such implementation, a suitable communication interface can send and receive electrical, electromagnetic or optical signals that carry digital data streams representing various types of information (such as video information) to the optical display.
The eyepiece may provide context-aware capture of video, in which capture parameters are adjusted based on the motion of the viewer; the parameters may include image resolution, video compression ratio, frames per second, and the like. The eyepiece may be used for a variety of video applications, such as recording video captured by the integrated camera or sent from an external video device, playing back video to the wearer through the eyepiece (by the methods and systems described herein), streaming video from an external source (e.g., a conference call, a live news feed, a video stream from another eyepiece), displaying live video from the integrated camera (e.g., from an integrated non-line-of-sight camera), and so on. In embodiments, the eyepiece may accommodate multiple video applications presented to the wearer at once, such as viewing a streamed external video link while simultaneously playing back a video file stored on the eyepiece. The eyepiece may provide a 3D viewing experience, such as by providing images to each eye, or alternatively a simplified 3D experience, such as by providing a reduced amount of content to one of the two eyes. The eyepiece may provide text-enhanced video, such as when ambient conditions are too noisy for the included audio to be heard, when the audio is in a language foreign to the user, when the user wants a transcription of the recorded audio, and the like.
In embodiments, the eyepiece may provide context-aware video applications, such as adjusting at least one parameter of video capture and/or viewing as a function of the wearer's environment. For example, video may be presented through the eyepiece to the wearer in a non-video external-environment context that requires the wearer's focused attention on that environment, where at least one parameter is adjusted so that the video is presented in a less distracting manner (e.g., adjusting the spatial resolution of the presented video; adjusting the number of frames per second; replacing the video with a still image representative of the video content, such as a stored photo of a person in the video, or a single frame from the video). In another case, video may be captured by the integrated camera on the eyepiece in a context where the wearer is moving (e.g., walking, running, biking, driving), where at least one parameter of the video being captured is adjusted to help accommodate the motion (e.g., making adjustments when the eyepiece senses that the video would blur during rapid motion, or making adjustments while the wearer is walking slowly or otherwise moving).
In embodiments, the at least one parameter may be a spatial resolution parameter (e.g., pixels per region, pixels of a specified color per region, limiting a region to only single ('black and white') pixels), the field of view, frames recorded per unit time, frames presented per unit time, the data compression rate, periods of time that are not recorded/presented, and the like.
In embodiments, the at least one parameter may be adjusted based on inputs sensed by the eyepiece, such as motion-detection inputs used to determine head motion (e.g., to distinguish rapid head movement from slow head movement, as described herein); motion of the surrounding video-capture environment, determined by processing images received through the integrated camera to establish relative motion between the wearer and the environment, or motion within the environment; the wearer's eye movements (as described herein), used to determine whether the video being presented to the wearer is distracting the wearer; ambient light and/or sound conditions; and the like.
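A minimal sketch of the sensed-input-to-parameter mapping described above. Everything here is an illustrative assumption: the threshold values, the parameter tiers, and the names (`CaptureParams`, `adjust_for_motion`) are invented for the example and do not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class CaptureParams:
    fps: int
    resolution: tuple          # (width, height) in pixels
    compression_ratio: int     # higher means more aggressive compression

def adjust_for_motion(head_motion_deg_per_s: float) -> CaptureParams:
    """Pick capture parameters from sensed head motion (thresholds are illustrative)."""
    if head_motion_deg_per_s > 120.0:   # rapid head movement: frames would blur badly
        return CaptureParams(fps=5, resolution=(640, 360), compression_ratio=40)
    if head_motion_deg_per_s > 30.0:    # moderate movement, e.g. walking
        return CaptureParams(fps=15, resolution=(1280, 720), compression_ratio=20)
    return CaptureParams(fps=30, resolution=(1920, 1080), compression_ratio=10)
```

In a real implementation the input would come from the gyroscopes mentioned earlier, and the tiers would likely be continuous rather than stepped; the point is only that each sensed input selects a capture-parameter set.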
In embodiments, the eyepiece may provide image processing to reduce the effects of motion or of the environment on the quality of the wearer's video experience, or on the quality of video being stored as it is captured, such as compensating for slight movements, jumps and rapid motion, or adjusting for the background lighting and/or acoustic environment, such as by adjusting color blending, brightness and the like. The choice of processing may depend on the sensed inputs, the environmental conditions, the video content and so on. For example, high image quality may be preferred in some situations, so that a reduction in quality is unacceptable; the video may therefore be paused under those conditions. In another case, video and/or audio compression may be applied when it is determined that conditions prevent capture at an acceptable quality level but some continuity of capture is still needed. Processing may also be applied differently to each eye of the eyepiece, such as with respect to the wearer's dominant eye, or with respect to different environmental conditions experienced by each eye. Processing can compensate for bright environments, where embedded sensors are used to check the ambient light level so that possible adjustments to the displayed content can be made, such as determining which color channels to compress and/or manipulate based on the environment, modifying color curves/palettes to be more or less visible relative to the environment, changing the color depth or color curves, changing how colors are compressed, and the like.
In embodiments, the eyepiece may initiate an action as a result of sensed conditions, such as stopping video capture when a condition is exceeded (for example, when quality would degrade below a predetermined level), going to a screen-capture mode that continues the audio portion of the video when a predetermined amount of eyepiece motion is exceeded, triggering a change in how video is presented when a motion level in the received video is exceeded, and the like.
In embodiments, the eyepiece may initiate an action as a result of receiving a control signal. The control signal may be based on the position of the eyepiece, the content currently being viewed on the eyepiece, or a user gesture. The action may be the uploading or downloading of video captured by the eyepiece to a storage location. The action may be initiated on mere receipt of the control signal, or on receipt of the control signal together with a user-initiated acknowledgment of the control signal. The action may be moving to a specified location in the video being displayed by the glasses, initiating a process to bookmark a specified location in the video being displayed by the glasses, and the like.
In embodiments, the adjustments made as a result of sensed conditions may be governed by user preferences, organizational policies, state or federal regulations, and the like. For example, one preference may be to always provide a certain quality, resolution, compression ratio and so on, regardless of what the sensed inputs indicate.
In an example, the wearer of the eyepiece may be recording video in an environment where their head, and therefore the integrated camera of the eyepiece, is shaking rapidly. In this case, the eyepiece may adjust at least one parameter to reduce the degree to which shaking degrades the captured video, such as increasing the compression ratio applied to the video, reducing the number of frames captured over a period of time (e.g., capturing one frame every few seconds), discarding frames that change greatly between one frame and the next, reducing the spatial resolution, and the like.
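One of the mitigations above, discarding frames that change greatly from the previous frame, can be sketched with simple frame differencing. This is an illustrative toy, not the patent's method: frames are stand-in brightness lists rather than real images, and the name `filter_shaky_frames` and the threshold are invented for the example.

```python
def filter_shaky_frames(frames, max_mean_diff=30.0):
    """Keep the first frame, then discard any frame whose mean absolute
    brightness difference from the last kept frame exceeds the threshold.
    `frames` is a list of equal-length brightness lists (a stand-in for images)."""
    kept = [frames[0]]
    for frame in frames[1:]:
        prev = kept[-1]
        mean_diff = sum(abs(a - b) for a, b in zip(frame, prev)) / len(frame)
        if mean_diff <= max_mean_diff:   # small change: keep the frame
            kept.append(frame)
    return kept

# Steady, steady, sudden jolt, steady again -> the jolt frame is dropped.
clips = [[100] * 4, [105] * 4, [200] * 4, [108] * 4]
print(len(filter_shaky_frames(clips)))   # 3
```

A production version would operate on pixel arrays and combine this with the gyroscope-based thresholds, but the keep/discard logic is the same shape.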
In an example, the wearer of the eyepiece may be using video conferencing through the eyepiece when the eyepiece senses, through motion sensors, that the wearer is moving. As a result, a still image may replace a participant's video feed during the motion, such as an image of one of the other participants, or an image of the user as sent to the other members. In this way, the distracting effects of the wearer's motion, both on the wearer and on the other participants in the video conference, can be reduced.
In an example, the wearer may be watching a video and then start to drive, which could become a safety issue if the wearer continues to watch the currently displayed video. In this case, the eyepiece may use detected motion of the environment as an indication of being in a car, and change the viewing experience to something less distracting, such as when the wearer's eye movements indicate that the user is alternating rapidly between the line of sight (the direction of driving) and the displayed video. The eyepiece may, for example, stop the video and offer the viewer the option of resuming. The eyepiece may also sense environmental motion so as to distinguish among being in a car, on a bicycle, walking and so on, and adjust accordingly.
In an example, the wearer may need assistance navigating to a location, whether in a car, on a bicycle, walking or otherwise. In this case, the eyepiece may display a video navigation application to the user. The navigation instructions the eyepiece displays to the user may be selected by a control signal. The control signal may be generated from a position specified by the wearer, from the content currently displayed by the glasses, or from a destination spoken by the wearer. The location may be one of food/drink, education, events, exercise, home, outdoors, retail stores, transportation locations, and the like.
In an example, the wearer may be capturing video where the surrounding environment is distracting or degrades the quality of the video in some respect, such as through color contrast, blending, depth, resolution, brightness and the like. The eyepiece may adjust for the wearer being outdoors versus indoors, under different lighting conditions, under unfavorable sound conditions, and so on. In such cases, the eyepiece may adjust the recorded images and sound to obtain a video product that more efficiently represents the content being captured.
In embodiments, the eyepiece may provide an external interface to computer peripherals, such as monitors, displays, TVs, keyboards, mice, memory storage (e.g., external hard drives, optical drives, solid-state memory), network interfaces (e.g., a network interface to the Internet) and the like. For example, the external interface may provide a direct connection to an external computer peripheral (e.g., connecting directly to a monitor), an indirect connection to an external computer peripheral (e.g., through a central external peripheral interface device), connection by wire, connection wirelessly, and so on. In an example, the eyepiece may be connected to a central external peripheral interface device that provides connections to external peripherals, where the external peripheral interface device may include computer interface facilities such as a computer processor, memory, an operating system, peripheral drivers and interfaces, USB ports, external display interfaces, network ports, speaker interfaces, microphone interfaces and the like. In embodiments, the eyepiece may be connected to the central external peripheral interface by a wired connection, a wireless connection, directly in a cradle, and the like, and when connected, the eyepiece may be provided with computing facilities similar or identical to those of a personal computer. In embodiments, a device to be controlled through the eyepiece may be selected by the user looking at the device, pointing at the device, selecting it from a user interface displayed by the eyepiece, and so on. In other embodiments, the eyepiece may display the device's user interface when the user looks at or points to the device.
The general shape of the frame 2102 is that of a pair of wrap-around sunglasses. The sides of the glasses include shape-memory alloy straps 2134, such as Nitinol straps. Nitinol or other shape-memory alloy straps fit the user of the augmented reality eyepiece: the straps are tailored so that when they are worn by the user and warmed to near body temperature, they assume their trained or preferred shape. In embodiments, fitting the eyepiece may involve alignment techniques and measurements for the user's eye width. For example, the position of the projected display may be adjusted and/or aligned into a proper position for the wearer of the eyepiece, to accommodate the various eye widths of different wearers. The positioning and/or alignment may be automatic, such as through detection of the position of the wearer's eyes via the optical system (e.g., iris or pupil detection), or manual, such as by the wearer, and so forth.
Further features of this embodiment include detachable noise-canceling earbuds. As seen in the figure, the earbuds are intended to connect to the controls of the augmented reality eyepiece to deliver sound to the user's ears. The sound may include inputs from the wireless Internet or telecommunications capabilities of the augmented reality eyepiece. The earbuds may also include soft, deformable plastic or foam portions, so that the user's inner ears are protected in a manner similar to earplugs. In one embodiment, the earbuds limit the input to the user's ear to about 85 dB. This allows normal hearing by the wearer while providing protection from gunshot noise or other explosive noises, and allows hearing in high-background-noise environments. In one embodiment, the controls of the noise-canceling earbuds have an automatic gain control for very fast adjustment of the canceling feature to protect the wearer's ears.
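The 85 dB ceiling above can be sketched as a simple output limiter: attenuate only when the input level would exceed the ceiling. This is an illustrative sketch, not the patent's circuit; a real automatic gain control would also smooth the gain over time (attack/release), which is omitted here.

```python
def limiter_gain_db(input_level_db: float, ceiling_db: float = 85.0) -> float:
    """Gain change (in dB) so the output never exceeds the ceiling.
    Zero for quiet input; negative (attenuation) above the ceiling."""
    return min(0.0, ceiling_db - input_level_db)

def db_to_linear(gain_db: float) -> float:
    """Convert a dB gain to a linear amplitude factor."""
    return 10.0 ** (gain_db / 20.0)

# A ~140 dB gunshot is cut by 55 dB; normal ~60 dB speech passes untouched.
print(limiter_gain_db(140.0))   # -55.0
print(limiter_gain_db(60.0))    # 0.0
```

The linear factor is what would actually scale the audio samples; for example, -20 dB corresponds to multiplying the amplitude by 0.1.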
Figure 23 depicts the layout of a vertically arranged projector 2114 in an eyepiece 2300, in which the illuminating light passes from bottom to top through one side of the PBS on its way to the display and imager board, which may be a silicon-backplane display. When the illuminating light strikes the internal interface of the prisms forming the polarizing beam splitter, it is redirected as image light, reflected out of the projector and into the waveguide lens. In this example, the dimensions of the projector are shown with the width of the imager board at 11 mm, the distance from one end of the imager board to the image centerline at 10.6 mm, and the distance from the image centerline to one end of the LED board at about 11.8 mm.
A detailed, assembled view of the components of the projector discussed above can be seen in Figure 25. This view depicts how compact the micro-projector 2500 is when assembled, for example, near the hinge of the augmented reality eyepiece. The micro-projector 2500 includes a housing and a holder 2508 for mounting certain optical components. As each color field is imaged by the optical display 2510, the corresponding LED color is turned on. An RGB LED light engine 2502 is depicted near the bottom, mounted on a heat sink 2504. The holder 2508 is mounted atop the LED light engine 2502, the holder mounting a light tunnel 2520, a diffuser lens 2512 (to eliminate hotspots) and a condenser lens 2514. Light passes from the condenser lens into a polarizing beam splitter 2518 and then to a field lens 2516. The light is then refracted onto the LCoS (liquid crystal on silicon) chip 2510, where an image is formed. The light for the image is then reflected back through the field lens 2516, and is polarized and reflected 90° by the polarizing beam splitter 2518. The light then leaves the micro-projector for transmission to the optical display of the glasses.
Figure 26 depicts an exemplary RGB LED module 2600. In this example, the LED is a 2×2 array with 1 red, 1 blue and 2 green dice, and the LED array has 4 cathodes and a common anode. The maximum current may be 0.5 A per die, and a maximum voltage (≈4 V) may be needed for the green and blue dice.
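As a back-of-the-envelope check on the figures above, worst-case electrical power per die is just current times forward voltage. The 0.5 A and ≈4 V values come from the text; the red forward voltage of ~2.2 V is an outside assumption (typical of red LEDs, not stated in the patent).

```python
def die_power_w(current_a: float, forward_voltage_v: float) -> float:
    """Worst-case electrical power dissipated by one LED die."""
    return current_a * forward_voltage_v

# 2x2 array: 1 red, 1 blue, 2 green dice.
red = die_power_w(0.5, 2.2)             # red Vf ~2.2 V is an assumption
green_and_blue = 3 * die_power_w(0.5, 4.0)   # text gives ~4 V for green and blue
print(f"worst-case array power: {red + green_and_blue:.1f} W")   # 7.1 W
```

A dissipation on this order explains why the light engine in Figure 25 is mounted directly on a heat sink.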
In embodiments, the system may utilize an optical system capable of generating a monochrome display to the wearer, and this monochrome display may provide benefits in image sharpness, image resolution, frame rate and the like. For example, the frame rate may be tripled (compared with an RGB system), which may be useful in night vision or similar situations in which a camera images the surroundings and these images are processed and displayed as content. The image may also be brighter, such as three times brighter if three LEDs are used, or a single LED may be used to save space. If multiple LEDs are used, they may be the same color or different colors (RGB). The system may be a switchable monochrome/color system in which RGB is used, but when the wearer wants monochrome, a single LED or multiple LEDs can be selected. All three LEDs may be used simultaneously, rather than sequentially, to obtain white light; using three LEDs non-sequentially behaves like any other white light source, with the frame rate tripled. The 'switch' between monochrome and color may be made 'manually' (e.g., with a physical button or a GUI selection), or the switch may be made automatically depending on the application that is running. For example, the wearer may enter a night-vision or fog-penetrating mode, and the processing portion of the system automatically determines that the eyepiece needs to enter the monochrome high-refresh-rate mode.
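The tripled frame rate claimed above follows from field-sequential color: the panel shows one color field at a time, so a full RGB frame costs three panel refreshes while a monochrome frame costs one. A minimal sketch (the 180 Hz panel field rate is an illustrative assumption, not a figure from the patent):

```python
def effective_frame_rate(panel_field_rate_hz: float, fields_per_frame: int) -> float:
    """A field-sequential display shows one color field per refresh, so a full
    frame costs `fields_per_frame` panel refreshes."""
    return panel_field_rate_hz / fields_per_frame

panel = 180.0                             # hypothetical panel field rate, Hz
rgb = effective_frame_rate(panel, 3)      # R, G, B fields per frame
mono = effective_frame_rate(panel, 1)     # single field per frame
print(rgb, mono, mono / rgb)              # 60.0 180.0 3.0
```

The same arithmetic shows why driving all three LEDs simultaneously (white light, one field per frame) also yields the tripled rate.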
Figure 3 depicts an embodiment of a horizontally disposed projector in use. The projector 300 may be placed in an arm of the eyepiece frame. The LED module 302, under processor control 304, may emit a single color at a time in rapid sequence. The emitted light may travel down a light tunnel 308 and through at least one homogenizing lenslet 310 before encountering a polarizing beam splitter 312 and being deflected toward the LCoS display 314, where a full color image is displayed. The LCoS display may have a resolution of 1280×720p. The image may then be reflected back through the polarizing beam splitter, reflected off a fold mirror 318, and travel through a collimator on its way out of the projector and into the waveguide. The projector may include a diffractive element to eliminate aberrations.
In one embodiment, an interactive head-mounted eyepiece includes an optical assembly through which a user views the surrounding environment and displayed content, wherein the optical assembly includes a corrective element that corrects the user's view of the surrounding environment, a free-form-surface optical waveguide enabling internal reflection, and a coupling lens positioned to direct an image from an optical display, such as an LCoS display, to the optical waveguide. The eyepiece also includes one or more integrated processors for handling content for display to the user, and an integrated image source, such as a projector facility, for introducing the content to the optical assembly. In embodiments in which the image source is a projector, the projector facility includes a light source and the optical display. Light from the light source, such as an RGB module, is emitted under control of the processor and traverses a polarizing beam splitter, where it is polarized before being reflected off the optical display, such as an LCoS display or, in certain other embodiments, an LCD display, after which it enters the optical waveguide. A surface of the polarizing beam splitter may reflect the color image from the optical display into the optical waveguide. The RGB LED module may emit light sequentially to form the color image reflected off the optical display. The corrective element may be a see-through correction lens attached to the optical waveguide that enables proper viewing of the surrounding environment whether the image source is on or off. The corrective element may be a wedge-shaped correction lens, and may be prescription, tinted, coated and the like. The free-form-surface optical waveguide, which may be described by higher-order polynomials, may include dual free-form surfaces that enable curvature and sizing of the waveguide. The curvature and sizing of the waveguide enable its placement in the frame of the interactive head-mounted eyepiece. The frame may be sized to fit a user's head in a manner similar to sunglasses or eyeglasses. Other elements of the optical assembly of the eyepiece include a homogenizer, through which light from the light source is propagated to ensure that the light beam is uniform, and a collimator, which improves the resolution of the light entering the optical waveguide.
In embodiments, a prescription lens may be mounted on the inside or outside of the eyepiece lens. In certain embodiments, the prescription power may be split between prescription lenses mounted on the outside and the inside of the eyepiece lens. In embodiments, the prescription correction is provided by corrective optics, such as a component that adheres by surface tension to the eyepiece lens or to a component of the optical assembly, such as the beam splitter. In embodiments, the corrective optics may be located partly in one position of the optical path and partly in another. For example, half of the corrective optics may be mounted on the outside of the converging plane of the beam splitter, and the other half on the inside of the converging plane. In this way, correction may be provided differently to the image light from the internal source and to scene light; that is, light from the source may be corrected only by the portion of the corrective optics on the inside of the converging plane, since the image is reflected to the user's eye, while scene light is corrected by both portions, because that light is transmitted through the beam splitter and is thus exposed to a different optical correction. In another embodiment, the optical assembly associated with the beam splitter may be a sealed assembly, such as to make the assembly waterproof, dustproof, and the like, wherein the inner surface of the sealed optical assembly carries one portion of the corrective optics and the outer surface of the sealed optical assembly carries another portion. Suitable optics may be provided by 3M's Press-On Optics, a product available at least as prisms (e.g., Fresnel prisms), aspheric minus lenses, aspheric plus lenses, and bifocal lenses. The corrective optics may be a user-removable and interchangeable vision correction facility adapted to be removably attached in a proper position between the user's eye and the displayed content, such that the vision correction facility corrects the user's eyesight with respect to both the displayed content and the surrounding environment. The vision correction facility may be adapted to mount to the optical assembly. The vision correction facility may be adapted to mount to the head-mounted eyepiece. The vision correction facility may be adapted for a friction fit. The vision correction facility may be mounted with a magnetic attachment facility. The user may choose from a plurality of different vision correction facilities depending on the user's eyesight.
In embodiments, the present invention may provide snap-fit corrective optics for the eyepiece, such as where a user-removable and interchangeable vision correction facility is adapted to be removably attached in a proper position between the user's eye and the displayed content, such that the vision correction facility corrects the user's eyesight with respect to both the displayed content and the surrounding environment. The vision correction facility may be adapted to mount to the optical assembly, the head-mounted eyepiece, and the like. The vision correction facility may be mounted with a friction fit, a magnetic attachment facility, and the like. The user may choose from a plurality of different vision correction facilities depending on the user's eyesight.
With reference to Fig. 4, the image light, which may be polarized and collimated, may optionally traverse a display coupling lens 412, which itself may or may not be the collimator or may be in addition to the collimator, and enter the waveguide 414. In embodiments, the waveguide 414 may be a freeform waveguide, where the surfaces of the waveguide are described by a polynomial equation. Alternatively, the waveguide may be straight. The waveguide 414 may include two reflective surfaces. When the image light enters the waveguide 414, it may strike a first surface at an angle of incidence greater than the critical angle above which total internal reflection (TIR) occurs. The image light may engage in TIR bounces between the first surface and the opposing second surface, eventually reaching the active viewing area 418 of the composite lens. In one embodiment, the light may engage in at least three TIR bounces. Because the waveguide 414 is tapered to enable the TIR bounces to eventually exit the waveguide, the thickness of the composite lens 420 may not be uniform. Distortion through the viewing area of the composite lens 420 may be minimized by disposing a wedge-shaped corrective lens 410 along the length of the freeform waveguide 414 to provide at least a uniform thickness across the viewing area of the lens 420. The corrective lens 410 may be a prescription lens, a tinted lens, a polarized lens, a ballistic lens, and the like, mounted on the inside or the outside of the eyepiece lens, or, in certain embodiments, mounted on both the inside and the outside of the eyepiece lens.
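The TIR condition above can be made concrete with Snell's law: light inside the waveguide is totally internally reflected when its angle of incidence exceeds the critical angle arcsin(n_outside / n_waveguide). A minimal sketch, assuming a refractive index of about 1.5 for the optical plastic (an illustrative value, not one stated in the text):

```python
import math

def critical_angle_deg(n_waveguide: float, n_outside: float = 1.0) -> float:
    """Critical angle (degrees) above which light inside the waveguide
    undergoes total internal reflection at the waveguide/air interface."""
    return math.degrees(math.asin(n_outside / n_waveguide))

def undergoes_tir(incidence_deg: float, n_waveguide: float,
                  n_outside: float = 1.0) -> bool:
    """True if a ray striking the surface at this angle of incidence
    (measured from the surface normal) is totally internally reflected."""
    return incidence_deg > critical_angle_deg(n_waveguide, n_outside)

# An index of ~1.5 gives a critical angle of roughly 41.8 degrees, so any
# bounce steeper than that (e.g. 50 degrees) stays guided in the waveguide.
print(round(critical_angle_deg(1.5), 1))  # 41.8
print(undergoes_tir(50.0, 1.5))           # True
```

This also illustrates why the taper matters: each bounce along a tapered waveguide reduces the local angle of incidence until it drops below the critical angle and the light exits toward the viewing area.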
In certain embodiments, although the optical waveguide may have a first surface and a second surface that enable total internal reflection of light entering the waveguide, the light may in fact not enter the waveguide at an internal angle of incidence that would cause total internal reflection. The eyepiece may include a mirrored surface on the first surface of the optical waveguide to reflect the displayed content toward the second surface of the optical waveguide. Thus, the mirrored surface enables total reflection of the light entering the optical waveguide, or reflection of at least a portion of the light entering the optical waveguide. In embodiments, the surface may be 100% mirrored or mirrored to a lesser percentage. In certain embodiments, in place of a mirrored surface, an air gap between the waveguide and the corrective element may cause reflection of light that enters the waveguide at an angle of incidence that would not otherwise cause TIR.
In one embodiment, the eyepiece includes an integrated image source, such as a projector, that introduces content for display to the optical assembly from the side of the optical waveguide adjacent to an arm of the eyepiece. Unlike prior-art optical assemblies, in which image injection occurs from the top side of the optical waveguide, the present invention provides image injection into the waveguide from the side of the waveguide. The aspect ratio of the displayed content is between approximately square and approximately rectangular, with the long axis approximately horizontal. In embodiments, the aspect ratio of the displayed content is 16:9. In embodiments, the rectangular aspect ratio with an approximately horizontal long axis may be achieved via rotation of the injected image. In other embodiments, it may be achieved by stretching the image until it reaches the desired aspect ratio.
Fig. 5 depicts a design of a waveguide eyepiece showing sample dimensions. In this design, for example, the width of the coupling lens 504, to which the optical display 502 is optically coupled, may be 13-15 mm. These elements may be disposed in one arm of the eyepiece, or disposed redundantly in both arms of the eyepiece. Image light from the optical display 502 is projected through the coupling lens 504 into the freeform waveguide 508. The thickness of the composite lens 520, comprising the waveguide 508 and the corrective lens 510, may be 9 mm. In this design, the waveguide 508 enables an exit pupil diameter of 8 mm at an eye clearance of 20 mm. The resulting see-through field of view 512 may be approximately 60-70 mm. The distance of the image light path from the pupil to where the image light enters the waveguide 508 (dimension a) may be approximately 50-60 mm, which can accommodate a large percentage of human head widths. In one embodiment, the field of view may be larger than the pupil. In embodiments, the field of view may not fill the lens. It should be understood that these dimensions are for a particular illustrative embodiment and should not be construed as limiting. In one embodiment, the waveguide, snap-fit optics, and/or corrective lens may comprise optical plastic. In other embodiments, the waveguide, snap-fit optics, and/or corrective lens may comprise glass, marginal glass, bulk glass, metallic glass, palladium-enriched glass, or other suitable glass. In embodiments, the waveguide 508 and the corrective lens 510 may be made of different materials selected to cause little or no chromatic aberration. The materials may include diffraction gratings, holographic gratings, and the like.
In embodiments such as the one shown in Fig. 1, when two projectors 108 are used for the left and right images, the projected images may be stereo images. To enable stereo viewing, the projectors 108 may be placed at an adjustable distance from each other, the distance being adjusted based on the interpupillary distance of the individual wearer. For example, a single optical assembly may include two independent electro-optic modules with various adjustments for horizontal, vertical, and tilt positioning. Alternatively, the optical assembly may include only a single electro-optic module.
Figures 146 through 149 schematically depict an embodiment of an augmented reality (AR) eyepiece 14600 (shown without its temple portions) in which the placement of the images may be adjusted. Figures 146 and 147 show front and rear perspective views, respectively, of the AR eyepiece 14600. In this embodiment, the electronics and portions of the projection system (collectively 14602) are located above the lenses 14604a, 14604b. The AR eyepiece 14600 has two projection screens 14608a, 14608b, which hang adjustably from an adjustment platform 14610 on the wearer's side of the lenses 14604a, 14604b. Mechanisms for adjusting the lateral position of each projection screen 14608a, 14608b with respect to the bridge 14612 of the AR eyepiece 14600, and for adjusting the individual tilt of each projection screen 14608a, 14608b, are mounted on the adjustment platform 14610.
The mechanisms for adjusting the position of one or both of the display screens may be controlled by manually activated (e.g., via a button) or software-activated motors, by manual control devices (such as thumb wheels, lever arms, and the like), or by a combination of motorized and manual devices. The AR eyepiece 14600 employs manual devices, which will now be described. It will be clear to those skilled in the art that the adjustment mechanisms are designed so that the lateral adjustment and the tilt adjustment are decoupled.
Figure 148 shows a perspective rear view of the portion of the AR eyepiece 14600 on the wearer's left side, in which the adjustment mechanism 14614 for the projection screen 14608a on the adjustment platform 14610 is clearly shown. The projection screen 14608a is mounted on a frame 14618, which is fixedly attached to a movable carriage 14620 (or a portion thereof). On its bridge 14612 side, the carriage 14620 is rotatably and slidably supported by a carrier shaft 14622 captured in an arcuate groove of a first block 14624 attached to the adjustment platform 14610. On its temple side, the carriage 14620 is rotatably and slidably supported by a yoke 14628. With reference to Figure 150, the yoke 14628 has a shaft portion 14630 that is fixedly attached to the carriage 14620 and is coaxial with the carrier shaft 14622, so as to provide an axis of rotation for the carriage 14620. The yoke 14628 is slidably and rotatably supported in an arcuate groove of a second support block 14632 (see Figure 151) attached to the adjustment platform 14610.
The yoke 14628 also has two parallel arms 14634a, 14634b extending radially outward from the shaft portion 14630. The free end of each arm 14634a, 14634b has a hole, e.g., the hole 14638 of arm 14634b, for fixedly capturing a shaft 14678 between them (see Figure 149), as described below. Arm 14634a has an anchor portion 14640, where it is attached to the shaft portion 14630 of the yoke 14628. The anchor portion 14640 has a through hole 14642 for slidably capturing a pin 14660 (see Figure 152), as described below.
Referring again to Figure 148, the adjustment mechanism has a first thumb wheel 14644 for controlling the lateral position of the projection screen 14608a and a second thumb wheel 14648 for controlling the tilt of the projection screen 14608a. The first thumb wheel 14644 extends partly through a slot 14650 in the adjustment platform 14610, and is threadedly engaged with and supported by a first threaded shaft 14652. The first threaded shaft 14652 is slidably supported in through holes in third and fourth support blocks 14654, 14658. The third and fourth blocks 14654, 14658 and/or the sides of the slot 14650 serve to prevent lateral movement of the first thumb wheel 14644. Thus, rotating the thumb wheel 14644 about its axis (indicated by arrow A) causes the first threaded shaft 14652 to move laterally (indicated by arrow B). As best seen in Figure 152, the first threaded shaft 14652 has a pin 14660 extending radially outward on its bridge side. (Note that the threads of the first threaded shaft 14652 are not depicted in the figures, but may be single- or multiple-pitch threads.) The pin 14660 is slidably captured in the vertical through hole 14642 of the anchor portion 14640 of the arm 14634a of the yoke 14628. When the first thumb wheel 14644 is turned so as to cause the first threaded shaft 14652 to advance laterally toward the bridge 14612, the pin 14660 pushes against the through hole 14642 toward the bridge 14612, which in turn causes the yoke 14628, the carriage 14620, the frame 14618, and the first projection screen 14608a all to move laterally toward the bridge 14612 (see arrow C). Likewise, turning the first thumb wheel 14644 in the opposite direction causes the first projection screen to move laterally away from the bridge 14612.
The second thumb wheel 14648 is used to control the tilt of the first projection screen 14608a about the axis defined by the carriage shaft 14622 and the yoke shaft portion 14630. Referring now to Figure 153, the second thumb wheel 14648 is fixedly attached to the narrow portion 14662 of a hollow flanged shaft 14664. The flange portion 14668 of the flanged shaft 14664 is adapted to receive the threaded shank portion 14670 of a threaded eye hook 14672. (Note that the threads of the threaded shank portion 14670 are not depicted in the figures, but may be single- or multiple-pitch threads.) In use, the narrow portion 14662 of the flanged shaft 14664 passes rotatably through a countersunk hole 14674 in the adjustment platform 14610 (see Figure 151), so that the thumb wheel 14648 is on the underside of the adjustment platform 14610 and the eye hook 14672 is on the top side, with the flange portion 14668 of the flanged shaft 14664 captured in the countersunk portion of the hole 14674. Referring again to Figure 149, the eye of the eye hook 14672 slidably engages around the shaft 14678, which is captured in the holes of the free ends of the yoke arms 14634a, 14634b. Thus, rotating the second thumb wheel 14648 about its axis (as indicated by arrow D) causes the flanged shaft 14664 to rotate with it, which causes the threaded shank portion 14670 of the eye hook 14672 to move vertically in or out with respect to the flange portion 14668 (as indicated by arrow E), which causes the eye of the eye hook 14672 to push against the shaft 14678, which causes the yoke 14628 to rotate about its axis, thereby causing the first projection screen 14608a to tilt away from or toward the wearer (as indicated by arrow F).
Referring again to Figure 148, note that the electronics and portions of the projection system 14602a are located on a platform 14680 fixed to the top of the carriage 14620. Thus, the spatial relationship between the projection screen 14608a and the electronics and portions of its projection system 14602a remains substantially unchanged by any lateral or tilt adjustment made to the projection screen 14608a.
The AR glasses 14600 also include an adjustment mechanism, similar to the adjustment mechanism 14614 just described, for laterally positioning and tilting the second projection screen 14608b, located on the wearer's right side of the AR eyepiece 14600.
In one embodiment, the eyepiece may include a tilted or curved guide rail for IPD adjustment, the rail holding the optical modules within a curved frame. In certain embodiments, the displays may be adapted to attach to such a tilted or curved rail.
In embodiments, one or more display screens of the AR eyepiece are positioned parallel to the line connecting the user's eyes. In certain embodiments, one or more display screens are rotated about their vertical axes so that the end nearer the nose is rotated inward toward the eyes, i.e., "toed in," at an angle in the range of approximately 0.1 to approximately 5 degrees from the line parallel to the line connecting the user's eyes. In some of these embodiments, the toe-in angle is permanently fixed, while in other embodiments the toe-in angle is user-adjustable. In some of the user-adjustable embodiments, the adjustability is limited to two or more preset positions, for example positions representing near convergence, mid-distance convergence, and far convergence. In other embodiments, the adjustability is continuous. Preferably, in embodiments of the AR glasses that also include automatic vergence correction as disclosed herein, the amount of toe-in is taken into account when the vergence correction is made. In embodiments in which the toe-in is constant, the amount of toe-in can be included directly in the automatic vergence correction without a position sensor, but in user-adjustable embodiments a position sensor is preferably used to communicate the amount of toe-in present to the processor for use in the vergence correction calculation. In embodiments in which the toe-in angle is user-adjustable, the adjustment may be performed manually, for example by using a selectable dial to rotate one or both display screens about their vertical axes, either directly or indirectly, e.g., through a drive train, or may be motorized so that the selected rotation is accomplished when activated by the user through a user interface or control switch.
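The relationship between toe-in and vergence correction can be sketched with simple geometry: for an object straight ahead, each eye converges by atan((IPD/2)/distance), and any fixed or sensed toe-in of the display reduces the correction the renderer still has to apply. The function names and the example values (64 mm IPD, 1 degree toe-in) below are illustrative assumptions, not figures from the text.

```python
import math

def vergence_half_angle_deg(ipd_mm: float, distance_mm: float) -> float:
    """Per-eye convergence angle (degrees) toward an object straight ahead."""
    return math.degrees(math.atan((ipd_mm / 2.0) / distance_mm))

def residual_correction_deg(ipd_mm: float, distance_mm: float,
                            toe_in_deg: float) -> float:
    """Vergence correction left for the renderer after the fixed (or
    position-sensor-reported) toe-in of the display is accounted for."""
    return vergence_half_angle_deg(ipd_mm, distance_mm) - toe_in_deg

# 64 mm IPD, virtual object at 1 m, displays toed in by 1 degree:
print(round(vergence_half_angle_deg(64, 1000), 2))       # 1.83
print(round(residual_correction_deg(64, 1000, 1.0), 2))  # 0.83
```

This is why the user-adjustable case benefits from a position sensor: without knowing `toe_in_deg`, the processor cannot compute the residual correction.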
In some cases, the toe-in adjustment can be used to relax the user's eyes during long active sessions in which the user's eyes remain at a particular focal distance (for example, when reading, or when watching a monitor, a ball game, or the horizon). The toe-in adjustment described above can also be used to adjust for the user's interpupillary distance, by rotating the display screens into better alignment with the user's eyes.
In embodiments, the present invention provides a mechanical interpupillary distance adjustment, such as where the position of the optical assembly of the eyepiece is adjustable by the user within the frame, so that the user has the ability to change the position of the optical assembly with respect to the user's eyes. The position adjustment may control the horizontal position, vertical position, tilt, and the like, of the optical assembly within the eyeglass frame.
In embodiments, the present invention may provide a digital interpupillary distance adjustment, such as where the integrated processor runs a pupil alignment process that enables the user to adjust the placement position within the field of view at which displayed content is presented on the eyepiece optical assembly, so as to establish a pupil alignment calibration factor for use in the placement of other displayed content. The calibration factor may comprise a horizontal and/or vertical adjustment of the displayed content within the field of view. The calibration factor may comprise a plurality of calibration factors, each representing a distance to a real object, where a distance calibration factor is used when positioning content in the field of view based on a calculated distance to a real object. The calibration may comprise a calibration process based on the plurality of calibration factors, each representing a distance to a real object, where the distance calibration factor is used when positioning content in the field of view based on the calculated distance to the real object. The position of an image may be adjusted to move it within the field of view on the display. Moving the two images apart makes the imaged object appear farther away, while moving the images closer together makes the object appear nearer. The difference in position of an object within the field of view for each eye is referred to as parallax. The parallax is related to the perceived distance of the object from the user.
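The parallax relationship above can be made concrete: the angular offset between the two eyes' views of an object at distance d is 2·atan((IPD/2)/d), and for a given display that angle converts to a pixel offset between the left- and right-eye renderings. A minimal sketch; the 64 mm IPD and 30 pixels-per-degree display density are illustrative assumptions, not values from the text.

```python
import math

def parallax_deg(ipd_mm: float, distance_mm: float) -> float:
    """Angular parallax (degrees) between the two eyes' views of an
    object straight ahead at the given distance."""
    return math.degrees(2.0 * math.atan((ipd_mm / 2.0) / distance_mm))

def pixel_disparity(ipd_mm: float, distance_mm: float,
                    pixels_per_degree: float) -> float:
    """Horizontal offset, in display pixels, between the left- and
    right-eye images needed to place a virtual object at that distance."""
    return parallax_deg(ipd_mm, distance_mm) * pixels_per_degree

# Illustrative numbers: 64 mm IPD, 30 px/deg display.
print(round(parallax_deg(64, 2000), 2))          # 1.83 deg at 2 m
print(round(pixel_disparity(64, 500, 30.0), 1))  # ~219.7 px at 0.5 m
```

This also shows why distance-dependent calibration factors help: the pixel offset grows rapidly as the object distance shrinks, so a single calibration measured at one distance would misplace content shown at another.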
Referring now to Figure 173, an exploded view of the glasses is depicted. The electronics 17302 are disposed in the front frame of the glasses, above the brow, and include the CPU, display drivers, camera, radio, processor, user interface, and the like. The optical modules 17308 are attached to the frame, with optional lenses 17304 covering them. The lenses 17304 may be tinted or tintable. A stereo embodiment is shown here, but it should be understood that a single optical module 17308 may also be used. The electronics 17302 are enclosed with a cover 17314, which includes a physical user interface 17310; this interface may be buttons, a touch interface, a rotary dial, switches, or any other physical user interface. The physical user interface 17310 may control various aspects of the glasses, such as functions of the glasses, applications running on the glasses, or control of external devices. The user may conveniently operate this control feature by gripping the bottom of the frame to stabilize it while touching the control feature/UI on the top of the frame. The arms 17312 rest on the ears, and may include a socket for a head strap to secure the glasses, audio/earphone functionality or external audio equipment, a battery 17318 or power supply functionality, and the like. The battery 17318 may be placed in either arm; a battery 17318 in the arm is one option disclosed herein, but the battery 17318 may also be of any available battery type. The head strap may be an ear band made of nitinol or another shape-memory alloy. The ear band may take the form of a strap, or, as in Figure 177, the ear band 17702 may take the form of a bent wire, making it thinner, lighter, and less costly. For aesthetic purposes, the frame may be any color, the lenses may be any color, and the tips of the eyepiece arms, or at least the arms, may be colored. For example, the nitinol forming the tips of the arms may be colored.
Referring now to Figure 174, to enable the battery to power the electronics in the front frame while also permitting an operable hinge 17408, a wiring design is used that employs a minimal number of wires and routes them through the hinge in a wire guide 17404. The wiring design may include wires 17402 running from the front-frame electronics to earphones located on the arms. Figure 175 depicts an enlarged version of Figure 174, with focus on the wires 17402 passing through the wire guide 17404. Figures 176A-C depict sections of the wire guide with various portions of the frame, and an inside profile of the glasses. The view is from the user's side of the frame looking toward the hinge. Figure 176A shows a section through most of the material, Figure 176B shows a section through nearly all of the material, and Figure 176C shows a full rendering of the glasses.
Fig. 6 depicts an embodiment of the eyepiece 600 with a see-through or translucent lens 602. A projected image 618 can be seen on the lens 602. In this embodiment, the image 618 being projected onto the lens 602 happens to be an augmented reality version of the scene the wearer is viewing, in which tagged points of interest (POI) in the field of view are shown to the wearer. The augmented reality version may be enabled by a forward-facing camera (not shown in Fig. 6) embedded in the eyepiece, which images what the wearer is looking at in order to identify the location/POI. In one embodiment, the output of the camera or optical transmitter may be sent to the eyepiece controller or to memory for storage, for transmission to a remote location, or for viewing by the person wearing the eyepiece or glasses. For example, the video output may be streamed to a virtual screen for viewing by the user. The video output may thus be used to help determine the user's position, or it may be sent remotely to others to assist in locating the wearer, or for any other purpose. Other detection technologies, such as GPS, RFID, manual input, and the like, may be used to determine the wearer's position. Using the position or identification data, a database may be accessed by the eyepiece to obtain information to be overlaid, projected, or otherwise displayed together with what is being seen. Augmented reality applications and technologies are discussed further herein.
Fig. 7 depicts an embodiment of the eyepiece 700 with a translucent lens 702, on which streaming media (an e-mail application) and an incoming call notification 704 are being displayed. In this embodiment, the media obscures a portion of the viewing area; however, it should be understood that the displayed image may be positioned anywhere in the field of view. In embodiments, the media may be made more or less transparent.
In one embodiment, the eyepiece may receive input from an external source, such as an external converter box. The source may be depicted in the lens of the eyepiece. In one embodiment, when the external source is a phone, the eyepiece may use the phone's location capabilities to display location-based augmented reality, including marker overlays from marker-based AR applications. In embodiments, a VNC client running on the eyepiece's processor or on an associated device may be used to connect to and control a computer, where the computer's display is seen in the eyepiece by the wearer. In one embodiment, content from any source may be streamed to the eyepiece, such as a display from a panoramic camera mounted on top of a vehicle, a user interface for a device, imagery from a drone or helicopter, and the like. For example, a camera mounted on a gun, with its feed directed to the eyepiece, can allow shooting at targets not in the direct line of sight.
The lenses may be chromic, such as photochromic or electrochromic. An electrochromic lens may include a bulk chromic material or a chromic coating that changes the opacity of at least a portion of the lens in response to a burst of charge applied by the processor to the chromic material. For example, and referring to Fig. 9, a chromic portion 902 of the lens 904 is shown darkened, such as to provide the wearer with improved viewability when that portion is displaying content to the eyepiece wearer. In embodiments, there may be a plurality of chromic areas on the lens that can be controlled independently, such as large portions of the lens, sub-portions of the projected area, programmable areas of the lens and/or of the projected area, areas controlled down to the pixel level, and the like. Activation of the chromic material may be controlled via the control techniques further described herein, or enabled automatically for certain applications (e.g., a streaming video application, a sun-tracking application, an ambient light sensor, a camera tracking the brightness in the field of view), or controlled in response to a UV sensor embedded in the frame. In embodiments, the electrochromic layer may be between optical elements and/or on the surface of an optical element of the eyepiece, such as on a corrective lens, on a ballistic lens, and the like. In an example, the electrochromic layer may be made up of a stack, such as PET/PC films coated with indium tin oxide (ITO) with two electrochromic (EC) layers in between, which allows one PET/PC layer to be removed, thereby reducing reflection (e.g., the layer stack may comprise PET/PC-EC-PET/PC-EC-PET/PC). In embodiments, the electrically controllable optical layer may be provided as a liquid-crystal-based scheme with two binary tint states. In other embodiments, alternating multiple layers of liquid crystal or electronic tint forming the optical layer may be used to provide variable tint, with layers or sections of the optical layer switched on or off in levels. The electrochromic layer may refer generally to any of the electrically controlled transparency technologies in the eyepiece, including SPD, LCE, electrowetting, and the like.
In embodiments, the lens may have an angular-sensitive coating that transmits light waves with low angles of incidence and reflects light with high angles of incidence, such as s-polarized light. The chromic coating may be controlled in portions or in its entirety, such as by the control techniques described herein. The lenses may be variable-contrast, and the contrast may be under the control of a push-button or any other control technique described herein. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The optical assembly may include an electrochromic layer that provides a display characteristic adjustment dependent on the requirements of the displayed content and the surrounding environmental conditions. In embodiments, the display characteristic may be brightness, contrast, and the like. The surrounding environmental condition may be a level of brightness that, without the display characteristic adjustment, would make the displayed content difficult for the eyepiece wearer to view, where the display characteristic adjustment may be applied to the area of the optical assembly where content is being displayed.
In embodiments, the eyepiece may have controls over brightness, contrast, spatial resolution, and the like within the eyepiece's projected display area, so as to improve the user's view of the projected content against changing bright or dark surroundings or changing content. For example, a user may be using the eyepiece under bright daylight conditions, and the display area may need to be changed in brightness and/or contrast in order for the user to see the displayed content clearly. Alternatively, the viewing region surrounding the display area may be altered. In addition, the altered region, whether inside or outside the display area, may be spatially oriented or controlled in accordance with the application being implemented. For instance, only a small portion of the display area may need to be altered, such as when that portion of the display area deviates from some determined or predetermined contrast ratio between the displayed portion of the display area and the surrounding environment. In embodiments, portions of the lens may be altered in brightness, contrast, spatial extent, resolution, and the like, such as fixed to include the entire display area, adjusted to only a portion of the lens, adapted to be dynamic with changes in the lighting conditions of the surrounding environment and/or the brightness-contrast of the displayed content, and so forth. Spatial extent (e.g., the area affected by the alteration) and resolution (e.g., display optical resolution) may vary over different portions of the lens, including high-resolution segments, low-resolution segments, single-pixel segments, and the like, where differing segments may be combined to achieve the viewing objectives of the executing application(s). In embodiments, technologies for implementing alterations of brightness, contrast, spatial extent, resolution, and the like may include electrochromic materials, LCD technologies, beads embedded in the optics, flexible displays, suspended particle device (SPD) technologies, colloid technologies, and the like.
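As a rough illustration of the region-based adjustment described above, the sketch below derives a dimming factor for the electrochromic layer in the display region from an ambient-light reading, so that displayed content stays visible against bright surroundings. The function name, the contrast model, and the 1.5:1 minimum contrast ratio are illustrative assumptions, not the control law of the eyepiece itself.

```python
# Hypothetical sketch: choose an electrochromic dimming factor for the display
# region of a see-through eyepiece from an ambient-light reading. Thresholds
# and the simple contrast model are illustrative assumptions.

def needed_adjustment(ambient_nits, display_nits, min_contrast=1.5):
    """Return a dimming factor (0..1, 1 = fully dark) so that displayed
    content is at least `min_contrast` times brighter than the ambient
    light transmitted through the lens behind it."""
    if ambient_nits <= 0:
        return 0.0
    # require: display / (ambient * transmission) >= min_contrast
    max_transmission = display_nits / (ambient_nits * min_contrast)
    transmission = min(1.0, max_transmission)
    return round(1.0 - transmission, 3)
```

Under this model a dim room needs no dimming at all, while bright sunlight behind a modest display calls for heavy dimming, applied only in the region where content is shown so that the rest of the lens can stay clear.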
In embodiments, there may be various modes of activating the electrochromic layer. For example, the user may enter a sunglasses mode, where the composite lens appears only somewhat darkened, or the user may enter a "blackout" mode, where the composite lens appears fully darkened.
An example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like is electrochromic materials, films, inks, and the like. Electrochromism is the phenomenon displayed by some materials of reversibly changing appearance when an electric charge is applied. Various types of materials and structures can be used to construct electrochromic devices, depending on the specific application. For instance, electrochromic materials include tungsten oxide (WO3), which is the main chemical used in the production of electrochromic windows or smart glass. In embodiments, electrochromic coatings may be used on the lenses of the eyepiece in implementing alterations. In another example, electrochromic displays may be used in implementing "electronic paper," which is designed to mimic the appearance of ordinary paper, where the electronic paper displays reflected light like ordinary paper. In embodiments, electrochromism may be implemented in a wide variety of applications and materials, including Gyricon (consisting of polyethylene spheres embedded in a transparent silicone sheet, with each sphere suspended in a bubble of oil so that it can rotate freely), electrophoretic displays (forming images by rearranging charged pigment particles using an applied electric field), E-Ink technology, electro-wetting, electro-fluidic, interferometric modulators, organic transistors embedded in flexible substrates, nano-chromics displays (NCD), and the like.
Another example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like is a suspended particle device (SPD). When a small voltage is applied to an SPD film, its microscopic particles, which in their stable state are randomly dispersed, become aligned and allow light to pass through. The response may be immediate and uniform, with stable color throughout the film. Adjustment of the voltage may allow user control over the amount of light, glare, and heat passing through. The system's response may range from a dark blue appearance, with full blocking of light, in its off state, to clear in its on state. In embodiments, SPD technology may be an emulsion applied on a plastic substrate to create the active film. The plastic film may be laminated (as a single glass pane), suspended between two sheets of glass, plastic, or other transparent material, and so forth.
With reference to Figs. 8A-C, in certain embodiments, the electro-optics may be mounted in a monocular or binocular flip-up/flip-down arrangement of two parts: 1) the electro-optics; and 2) the corrective lenses. Fig. 8A depicts a two-part eyepiece where the electro-optics are contained in a module 802 that may be electrically connected to the eyepiece 804 via an electrical connector, such as a plug, pin, socket, wiring, or the like. In this arrangement, the lens 818 in the frame 814 may be entirely a corrective lens. The interpupillary distance (IPD) between the two halves of the electro-optic module 802 may be adjusted at the bridge 808 to accommodate various IPDs. Similarly, the placement of the displays 812 may be adjusted via the bridge 808. Fig. 8B depicts a binocular electro-optic module 802 where one half is flipped up and the other half is flipped down. The nose bridge may be fully adjustable and elastomeric. This allows a three-point mount on the nose bridge and ears, with a head strap, to ensure stability of the image in the user's eye, in contrast to the instability of helmet-mounted optics, which shift on the top of the scalp. With reference to Fig. 8C, the lenses 818 may be ANSI-compliant, hard-coated, scratch-resistant polycarbonate ballistic lenses, may be photochromic, may have an angle-sensitive coating, may include UV-sensitive materials, and so forth. In this arrangement, the electro-optic module may include a CMOS-based VIS/NIR/SWIR black silicon sensor for night vision. The electro-optic module 802 may feature quick-disconnect capability for user flexibility, field replacement, and upgrade. The electro-optic module 802 may have an integrated power socket.
In Figure 79, the flip-up/flip-down lens 7910 may include a light block 7908. A removable, elastomeric night adapter/light dam/light block 7908 may be used to shield the flip-up/flip-down lens 7910, such as for nighttime operations. The exploded top view of the eyepiece also depicts the head strap 7900, frame 7904, and adjustable nose bridge 7902. Figure 80 depicts exploded views of the electro-optic assembly in front (A) and side-angle (B) views. A retainer 8012 holds the see-through optics along with the corrective lenses 7910. An O-ring 8020 and screw 8022 secure the retainer to a shaft 8024. A spring 8028 provides a spring-loaded connection between the retainer 8012 and the shaft 8024. The shaft 8024 connects to a steel frame 8014, and this bracket is secured to the eyepiece using thumb screws 8018. The shaft 8024 serves as a hinge and as the IPD-adjustment mechanism using an IPD adjustment knob 8030. As visible in Figure 81, the knob 8030 rotates along an adjustment thread 8134. The shaft 8024 also has two fixed helical grooves 8132.
In embodiments, a photochromic layer may be included as part of the optics of the eyepiece. Photochromism is the reversible transformation of a chemical species between two forms by the absorption of electromagnetic radiation, where the two forms have different absorption spectra, such as modulation of color, darkness, and the like upon exposure to a given frequency of light. In an example, the photochromic layer may be included between the waveguide of the eyepiece and the corrective optics, on the outer surface of the corrective optics, and the like. In embodiments, the photochromic layer may be activated with a UV diode (such as for use as a darkening layer) or other photochromic-response wavelengths as known in the art. In the case where the photochromic layer is activated with UV light, the eyepiece optics may also include a UV coating on the outside of the photochromic layer to prevent UV light from the sun from unintentionally activating it.
Current photochromic devices change from clear to dark quickly, and change from dark to clear slowly. This is due to the molecular changes involved in the transition of the photochromic material from clear to dark. The photochromic molecules vibrate back to clear after the UV light, such as UV light from the sun, is removed. By increasing the vibration of the molecules, such as through exposure to heat, the optics become clear more quickly. The dark-to-clear speed of the photochromic layer may be temperature-dependent. A fast dark-to-clear transition is particularly important in military applications, where the wearer of the sunglasses routinely moves from a bright exterior environment into a dark interior environment and must be able to see quickly within the interior environment.
The present disclosure provides a photochromic layer with an attached heater, where the heater is used to speed the dark-to-clear transition of the photochromic material. The method relies on the relationship between temperature and the speed of the dark-to-clear transition of the photochromic material, where the transition is much faster at higher temperature. To allow the heater to rapidly increase the temperature of the photochromic material, the photochromic material is provided as a thin layer with a thin heater. By keeping the thermal mass per unit area of the photochromic film device low, the heater need only provide a small amount of heat to rapidly produce a large temperature change in the photochromic material. Because the photochromic material needs to be at the higher temperature only during the dark-to-clear transition, the heater only needs to be used for a short time, so the electrical power requirement is low.
The heater may be a thin, transparent heating element, such as an ITO heater or any other transparent and conductive film material. When the user needs the eyepiece to become clear quickly, the user may activate the heating element by any of the control techniques described herein.
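The user-activated heater described above can be sketched as a simple timed pulse: the control technique (button, voice, gesture) requests a fast dark-to-clear transition, and the heater runs only for a short fixed interval so power draw stays low. The class name and the 2-second pulse length are illustrative assumptions.

```python
# Minimal sketch of the user-activated photochromic heater. Any of the
# eyepiece control techniques would call request_fast_clear(); the heater
# switches off on its own after a short pulse. Pulse length is an assumption.

class PhotochromicHeater:
    def __init__(self, pulse_s=2.0):
        self.pulse_s = pulse_s
        self.on_until = 0.0      # time (s) at which the heater switches off

    def request_fast_clear(self, now):
        """Wearer asked for a fast dark-to-clear transition."""
        self.on_until = now + self.pulse_s

    def is_heating(self, now):
        return now < self.on_until
```

Because the thin film heats quickly, the brief pulse is enough to push the material through its dark-to-clear transition before the heater turns off again.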
In one embodiment, the heating element may be used to condition the photochromic device, offsetting cold ambient conditions in which the lens might otherwise darken on its own.
In another embodiment, a thin coating of photochromic material may be placed on a thick substrate, with the heating element laminated on top. For example, a tinted sunglass lens may incorporate the accelerated photochromic scheme while still having, over the display area, a separate electrochromic sheet that is selectively employed with or without UV light control.
Figure 94A depicts a photochromic film device with a serpentine heater pattern, and Figure 94B depicts a side view of the photochromic film device, where the device is a lens for sunglasses. The photochromic film device is shown on top, not contacting the protective cover lens, to reduce the thermal mass of the device.
United States Patent 3,152,215 describes a heater layer combined with a photochromic layer for the purpose of heating the photochromic material to reduce the dark-to-clear transition time. However, the photochromic layer is placed within a wedge, which would greatly increase the thermal mass of the device and thereby reduce the speed at which the heater could change the temperature of the photochromic material, or greatly increase the power required to change the temperature of the photochromic material.
The present disclosure includes the use of a thin carrier layer coated with the photochromic material. The carrier layer can be glass or plastic. The photochromic material can be applied to the carrier layer by vacuum coating, by dipping, or by thermal diffusion, as is known in the art. The thickness of the carrier layer can be 150 microns or less. The carrier-layer thickness is selected based on the darkness required of the photochromic film device in the dark state and the speed required for transitions between the dark state and the clear state. A thicker carrier layer can be darker in the dark state, but heats to an elevated temperature more slowly because it has a larger thermal mass. Conversely, a thinner carrier layer is less dark in the dark state, but heats to an elevated temperature more quickly because it has a smaller thermal mass.
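The thickness trade-off described above can be made concrete with a back-of-envelope calculation: the heat needed to raise the carrier by a given temperature scales linearly with its thickness, so a 150-micron layer requires an order of magnitude less energy (and, at fixed heater power, time) than a millimeter-scale substrate. The material constants below are nominal glass values chosen for illustration only.

```python
# Back-of-envelope sketch: heat (J) needed to warm a photochromic carrier
# layer by delta-T. Heat = mass * specific_heat * delta_T, mass = rho * V.
# Area, delta-T, density, and specific heat are assumed nominal values.

def joules_for_rise(thickness_um, area_cm2=10.0, delta_t_c=30.0,
                    density_g_cm3=2.5, specific_heat_j_gk=0.84):
    thickness_cm = thickness_um * 1e-4
    mass_g = density_g_cm3 * area_cm2 * thickness_cm
    return mass_g * specific_heat_j_gk * delta_t_c

thin = joules_for_rise(150)     # ~150-micron carrier layer
thick = joules_for_rise(1500)   # ten times thicker: ten times the energy
```

Under these assumptions the thin carrier needs only about 9.5 J for a 30 °C rise, which a small heater can deliver in well under a second, consistent with the low-power, short-pulse operation described above.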
The protective layer shown in Figure 94 is separated from the photochromic film device so that the thermal mass of the photochromic film device remains low. In this way, the protective layer can be made thicker to provide higher impact strength. The protective layer can be glass or plastic; for example, the protective layer can be polycarbonate.
The heater can be a transparent conductor formed in a relatively uniform conductive path, so that the heat produced along the length of the formed heater is relatively uniform. One example of a transparent conductor that can be so formed is titania. As shown in Figure 94, larger areas are provided at each end of the heater pattern for electrical contacts.
As noted in the discussion of Figs. 8A-C, the augmented reality glasses may include a lens 818 for each eye of the wearer. The lenses 818 may be made to fit easily into the frame 814, so that each lens can be customized for the person intended to use the glasses. Thus, the lenses may be corrective lenses, and may also be tinted for use as sunglasses, or have other qualities suitable for the intended environment. Accordingly, the lenses may be tinted yellow, dark, or another suitable color, or may be photochromic, so that the transparency of the lens decreases when the lens is exposed to brighter light. In one embodiment, the lenses may also be designed to snap into or onto the frames, i.e., snap-on lenses are one embodiment. For example, the lenses may be made of high-quality Schott optical glass, and may include a polarizing filter.
Of course, the lenses need not be corrective lenses; they may simply serve as sunglasses or as protection for the optical system within the frame. In non-flip-up/flip-down arrangements, it goes without saying that the outer lenses are important in helping to protect the waveguides, viewing system, and electronics within the rather expensive augmented reality glasses. At a minimum, the outer lenses provide protection against abrasion from the user's environment, whether sand, blackberry bushes, brambles, and the like in one environment, or flying debris, bullets, and shrapnel in another environment. In addition, the outer lenses may be decorative, serving to change the look of the composite lens, perhaps appealing to the user's individuality or sense of fashion. The outer lenses may also help individual users distinguish their glasses from those of others, for example when many users are gathered together.
It is desirable that the lenses be impact-resistant, for example against ballistic impact. Accordingly, in one embodiment, the lenses and frame meet ANSI Standard Z87.1-2010 for ballistic resistance. In one embodiment, the lenses also meet ballistic standard CE EN166B. In another embodiment, for military uses, the lenses and frame may meet the standards of MIL-PRF-31013, standard 3.5.1.1 or 4.4.1.1. Each of these standards has slightly different requirements for ballistic resistance, and each is intended to protect the user's eyes from impact by high-speed projectiles or debris. Although no specific material is specified, polycarbonate of a suitable grade is usually sufficient to pass the tests specified in the appropriate standard.
In one embodiment, as shown in Fig. 8D, the lenses snap in from the outside of the frame rather than the inside, for better impact resistance, since any impact is expected from the outside of the augmented reality glasses. In this embodiment, replaceable lens 819 has a plurality of snap-fit arms 819a that fit into recesses 820a of frame 820. The engagement angle 819b of the arms is greater than 90°, and the engagement angle of the recesses is also greater than 90°. Making the angles greater than a right angle has the practical effect of allowing removal of the lens 819 from the frame 820. The lens 819 may need to be removed if a person's vision has changed, or if a different lens is desired for any reason. The snap-fit design is such that there is a slight compression or bearing load between the lens and the frame. That is, the lens may be held firmly within the frame, such as by a slight interference fit of the lens within the frame.
The cantilevered snap fit of Fig. 8D is not the only possible way to removably snap-fit the lenses to the frame. For example, an annular snap fit may be used, in which a continuous sealing lip of the frame engages an enlarged edge of the lens, and the enlarged edge of the lens then snaps into, or possibly over, the lip. Such a snap fit is commonly used to attach a cap to an ink pen. This configuration may have the advantage of a sturdy joint, with less chance for ingress of very small dust and dirt particles. Possible disadvantages include the fairly tight tolerances required around the entire periphery of both the lens and the frame, and the requirement for dimensional integrity in all three dimensions over time.
It is also possible to use an even simpler interface, which may still be considered a snap fit. A groove may be molded into an outer surface of the frame, with the lens having a protruding surface, which may be considered a tongue that fits into the groove. If the groove is semi-cylindrical, such as from about 270° to about 300°, the tongue will snap into the groove and be firmly retained, with removal still possible through the gap remaining in the groove. In this embodiment, shown in Fig. 8E, a lens or replacement lens or cover 826 with a tongue 828 may be inserted into a groove 827 in a frame 825, even though the lens or cover does not snap-fit into the frame. Because the fit is a close one, it will act as a snap fit and firmly retain the lens within the frame.
In another embodiment, the frame may be made in two pieces, such as a lower portion and an upper portion, with a conventional tongue-and-groove fit. In another embodiment, this design may also use standard fasteners to ensure a tight grip of the lens by the frame. The design should not require disassembly of anything on the inside of the frame. Thus, the snap-on or other lens or cover should be assembled onto the frame, or removed from the frame, without having to go inside the frame. As noted in other parts of this disclosure, the augmented reality glasses have many component parts. Some of the assemblies and subassemblies may require careful alignment. Moving and jarring these assemblies may be detrimental to their function, as may moving and jarring the frame and the outer or snap-on lenses or covers.
In embodiments, the flip-up/flip-down arrangement enables a modular design for the eyepiece. For example, not only can the eyepiece be equipped with a monocular or binocular module 802, but the lens 818 can also be replaced. In embodiments, additional features may be included with the module 802, whether the module 802 is associated with one display 812 or two displays 812. Referring to Fig. 8F, either the monocular or the binocular version of the module 802 may be display-only 852 (monocular) or 854 (binocular), or may be equipped with a forward-looking camera 858 (monocular) or 860 and 862 (binocular). In certain embodiments, the module may have additional integrated electronics, such as GPS, a laser range finder, and the like. In embodiment 862, enabling urban-leader tactical response, awareness, and visualization (also referred to as 'Ultra-Vis'), the binocular electro-optic module 862 is equipped with stereo forward-looking cameras 870, GPS, and a laser range finder 868. These capabilities may enable the Ultra-Vis embodiment to provide panoramic night vision with laser range finding and geolocation.
In one embodiment, the electro-optic characteristics may be, but are not limited to, the following:
In one embodiment, the projector characteristics may be as follows:
In another embodiment, the augmented reality display eyepiece may include electrically controlled lenses as part of the micro-projector, or as part of the optics between the micro-projector and the waveguide. Figure 21 depicts an embodiment with such liquid lenses 2152.
The glasses may also include at least one camera or optical sensor 2130 that may furnish one or more images for viewing by the user. The images are formed by the micro-projector 2114 on each side of the glasses for conveyance to the waveguide 2108 on that side. In one embodiment, an additional optical element, a zoom lens 2152, may also be furnished. The lens may be electrically adjusted by the user so that the image seen in the waveguide 2108 is focused for the user. In embodiments, the camera may be a multi-lens camera, such as an 'array camera,' where the eyepiece processor may combine data from multiple viewpoints from the multiple lenses to construct a high-quality image. This technique may be referred to as computational imaging, because software is used to process the image. Computational imaging may provide image-processing advantages, such as allowing the composite image to be processed as a function of the individual lens images. For example, because each lens provides its own image, the processor may apply image processing to create an image with special focusing, such as a foveated image, where the focus of one of the lens images is sharp, high-resolution, and the like, while the remaining images are defocused, low-resolution, and the like. The processor may also select portions of the composite image to be stored in memory while deleting the rest, such as when memory storage is limited and only some portions of the composite image are important and to be saved. In embodiments, the use of an array camera may provide the ability to change the focus of an image after the image has been taken. In addition to the imaging advantages of the array camera, the array camera may provide a thinner mechanical profile than a traditional single-lens assembly, and therefore make it easier to integrate into the eyepiece.
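The foveated composite described above can be sketched as follows: keep one sub-image sharp inside a region of interest and replace everything outside it with a low-resolution version, much as a processor combining array-camera sub-images might. This is pure Python on tiny grayscale "images" (lists of lists); the 2×2 box-average stand-in for the defocused periphery is an illustrative assumption, not the eyepiece's actual pipeline.

```python
# Illustrative sketch of a foveated composite from array-camera sub-images.
# Images are small grayscale 2D lists; real processing would be far richer.

def downsample_blur(img):
    """Crude low-res stand-in: average each 2x2 block, repeat the value."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(0, h - 1, 2):
        for x in range(0, w - 1, 2):
            avg = (img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) // 4
            for dy in (0, 1):
                for dx in (0, 1):
                    out[y+dy][x+dx] = avg
    return out

def foveated(img, box):
    """box = (y0, y1, x0, x1): keep that window sharp, blur the rest."""
    y0, y1, x0, x1 = box
    out = downsample_blur(img)
    for y in range(y0, y1):
        out[y][x0:x1] = img[y][x0:x1]
    return out
```

Dropping resolution everywhere but the fovea is also what makes the selective-storage idea above work: only the sharp window need be kept at full fidelity when memory is limited.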
The variable lenses may include the so-called liquid lenses furnished by Varioptic, S.A. of Lyon, France, or by LensVector, Inc. of Mountain View, California, U.S.A. Such lenses may include a central portion with two immiscible liquids. Typically, in these lenses, the path of light through the lens, i.e., the focal length of the lens, is altered or focused by applying an electric potential between electrodes immersed in the liquids. At least one of the liquids is affected by the resulting electric or magnetic field potential. Thus, electrowetting may occur, as described in U.S. Patent Application Publication 2010/0007807, assigned to LensVector, Inc. Other techniques are described in LensVector Patent Application Publications 2009/021331 and 2009/0316097. All three of these disclosures are incorporated herein by reference, as though each page and figure were set forth verbatim herein.
Other patent documents from Varioptic, S.A. describe other devices and techniques for a variable-focus lens, which may also operate through an electrowetting phenomenon. These documents include U.S. Patents 7,245,440 and 7,894,440 and U.S. Patent Application Publications 2010/0177386 and 2010/0295987, each of which is also incorporated herein by reference, as though each page and figure were set forth verbatim herein. In these documents, the two liquids typically have different indices of refraction and different electrical conductivities, e.g., one liquid is conductive, such as an aqueous liquid, and the other liquid is insulating, such as an oily liquid. Applying an electric potential may change the thickness of the lens, and does change the path of light through the lens, thus changing the focal length of the lens.
The electrically adjustable lenses may be governed by the controls of the glasses. In one embodiment, a focus adjustment is made by calling up a menu from the controls and adjusting the focus of the lens. The lenses may be controlled separately or together. The adjustment is made by physically turning a control knob, by indicating with a gesture, or by voice command. In another embodiment, the augmented reality glasses may also include a rangefinder, and the focus of the electrically adjustable lenses may be controlled automatically by pointing the rangefinder, such as a laser rangefinder, at a target at the desired distance from the user.
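The rangefinder-driven autofocus described above reduces, in the simplest case, to mapping a measured distance to a lens drive voltage. The sketch below interpolates over a calibration table; the table values, function names, and the linear-interpolation choice are hypothetical, not Varioptic or LensVector specifications.

```python
# Hypothetical autofocus mapping: laser-rangefinder distance -> liquid-lens
# drive voltage via a small calibration table. All numbers are assumptions.

# (distance in meters, lens drive voltage in volts) -- hypothetical points
CAL = [(0.25, 60.0), (0.5, 52.0), (1.0, 46.0), (2.0, 42.0), (10.0, 38.0)]

def focus_voltage(distance_m):
    """Linearly interpolate the calibration table; clamp outside its range."""
    if distance_m <= CAL[0][0]:
        return CAL[0][1]
    if distance_m >= CAL[-1][0]:
        return CAL[-1][1]
    for (d0, v0), (d1, v1) in zip(CAL, CAL[1:]):
        if d0 <= distance_m <= d1:
            t = (distance_m - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)
```

A real system would build such a table per lens during calibration, since electrowetting response varies from device to device.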
As shown in the above-mentioned U.S. Patent 7,894,440, the variable lenses may also be applied to the outer lenses of the augmented reality glasses or eyepiece. In one embodiment, the lenses may simply take the place of a corrective lens. The variable lenses with their electrically adjustable controls may be used instead of, or as a supplement to, lenses mounted on the image source or projector. The corrective lens insert provides corrective optics for the user's environment, the outside world, whether or not the waveguide display is active.
It is important to stabilize the images presented to the wearer of the augmented reality glasses or eyepiece, i.e., the images seen in the waveguide. The view or images presented travel from one or two digital cameras or sensors mounted on the eyepiece to digital circuitry, where the images are processed and, if desired, stored as digital data before they appear in the display of the glasses. In any event, and as discussed above, the digital data is then used to form an image, such as by using an LCOS display and a series of RGB light-emitting diodes. The light images are processed using a series of lenses, a polarizing beam splitter, an electrically powered liquid corrective lens, and at least one transition lens from the projector to the waveguide.
The process of gathering and presenting images involves several mechanical and optical linkages between components of the augmented reality glasses. It seems clear, therefore, that some form of stabilization will be required. This may include optical stabilization of the most immediate cause, the camera itself, since it is mounted on a mobile platform, the glasses, which themselves are movably mounted on a mobile user. Accordingly, camera stabilization or correction may be required. In addition, at least some stabilization or correction should be applied to the liquid variable lens. Ideally, a stabilization circuit at that point could correct not only for the liquid lens, but also for any aberration and vibration from many parts of the circuitry upstream of the liquid lens, including the image source. One advantage of the present system is that many commercial off-the-shelf cameras are very advanced and typically have at least one image-stabilization feature or option. Thus, there may be many embodiments of the present disclosure, each with the same or a different method of stabilizing an image or a very fast stream of images, as discussed below. The term optical stabilization is typically used herein in the sense of physically stabilizing the camera, camera platform, or other physical object, while image stabilization refers to data manipulation and processing.
One technique of image stabilization is performed on digital images as they are formed. This technique may use pixels outside the border of the visible frame as a buffer for undesired motion. Alternatively, the technique may use another relatively stable area or basis in successive frames. This technique may be applied to video cameras, shifting the electronic image frame by frame in a manner sufficient to counteract the motion. This technique does not depend on sensors, and stabilizes the images directly by reducing vibration and other distracting motion from the moving camera. In some techniques, the speed of the images may be slowed in order to add the stabilization process to the remainder of the digital process, requiring more time per image. These techniques may use a global motion vector, calculated from frame-to-frame motion differences, to determine the direction of stabilization.
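The sensorless, frame-to-frame stabilization just described can be sketched as follows: estimate a global motion vector by exhaustive block matching (sum of absolute differences) over a small search window, then shift the frame by that vector so it lines up with the previous frame, with border pixels acting as the buffer zone. This runs on tiny pure-Python grayscale frames; the ±2-pixel search range and SAD matching are illustrative choices, not the method of any particular camera.

```python
# Sketch of digital stabilization via a global motion vector: block matching
# over a small search window, then compensating shift with edge replication.

def global_motion(prev, cur, search=2):
    """Return (dy, dx) minimizing SAD of cur matched back onto prev."""
    h, w = len(prev), len(prev[0])
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = 0
            for y in range(search, h - search):
                for x in range(search, w - search):
                    sad += abs(cur[y][x] - prev[y - dy][x - dx])
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

def stabilize(cur, motion):
    """Shift cur back by the estimated motion (border pixels replicated)."""
    dy, dx = motion
    h, w = len(cur), len(cur[0])
    return [[cur[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
             for x in range(w)] for y in range(h)]
```

Real implementations use coarse-to-fine search and sub-pixel estimates, but the structure — estimate one vector per frame, then counter-shift — is the same.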
Optical stabilization of images uses a gravity- or electrically-driven mechanism to move or adjust an optical element or the imaging sensor so that it counteracts ambient vibration. Another way to optically stabilize displayed content is to provide gyroscopic correction or sensing of the platform, e.g., the user, hosting the augmented reality glasses. As noted above, the sensors available and used on the augmented reality glasses or eyepiece include MEMS gyroscopic sensors. These sensors capture movement and motion in three dimensions in very small increments, and can be used as feedback to correct, in real time, the images sent from the camera. It is clear that at least a large part of the undesired and unwelcome movement is probably caused by movement of the user or of the camera itself. These larger movements may include gross movements of the user, e.g., walking, running, or riding a bicycle. Smaller vibrations may originate within the augmented reality glasses, i.e., vibrations in the components in the electrical and mechanical linkages forming the path from the camera (input) to the image in the waveguide (output). These gross movements may be more important to correct or account for than, for instance, the independent and minute movements in the linkages of components downstream from the projector. In embodiments, gyroscopic stabilization may stabilize the image when the image is undergoing periodic motion. For such periodic motion, the gyroscope may determine the periodicity of the user's motion and send the information to a processor to correct the placement of content in the user's view. The gyroscope may use a moving average over two, three, or more cycles of the periodic motion in determining the periodicity. Other sensors may also be used to stabilize the image or to correctly place the image in the user's field of view, such as accelerometers, position sensors, distance sensors, rangefinders, biological sensors, geodetic sensors, optical sensors, video sensors, cameras, infrared sensors, photocell sensors, or RF sensors. When a sensor detects that the user's head or eyes have moved, the sensor provides an output to a processor, and the processor can determine the direction, speed, amount, and rate of the movement of the user's head or eyes. The processor may convert this information into a data structure suitable for further processing by the processor (which may be the same processor) that controls the optical assembly. The data structure may be one or more vectors. For example, the direction of a vector may define the orientation of the movement, and the length of the vector may define the speed of the movement. Using the processed sensor output, the display of the content is adjusted accordingly.
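The vector data structure described above can be sketched as follows: a head-motion sample from the gyro becomes a vector whose direction gives the movement's orientation and whose length gives its speed, and the display processor counter-shifts the rendered content by the corresponding amount for one frame interval. The pixels-per-degree conversion factor is an illustrative assumption.

```python
# Sketch of gyro output -> motion vector -> content-placement correction.
# The 20 px/deg display mapping is an assumed, illustrative value.

import math

def motion_vector(yaw_dps, pitch_dps):
    """Angular rates (deg/s) -> (direction_radians, speed_deg_per_s)."""
    return (math.atan2(pitch_dps, yaw_dps), math.hypot(yaw_dps, pitch_dps))

def content_offset_px(yaw_dps, pitch_dps, dt_s, px_per_deg=20.0):
    """Pixels to counter-shift the displayed content for one frame."""
    return (-yaw_dps * dt_s * px_per_deg, -pitch_dps * dt_s * px_per_deg)
```

For periodic motion such as walking, the same vectors could be averaged over two or three gait cycles, as described above, before the correction is applied.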
Motion sensing can thus be used either to sense motion and correct for it, as in optical stabilization, or to sense motion and then correct the images that are taken and processed, as in image stabilization. An apparatus for sensing motion and correcting images or data is depicted in Fig. 34A. In this apparatus, one or more motion sensors may be used, including accelerometers, angular position sensors, or gyroscopes, such as MEMS gyroscopes. Data from the sensors is fed back to an appropriate sensor interface, such as an analog-to-digital converter (ADC), or another suitable interface, such as a digital signal processor (DSP). A microprocessor then processes this information, as discussed above, and sends image-stabilized frames to the display driver, and then to the see-through display or waveguide discussed above. In one embodiment, the display begins with the RGB display in the micro-projector of the augmented reality eyepiece.
In another embodiment, a video sensor, or augmented reality glasses or another device with a video sensor, may be mounted on a vehicle. In this embodiment, the video stream may be transmitted through a telecommunication capability or an Internet capability to personnel in the vehicle. One application could be sightseeing or touring of an area. Another embodiment could be exploring or reconnaissance of an area, or even patrolling. In these embodiments, gyroscopic stabilization of the image sensor would be helpful, rather than applying gyroscopic correction to the images or to the digital data representing the images. An embodiment of this technique is depicted in Fig. 34B. In this technique, a camera or image sensor 3407 is mounted on a vehicle 3401. One or more motion sensors 3406, such as gyroscopes, are mounted in the camera assembly 3405. A stabilizing platform 3404 receives information from the motion sensors and stabilizes the camera assembly 3405, so that jitter and wobble are minimized while the camera operates. This is true optical stabilization. Alternatively, the motion sensors or gyroscopes may be mounted on or within the stabilizing platform itself. This technique actually provides optical stabilization, stabilizing the camera or image sensor, in contrast to digital stabilization, which corrects the images afterwards by computer processing of the data taken by the camera.
In one technique, the key to optical stabilization is to apply the stabilization or correction before the image sensor converts the image into digital information. In one technique, feedback from sensors such as gyroscopes or angular velocity sensors is encoded and sent to an actuator, which moves the image sensor much as an autofocus mechanism adjusts the focus of a lens. The image sensor is moved in such a way as to maintain the projection of the image onto the image plane, which is a function of the focal length of the lens being used; autoranging and focus information from a rangefinder of the interactive head-mounted eyepiece may possibly be acquired through the lens itself. In another technique, angular velocity sensors, also sometimes called gyroscopic sensors, can be used to detect horizontal and vertical movement, respectively. The detected motion may then be fed back to electromagnets to move a floating lens of the camera. This optical stabilization technique, however, would have to be applied to each lens contemplated, making the result rather expensive.
Stabilization of a liquid lens is discussed in U.S. Patent Application Publication 2010/0295987, assigned to Varioptic of Lyon, France. In theory, control of a liquid lens is relatively simple, since there is only one variable to control: the level of voltage applied to electrodes in contact with the conducting and non-conducting liquids of the lens, for example using the lens housing and cap as the electrodes. Applying a voltage causes a change or tilt in the liquid-liquid interface via the electrowetting effect. This change or tilt adjusts the focus or output of the lens. In its most basic terms, a control scheme with feedback would then apply a voltage and determine the effect of the applied voltage on the result, i.e., on the focus or astigmatism of the image. The voltages may then be applied in various patterns: for example, equal and opposite positive and negative voltages, two positive voltages of different magnitudes, two negative voltages of different magnitudes, and so forth. Such lenses are known as electrically variable optic lenses or electro-optic lenses.
Voltages may be applied to the electrodes in patterns for short periods of time, and a check made on the focus or aberration. The check may be performed, for instance, by an image sensor. In addition, sensors on the camera, or in this case on the lens, may detect motion of the camera or lens. The motion sensors may include accelerometers, gyroscopes, angular-rate sensors, or piezoelectric sensors mounted on the liquid lens, or on a portion of the optical train in very close proximity to the liquid lens. In one embodiment, a table, such as a correction table, is then constructed of the voltages applied and the degree of correction, or of the voltages needed for a given level of movement. More sophistication may also be added by using segmented electrodes on different portions of the liquid, so that four voltages may be applied rather than two. Of course, if four electrodes are used, four voltages may be applied, in many more patterns than with only two electrodes. These patterns may include equal and opposite positive and negative voltages on opposite segments, and so forth. An example is depicted in Figure 34C. Four electrodes 3409 are mounted within a liquid lens housing (not shown). Two electrodes are mounted in or near the non-conducting liquid, and two are mounted in or near the conducting liquid. Each electrode is independent in terms of the voltage that may be applied to it.
Look-up or correction tables may be constructed and placed in the memory of the augmented reality glasses. In use, the accelerometers or other motion sensors will sense the motion of the glasses, i.e., of the camera or lens on the glasses. A motion sensor such as an accelerometer will sense, in particular, small vibration-type motions that interfere with smooth delivery of the image to the waveguide. In one embodiment, the image stabilization techniques described here may be applied to the electrically-controllable liquid lens so that the image from the projector is corrected immediately. This stabilizes the output of the projector, at least partially correcting for vibration and movement of the augmented reality eyepiece, as well as for at least some movement by the user. Manual controls may also be present for adjusting the gain or other parameters of the corrections. Note that this technique may also be used to correct for the near-sightedness or far-sightedness of an individual user, in addition to the focus adjustment provided by the image sensor control and discussed as part of the adjustable-focus projector.
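The look-up or correction table described above can be sketched as follows. The calibration pairs below are invented values for illustration; a real table would be built offline from the measured effect of applied voltages, as the preceding paragraphs describe, and interpolated at run time.

```python
from bisect import bisect_left

# (sensed tilt in degrees, correction voltage in volts) -- illustrative values only
CORRECTION_TABLE = [(0.0, 0.0), (0.5, 1.2), (1.0, 2.5), (2.0, 5.4)]

def correction_voltage(tilt_deg):
    """Linearly interpolate the stored correction table for a sensed tilt."""
    tilts = [t for t, _ in CORRECTION_TABLE]
    if tilt_deg <= tilts[0]:
        return CORRECTION_TABLE[0][1]
    if tilt_deg >= tilts[-1]:
        return CORRECTION_TABLE[-1][1]          # clamp beyond the calibrated range
    i = bisect_left(tilts, tilt_deg)
    t0, v0 = CORRECTION_TABLE[i - 1]
    t1, v1 = CORRECTION_TABLE[i]
    return v0 + (v1 - v0) * (tilt_deg - t0) / (t1 - t0)

volts = correction_voltage(0.75)   # interpolates between the 0.5 and 1.0 degree entries
```

With segmented electrodes, the same idea extends to a table keyed on tilt direction as well as magnitude, returning one voltage per electrode segment.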
Another variable-focus element uses tunable liquid crystal cells to focus an image. These are disclosed, for example, in U.S. Patent Application Publications 2009/0213321, 2009/0316097 and 2010/0007807, which are hereby incorporated by reference in their entirety and relied on. In this method, the liquid crystal material is contained within a transparent cell, preferably with a matching index of refraction. The cell includes transparent electrodes, such as those made from indium tin oxide (ITO). Using one spiral electrode and a second spiral or planar electrode, a spatially non-uniform electric field is applied. Electrodes of other shapes may be used. The shape of the field determines the rotation of the molecules in the liquid crystal cell, achieving a change in refractive index and hence a change in the focus of the lens. The liquid crystals can thus be electromagnetically manipulated to change their index of refraction, making the tunable liquid crystal cell act as a lens.
In a first embodiment, a tunable liquid crystal cell 3420 is depicted in Figure 34D. The cell includes an inner layer 3421 of liquid crystal and thin layers 3423 of an alignment material such as polyimide. This material helps to orient the liquid crystals in a preferred direction. Transparent electrodes 3425 are on each side of the alignment material. The electrodes may be planar, or may be spiral-shaped as shown on the right in Figure 34D. Transparent glass substrates 3427 contain the materials within the cell. The electrodes are formed so that they lend their shape to the electric field. As noted, in one embodiment a spiral-shaped electrode is used on one or both sides, such that the two sides are not symmetric. A second embodiment is depicted in Figure 34E. The tunable liquid crystal cell 3430 includes a central liquid crystal material 3431, transparent glass substrate walls 3433, and transparent electrodes. The bottom electrode 3435 is planar, while the top electrode 3437 is in the shape of a spiral. The transparent electrodes may be made of indium tin oxide (ITO).
Additional electrodes may be used to revert the liquid crystal quickly to its non-shaped or natural state. A small control voltage is thus used to dynamically change the refractive index of the material through which the light passes. The voltage generates a spatially non-uniform electric field of a desired shape, allowing the liquid crystal to act as a lens.
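For a rough sense of scale, a tunable liquid crystal cell acting as a lens is often approximated with the thin gradient-index formula f = r² / (2·d·Δn), where r is the aperture radius, d the cell thickness, and Δn the center-to-edge index difference. This approximation is a standard textbook model, not something stated in the disclosure above, and the numbers below are illustrative.

```python
def lc_lens_focal_length(aperture_radius_mm, cell_thickness_um, delta_n):
    """Thin gradient-index approximation: f = r^2 / (2 * d * delta_n), in metres."""
    r_m = aperture_radius_mm * 1e-3
    d_m = cell_thickness_um * 1e-6
    return r_m ** 2 / (2.0 * d_m * delta_n)

# 2 mm aperture radius, 50 um cell, index change of 0.2 -> focal length of 0.2 m
f = lc_lens_focal_length(2.0, 50.0, 0.2)
```

The formula makes the design trade-off visible: a thin cell with a small achievable Δn yields a long focal length, which is why such cells suit fine focus adjustment rather than strong magnification.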
In one embodiment, the camera is a black silicon, short-wave infrared (SWIR) CMOS sensor, described elsewhere in this patent. In another embodiment, the camera is a 5-megapixel (MP), optically-stabilized video sensor. In one embodiment, the controls include a 3-GHz microprocessor or microcontroller, and may also include a 633-MHz digital signal processor with a 30-million-polygons-per-second graphics accelerator for real-time image processing of images from the camera or video sensor. In one embodiment, the augmented reality glasses may include wireless Internet, radio, or telecommunications capability for broadband, a personal area network (PAN), a local area network (LAN), a wide area network, a WLAN compliant with IEEE 802.11, or reach-back communications. The equipment provided in one embodiment includes Bluetooth capability compliant with IEEE 802.15. In one embodiment, the augmented reality glasses include an encryption system, such as a 256-bit Advanced Encryption Standard (AES) encryption system or another suitable encryption program, for secure communications.
In one embodiment, the wireless telecommunications may include capability for a 3G or 4G network and may also include wireless Internet capability. For an extended life, the augmented reality eyepiece or glasses may also include at least one lithium-ion battery and, as discussed above, a recharging capability. The recharging plug may comprise an AC/DC power converter and may be capable of using multiple input voltages, such as 120 or 240 V. The controls for adjusting the focus of the adjustable focus lenses in one embodiment comprise a 2D or 3D wireless air mouse, or another non-contact control responsive to gestures or movements of the user. A 2D mouse is available from Logitech of Fremont, California, USA. A 3D mouse is described herein, or others may be used, such as the Cideko AVK05 available from Cideko of Taiwan.
In one embodiment, the eyepiece may comprise electronics suitable for controlling the optics and associated systems, including a central processing unit, non-volatile memory, a digital signal processor, a 3-D graphics accelerator, and the like. The eyepiece may provide additional electronic elements or features, including an inertial navigation system, a camera, a microphone, audio output, power, a communication system, sensors, a stopwatch or chronometer function, a thermometer, a vibratory temple motor, a motion sensor, a microphone to enable voice control of the system, and a UV sensor to enable contrast and dimming with photochromic materials.
In one embodiment, the central processing unit (CPU) of the eyepiece may be an OMAP 4 with dual 1-GHz processor cores. The CPU may include a 633-MHz DSP, giving the CPU a capability of 30 million polygons per second.
The system may also provide dual micro-SD (secure digital) slots for provisioning of additional removable non-volatile memory.
An on-board camera may provide 1.3-MP color and record up to 60 minutes of video footage. The recorded video may be transferred wirelessly, or a mini-USB transfer device may be used to offload the footage.
The communications system-on-a-chip (SoC) may be capable of operating with wide local area networks (WLAN), Bluetooth version 3.0, a GPS receiver, an FM radio, and the like.
The eyepiece may operate on a 3.6-VDC lithium-ion rechargeable battery for long battery life and ease of use. An additional power source may be provided through solar cells attached to the exterior of the frame of the system. These solar cells may supply power and may also be capable of charging the lithium-ion battery.
The total power consumption of the eyepiece may be approximately 400 mW, but is variable depending on the features and applications used. For example, processor-intensive applications with significant video graphics demand more power and will be closer to 400 mW. Simpler, less video-intensive applications will use less power. The operating time on a charge may also vary with the applications and features used.
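The quoted figures (a 3.6 V lithium-ion cell and roughly 400 mW total draw) permit a back-of-the-envelope runtime estimate. The cell capacity used below is an assumed value for illustration, since the text does not specify one.

```python
def runtime_hours(capacity_mah, voltage_v, draw_mw):
    """Estimate operating time: stored energy (mWh) divided by average draw (mW)."""
    energy_mwh = capacity_mah * voltage_v
    return energy_mwh / draw_mw

hours = runtime_hours(1000, 3.6, 400)   # a 1000 mAh cell at 400 mW -> about 9 hours
```

The same arithmetic shows why lighter applications extend runtime: halving the draw to 200 mW doubles the estimate, consistent with the variability noted above.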
The micro-projector illumination engine, also referred to herein as the projector, may include multiple light-emitting diodes (LEDs). In order to provide life-like color, Osram red, Cree green, and Cree blue LEDs are used. These are die-based LEDs. The RGB engine may provide an adjustable color output, allowing a user to optimize viewing for various programs and applications.
In embodiments, illumination may be added to the glasses, or controlled, through various means. For example, LED lights or other lights may be embedded in the frame of the eyepiece, such as in the nose bridge, around the composite lens, or at the temples.
The intensity of the illumination and/or the color of the illumination may be modulated. Modulation may be accomplished through the various control technologies described herein, through various applications, and through filtering or magnification.
By way of example, the illumination may be modulated through various control technologies described herein, such as the adjustment of a control knob, a gesture, eye movement, or a voice command. If a user desires to increase the intensity of illumination, the user may adjust a control knob on the glasses, adjust a control knob in the user interface displayed on the lens, or proceed by other means. The user may use eye movements to control the knob displayed on the lens, or may control the knob in other ways. The user may adjust the illumination through a movement of the hand or other body movement, such that the intensity or color of the illumination changes based on the movement made by the user. Also, the user may adjust the illumination through a voice command, such as by speaking a phrase requesting increased or decreased illumination, or requesting that other colors be displayed. Additionally, illumination modulation may be achieved through any control technology described herein, or through other means.
Further, the illumination may be modulated according to the particular application being executed. As an example, an application may automatically adjust the intensity or the color of the illumination based on the optimal settings for that application. If the current levels of illumination are not optimal for the application being executed, a message or command may be sent to provide for illumination adjustment.
In embodiments, illumination modulation may be accomplished through filtering and/or through magnification. For example, filtering techniques may be employed that allow the intensity and/or the color of the light to be changed so that the optimal or desired illumination is achieved. Also, in embodiments, the intensity of the illumination may be modulated by applying greater or lesser magnification to reach the desired illumination intensity.
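A minimal sketch of how the control events described above (voice commands, gestures, knob adjustments) might be dispatched to an illumination level follows. The event names and step sizes are invented for illustration; a real system would map them to whatever control technologies the eyepiece exposes.

```python
# Each control event maps to a brightness delta, clamped to a 0-100 percent range.
BRIGHTNESS_STEPS = {
    "voice:increase": +10,
    "voice:decrease": -10,
    "gesture:swipe_up": +5,
    "gesture:swipe_down": -5,
}

def modulate(brightness, event):
    """Apply one control event to the current brightness and clamp the result."""
    delta = BRIGHTNESS_STEPS.get(event, 0)   # unknown events leave brightness unchanged
    return max(0, min(100, brightness + delta))

level = 50
for event in ["voice:increase", "gesture:swipe_up", "voice:increase"]:
    level = modulate(level, event)
# level is now 75
```

An application-driven adjustment, as described above, would simply issue the same events programmatically instead of in response to user input.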
The projector may be connected to the display to output the video and other display elements to the user. The display used may be an SVGA 800x600 dots/inch SYNDIANT liquid crystal on silicon (LCoS) display.
The target MPE dimensions for the system may be 24 mm x 12 mm x 6 mm.
The focus may be adjustable, allowing a user to refine the projector output to suit their needs.
The optics system may be contained within a housing fabricated of 6061-T6 aluminum and glass-filled ABS/PC.
In one embodiment, the weight of the system is estimated to be 3.75 ounces (95 grams).
In one embodiment, the eyepiece and associated electronics provide night vision capability. This night vision capability may be enabled by a black silicon SWIR sensor. Black silicon is a silicon-based complementary metal-oxide-semiconductor (CMOS) processing technique that enhances the photo response of silicon over 100 times. The spectral range is expanded deep into the short-wave infrared (SWIR) wavelength range. In this technique, a 300-nm-deep absorbing and anti-reflective layer is added to the sensor. This layer offers improved responsivity, as shown in Figure 11, where the responsivity of black silicon is much greater than that of silicon over the visible and NIR ranges and extends well into the SWIR range. This technology is an improvement over current technology, which suffers from extremely high cost, performance issues, and high-volume manufacturability problems. Incorporating this technology into night vision optics brings the economic advantages of CMOS technology into the design.
Unlike current night-vision goggles (NVGs), which amplify starlight or other ambient light in the visible spectrum, a SWIR sensor picks up individual photons and converts light in the SWIR spectrum into an electrical signal, similar to digital photography. The photons may be produced by the natural recombination of oxygen and hydrogen atoms in the night atmosphere, also known as "nightglow." Short-wave infrared devices see objects at night by detecting short-wave infrared radiation that is invisible in reflected starlight, urban lighting, or moonlight. They also work through fog, haze, or smoke, and in daytime, where current NVG image-intensifier infrared sensors would be overwhelmed by heat or brightness. Because short-wave infrared devices pick up invisible radiation at the edge of the visible spectrum, SWIR imagery looks like imagery produced with visible light, with the same shadows, contrast, and facial detail, just in black and white, sharply enhancing identification, so that people look like people rather than the blobs commonly seen with thermal imagers. One important SWIR capability is providing a view of targeting lasers on the battlefield. A targeting laser (1.064 um) is invisible with current night-vision goggles. With SWIR electro-optics, a soldier can view every targeting laser in use, including those used by opposing forces. Unlike thermal imagers, which do not penetrate the windows of vehicles or buildings, visible/near-infrared/short-wave infrared sensors can see through them, day or night, giving an important tactical advantage to the user.
Some advantages include using active illumination only when needed. In some instances there may be sufficient natural illumination at night, such as during a full moon. When this is the case, artificial night vision using active illumination may not be necessary. With black silicon CMOS-based SWIR sensors, active illumination may not be needed during these conditions and is not provided, thus improving battery life.
In addition, a black silicon image sensor may have over eight times the signal-to-noise ratio found in costly InGaAs sensors under night-sky conditions. This technology also provides better resolution, delivering much higher resolution than is available for night vision using current technology. Typically, long-wavelength imagery produced by CMOS-based SWIR has been difficult to interpret, having good heat detection but poor resolution. This problem is solved with a black silicon SWIR sensor, which relies on much shorter wavelengths. SWIR is highly desirable for battlefield night-vision glasses for these reasons. Figure 12 illustrates the effectiveness of black silicon night vision technology, providing before and after images of seeing through a) dust, b) fog, and c) smoke. The images in Figure 12 demonstrate the performance of the new VIS/NIR/SWIR black silicon sensor. In embodiments, the image sensor may be able to detect changes in the physical environment, such as disturbed vegetation, disturbed ground, and the like. For example, an enemy combatant may have recently placed an explosive device under the ground, so the ground above the explosive would be "disturbed ground," and the image sensor (together with processing facilities internal or external to the eyepiece) may be able to distinguish the recently disturbed ground from the surrounding ground. In this way, a soldier may be able to detect the placement of buried explosive devices (e.g., improvised explosive devices (IEDs)) from a distance.
Previous night vision systems suffered from "blooming" around bright light sources, such as streetlights. This "blooming" was particularly severe in image-intensifying technology and is also associated with a loss of resolution. In some cases, cooling systems are necessary in image-intensifying systems, increasing weight and shortening battery life. Figure 17 shows the difference in image quality between A) a flexible platform of uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) an image-intensified night vision system.
Figure 13 depicts the difference in structure between current or incumbent vision enhancement technology 1300 and uncooled CMOS image sensors 1307. The incumbent platform (Figure 13A) limits deployment because of cost, weight, power consumption, spectral range, and reliability issues. Incumbent systems typically comprise a front lens 1301, a photocathode 1302, a micro-channel plate 1303, a high-voltage power supply 1304, a phosphorous screen 1305, and an eyepiece 1306. This contrasts with a flexible platform (Figure 13B) of uncooled CMOS image sensors 1307 capable of VIS/NIR/SWIR imaging at a fraction of the cost, power consumption, and weight. These much simpler sensors comprise a front lens 1308 and an image sensor 1309 with digital image output.
These advantages derive from the CMOS-compatible processing technology, which enhances the photoresponse of silicon over 100 times and extends the spectral range deep into the short-wave infrared region. The difference in responsivity is illustrated in Figure 13C. While typical night-vision goggles are limited to the UV, visible, and near-infrared (NIR) ranges, to about 1100 nm (1.1 microns), the newer CMOS image sensor range also includes the short-wave infrared (SWIR) spectrum, extending out to as much as 2000 nm (2 microns).
The black silicon core technology may offer a significant improvement over current night-vision goggles. Femtosecond laser doping may enhance the light detection properties of silicon across a broad spectrum. Additionally, the optical response may be improved by a factor of 100 to 10,000. Compared with current night-vision systems, black silicon technology is a fast, scalable, and CMOS-compatible technology at a very low cost. Black silicon technology may also provide a low operating bias, typically 3.3 V. In addition, uncooled performance may be possible up to 50 °C. The cooling requirements of current technology increase both weight and power consumption, and also cause discomfort to users. As noted above, the black silicon core technology provides a high-resolution replacement for current image-intensifier technology. Black silicon core technology may provide high-speed electronic shuttering at rates of up to 1000 frames per second with minimal cross-talk. In certain embodiments of the night vision eyepiece, an OLED display may be preferred over other optical displays, such as an LCoS display.
VIS/NIR/SWIR black silicon sensors enable improved situational awareness (SAAS) monitoring and real-time image enhancement.
In certain embodiments, the VIS/NIR/SWIR black silicon sensors may be included in form factors suited only for night vision, such as night-vision goggles or a night-vision helmet. The night-vision goggles may include features that suit them to military markets, such as a ruggedized and alternative power supply, while other form factors may be suited to consumer or toy markets. In one example, the night-vision goggles may have an extended range, such as 500-1200 nm, and may also function as a camera.
In certain embodiments, the VIS/NIR/SWIR black silicon sensors and other sensors herein may be included in a mounted camera that can be mounted on a transport or combat vehicle, such that a real-time feed may be transmitted to the driver or other occupants of the vehicle by superimposing the video on their forward view without obstructing it. The driver can better see where he or she is going, the gunner can better see sudden threats or targets, and the navigator can better maintain situational awareness (SAAS) while also scanning for threats. The feed may also be transmitted off-site as needed, such as to storage/memory locations for later use in targeting, navigation, surveillance, data mining, and the like, or to higher-echelon headquarters.
Further advantages of the eyepiece may include robust connectivity. This connectivity enables download and transmission using Bluetooth, Wi-Fi/Internet, cellular, satellite, 3G, FM/AM, TV, and UWB transceivers for rapid transmission and reception of massive amounts of data. For instance, a UWB transceiver may be used to create very high data rate, low probability of intercept / low probability of detection (LPI/LPD), wireless personal area network (WPAN) connections among weapon sights, weapon-mounted mice/controllers, E/O sensors, medical sensors, audio/visual displays, and the like. In other embodiments, a WPAN may be created, for example, with other communications protocols; the WPAN transceiver may be a COTS-compliant modular front end, so that combat wireless power management is highly responsive without compromising wireless robustness. By integrating an ultra-wideband (UWB) transceiver, baseband/MAC, and encryption chip into one module, a physically small, dynamically configurable transceiver is obtained that addresses multiple operational requirements. The WPAN transceiver creates a low-power, encrypted, wireless personal area network (WPAN) among the equipment worn by a soldier. The WPAN transceiver may be attached to or embedded in almost any battlefield military equipment with a network interface (handheld computers, combat displays, etc.). The system can support many users and AES encryption, offers robustness against jamming and RF interference as is desirable for combat, and provides low probability of intercept and detection (LPI/LPD). The WPAN transceiver eliminates the bulk, weight, and "snag hazard" of data cables on the soldier's person. Interfaces include USB 1.1, USB 2.0 OTG, Ethernet 10/100 Base-T, and RS-232 9-pin D-Sub. The power output may be -10 or -20 dBm for a variable range of up to 2 m. Data capacity may be 768 Mbps or greater. Bandwidth may be 1.7 GHz. Encryption may be 128-, 192-, or 256-bit AES. The WPAN transceiver may include optimized message authentication code (MAC) generation. The WPAN transceiver may be compliant with MIL-STD-461F. The WPAN transceiver may take the form of a connector dust cap and may be attachable to any battlefield military equipment. The WPAN transceiver allows simultaneous video, voice, still photos, text, and chat; eliminates the need for data cables between electronic devices; allows multiple devices to be controlled hands-free and without distraction; features an adjustable connectivity range; has Ethernet and USB 2.0 interfaces; and features an adjustable frequency of 3.1 to 10.6 GHz with 200 mW peak power consumption and nominal standby power.
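As a quick sanity check on the quoted 768 Mbps data capacity, the time a data burst occupies the link is simple to estimate. The sketch below ignores protocol, encryption, and MAC overhead, so real transfers would take somewhat longer.

```python
def transfer_seconds(payload_megabytes, link_mbps=768):
    """Time to move a payload over a link, converting megabytes to megabits."""
    return payload_megabytes * 8 / link_mbps

t = transfer_seconds(96)   # a 96 MB burst occupies the link for 1.0 s at the quoted rate
```

This short occupancy is what makes a high-rate burst link attractive for LPI/LPD operation: the transmitter is on the air only briefly.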
For example, the WPAN transceiver may enable creation of a WPAN among an eyepiece 100 in the form of three-dimensional heads-up combat display glasses employing GSE, a computer, a remote computing asset controller, and a biometric information collection device such as that shown in Figure 58. In another example, the WPAN transceiver may enable creation of a WPAN among a flip-up/flip-down heads-up combat display eyepiece, a HUD CPU (which may be external), a weapon fore-grip controller, and a forearm computer similar to those shown in Figure 58.
The eyepiece may provide its own cellular connectivity, such as through a personal wireless connection with a cellular system. The personal wireless connection may be available only to the wearer of the eyepiece, or it may be available to a plurality of proximate users, such as in a Wi-Fi hotspot (e.g., MiFi), where the eyepiece provides a local hotspot for others to utilize. These proximate users may be other wearers of an eyepiece, or users of some other wireless computing device, such as a mobile communications facility (e.g., a mobile phone). Through this personal wireless connection, the wearer may not need other cellular or Internet wireless connections to connect to wireless services. For instance, without a personal wireless connection integrated into the eyepiece, the wearer may have to find a Wi-Fi connection point or tether to their mobile communications facility in order to establish a wireless connection. In embodiments, the eyepiece may replace the need for a separate mobile communications device, such as a mobile phone, mobile computer, and the like, by integrating these functions and user interfaces into the eyepiece. For instance, the eyepiece may have an integrated Wi-Fi connection or hotspot, a real or virtual keyboard interface, a USB hub, speakers (e.g., to stream music to) or speaker input connections, an integrated camera, an external camera, and the like. In embodiments, an external device in connectivity with the eyepiece may provide a single unit with a personal network connection (e.g., Wi-Fi, cellular connection), a keyboard, a control pad (e.g., a touch pad), and the like.
Communications from the eyepiece may include communications links for specific purposes. For example, an ultra-wideband communications link may be utilized for the momentary transmission and/or reception of large amounts of data in a short period of time. In another example, a near-field communications (NFC) link may be used where the transmission range is deliberately very limited, so that information is delivered only to individuals in very close proximity, such as for tactical reasons, for local directions, for warnings, and the like. For instance, a soldier may securely send/receive information delivered only to nearby people who need to know or need to use that information. In another example, a wireless personal area network (PAN) may be used to connect, for example, weapon sights, weapon-mounted mice/controllers, electro-optic sensors, medical sensors, audio-visual displays, and the like.
The eyepiece may include a MEMS-based inertial navigation system, such as a GPS processor, an accelerometer (e.g., for enabling head control of the system and other functions), a gyroscope, an altimeter, an inclinometer, a speedometer/odometer, a laser rangefinder, and a magnetometer, which also enables image stabilization.
The eyepiece may include integrated headphones, such as articulating earbuds 120, that provide audio output to the user or wearer.
In one embodiment, a forward-facing camera integrated with the eyepiece (see Figure 21) may enable basic augmented reality. In augmented reality, a viewer can image what is being viewed and then layer an augmented, edited, tagged, or analyzed version on top of the basic view. In the alternative, associated data may be displayed with or over the basic image. If two cameras are provided and are mounted at the correct interpupillary distance for the user, stereo video imagery may be created. This capability may be useful for persons requiring vision assistance. Many people suffer from deficiencies in their vision, such as near-sightedness, far-sightedness, and so forth. A camera and a very close virtual screen as described herein provide "video" for such persons, video that is adjustable in focal point, nearer or farther, and fully controllable by the person through voice or other commands. This capability may also be useful for persons suffering from diseases of the eye, such as cataracts, retinitis pigmentosa, and the like. So long as some organic vision capability remains, an augmented reality eyepiece can help a person see more clearly. Embodiments of the eyepiece may feature one or more of magnification, increased brightness, and the ability to map content to areas of the eye that are still healthy. Embodiments of the eyepiece may be used as bifocals or a magnifying glass. The wearer may be able to increase zoom across the field of view or increase zoom within a partial area of the field of view. In an embodiment, an associated camera may capture an image of an object and then present a zoomed picture to the user. A user interface may allow the wearer to point at the area that he wants zoomed, such as with the control techniques described herein, so that the image processing can stay on task, as opposed to merely magnifying everything in the camera's field of view.
In a further embodiment, a rearward-facing camera (not shown) may also be incorporated into the eyepiece. In this embodiment, the rearward-facing camera may enable eye control of the eyepiece, with the user making application or feature selections by directing his or her eyes to a specific item displayed on the eyepiece.
A further embodiment of a device for capturing biometric data about individuals may incorporate a micro-Cassegrain telescoping folded optic camera into the device. The micro-Cassegrain telescoping folded optic camera may be mounted on a handheld device, such as the bio-print device or the bio-phone, and could also be mounted on glasses used as part of a bio-kit to collect biometric data.
A Cassegrain reflector is a combination of a primary concave mirror and a secondary convex mirror. These reflectors are often used in optical telescopes and radio antennas because they deliver good light- (or sound-) gathering capability in a shorter, smaller package.
In a symmetrical Cassegrain, both mirrors are aligned about the optical axis, and the primary mirror usually has a hole in its center, allowing light to reach the eyepiece or a camera chip or light-detection device, such as a CCD chip. An alternate design, often used in radio telescopes, places the final focus in front of the primary reflector. A further alternate design may tilt the mirrors to avoid obstructing the primary or secondary mirror, and may eliminate the need for a hole in the primary or secondary mirror. The micro-Cassegrain telescoping folded optic camera may use any of the above variations, with the final selection determined by the desired size of the optic device.
The classic Cassegrain configuration 3500 uses a parabolic reflector as the primary mirror and a hyperbolic mirror as the secondary mirror. Further embodiments of the micro-Cassegrain telescoping folded optic camera may use a hyperbolic primary mirror and/or a spherical or elliptical secondary mirror. In operation, the classic Cassegrain, with a parabolic primary mirror and a hyperbolic secondary mirror, reflects the light back down through a hole in the primary, as shown in Figure 35. Folding the optical path makes the design more compact and, in a "micro" size, suitable for use with the bio-print sensor and bio-print kit described herein. In a folded optic system, the beam is bent so that the optical path is much longer than the physical length of the system. One common example of folded optics is prismatic binoculars. In a camera lens, the secondary mirror may be mounted on an optically flat, optically clear glass plate that closes the lens tube. This support eliminates the "star-shaped" diffraction effects that are caused by a straight-vaned support spider. This allows for a sealed closed tube and protects the primary mirror, albeit at some loss of light-collecting power.
The Cassegrain design also makes use of the special properties of parabolic and hyperbolic reflectors. A concave parabolic reflector reflects all incoming light rays parallel to its axis of symmetry to a single focal point. A convex hyperbolic reflector has two foci and reflects all light rays directed at one of its foci toward the other focus. The mirrors in this type of lens are designed and positioned to share one focal point, placing the second focus of the hyperbolic mirror at the same point at which the image is observed, usually just outside the eyepiece. The parabolic mirror reflects parallel light rays entering the lens to its focal point, which coincides with the focal point of the hyperbolic mirror. The hyperbolic mirror then reflects those light rays to its other focal point, where the camera records the image.
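As a rough illustration of why this folded geometry yields a long optical path in a short package, the first-order arithmetic of a classical two-mirror Cassegrain can be sketched as below. This is not part of the patent disclosure; the mirror spacing and focal lengths are assumed, illustrative values.

```python
def cassegrain_folded(f_primary, separation, back_focus):
    """First-order geometry of a classical Cassegrain.

    f_primary:  focal length of the concave parabolic primary mirror
    separation: distance from the primary to the convex secondary mirror
    back_focus: distance from the primary vertex to the final image
    All lengths in the same units (e.g. mm); values are illustrative.
    """
    s = f_primary - separation            # converging cone intercepted by the secondary
    s_prime = separation + back_focus     # secondary-to-final-image distance
    magnification = s_prime / s           # secondary mirror magnification
    efl = magnification * f_primary       # system effective focal length
    physical_length = separation + back_focus
    # fold ratio: effective focal length delivered per unit of tube length
    return efl, efl / physical_length

# e.g. a 100 mm primary folded into a 100 mm tube behaves like a ~333 mm lens
efl, fold_ratio = cassegrain_folded(f_primary=100.0, separation=70.0, back_focus=30.0)
```

The fold ratio above (about 3.3x) is what makes a "micro" Cassegrain attractive for a telescoping camera barrel: the optical path exceeds the physical length by the secondary's magnification.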
Figure 36 shows the configuration of the micro-Cassegrain telescoping folded optic camera. The camera may be mounted on augmented reality glasses, a bio-phone, or other biometric collection device. The assembly 3600 has multiple telescoping sections that allow the camera to extend with Cassegrain optics, providing a longer optical path. Threads 3602 allow the camera to be mounted on a device, such as augmented reality glasses or another biometric collection device. While the embodiment depicted in Figure 36 uses threads, other mounting schemes such as a bayonet mount, knobs, or a press-fit may also be used. A first telescoping section 3604 also acts as an external housing for the lens when in the fully retracted position. The camera may also incorporate a motor to drive the extension and retraction of the camera. A second telescoping section 3606 may also be included. Other embodiments may incorporate varying numbers of telescoping sections, depending on the length of optical path required for the selected task or the data to be collected. A third telescoping section 3608 includes the lens and a reflecting mirror. If the camera is designed following the classic Cassegrain design, the reflecting mirror may be the primary reflector. The secondary mirror may be contained in the first telescoping section 3604.
Further embodiments may utilize microscopic mirrors to form the camera while still providing a longer optical path through folded optics, using the same design principles as the Cassegrain design.
Lens 3610 provides the optics for use in conjunction with the folded optics of the Cassegrain design. The lens 3610 may be selected from a variety of types and may vary depending on the application. The threads 3602 permit a variety of cameras to be interchanged depending on the needs of the user.
Eye control of feature and option selection may be controlled and activated by object recognition software loaded on the system processor. Object recognition software may enable augmented reality, combine the recognition output with database queries, combine the recognition output with a computational tool to determine dependencies and likelihoods, and the like.
Three-dimensional viewing is also possible in an additional embodiment that incorporates a 3D projector. Two stacked picoprojectors (not shown) may be used to create the three-dimensional image output.
With reference to Figure 10, a plurality of digital CMOS sensors (each sensor array and projector having redundant microprocessors and DSPs) detect visible light, near-infrared light, and short-wave infrared light, enabling passive day and night operation such as real-time image enhancement 1002, real-time keystone correction 1004, and real-time virtual perspective correction 1008. The eyepiece may utilize digital CMOS image sensors and directional microphones as described herein (such as a microphone array), for example for visual imaging to monitor the visible scene (e.g. in coordination with imaging for biometric identification, gesture control, and 2D/3D projection mapping), IR/UV imaging for scene enhancement (e.g. seeing through haze, smoke, and darkness), audio direction sensing (e.g. the direction of gunfire or an explosion, voice detection), and the like. In embodiments, each of these sensor inputs may be fed to a digital signal processor (DSP) for processing, such as a DSP internal to the eyepiece or interfaced with an external processing facility. The outputs of the DSP processing of each sensor input stream may then be algorithmically combined in ways that generate useful information. For instance, the system may be useful in combining face recognition, real-time voice detection with distortion correction, real-time analysis through links to databases, and the like, especially in conjunction with the GPS location of a soldier, service person, and so forth, such as when monitoring an outlying region of interest, for example a known path or trail, or a high-security area. In an embodiment, an audio direction sensor input to the DSP may be processed to produce one or more visual, audible, or vibratory cues to the user of the eyepiece indicating the direction of the sound. For instance, if the sound of a loud explosion or gunfire is blocked by hearing protection worn to protect a soldier's hearing, or if the blast is so loud that the soldier cannot tell where it came from and their ears may now be ringing so loudly that they cannot hear anything, an audible or vibratory cue to the operator can indicate the direction of the original threat.
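One conventional way a DSP could turn microphone-array input into a direction cue is time-difference-of-arrival (TDOA) estimation between microphone pairs. The sketch below shows the far-field arithmetic only; it is an illustrative assumption, not the processing the patent specifies, and the microphone spacing is a made-up value.

```python
import math

def sound_bearing(delay_s, mic_spacing_m, speed_of_sound=343.0):
    """Estimate the bearing of a sound (e.g. gunfire) from the arrival-time
    difference between two microphones of an array.

    delay_s:       arrival-time difference between the two microphones, seconds
    mic_spacing_m: distance between the microphones, metres
    Returns the angle from broadside in degrees (far-field assumption).
    """
    ratio = speed_of_sound * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# a ~219 microsecond delay across a 15 cm baseline implies ~30 degrees off broadside
bearing = sound_bearing(delay_s=0.5 * 0.15 / 343.0, mic_spacing_m=0.15)
```

The resulting bearing could then drive the visual, audible, or vibratory cue described above.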
The augmented reality eyepiece or glasses may be powered by any stored-energy system, such as battery power, solar power, line power, and the like. Solar energy collectors may be placed on the frame, on a belt clip, and so forth. Battery charging may occur using a wall charger, a car charger, on a belt clip, in an eyeglass case, and so forth. In one embodiment, the eyepiece may be rechargeable and be equipped with a mini-USB connector for recharging. In another embodiment, the eyepiece may be equipped for remote inductive recharging by one or more remote inductive power conversion technologies, such as those provided by Powercast of Ligonier, Pennsylvania, USA, and by Fulton Int'l. Inc. of Ada, Michigan, USA, which also owns another provider, Splashpower, Inc. of Cambridge, UK.
The augmented reality eyepiece also includes a camera and any interface necessary to connect the camera to the circuit. The output of the camera may be stored in memory and may also be displayed on the display available to the wearer of the glasses. A display driver may also be used to control the display. The augmented reality device also includes a power supply, such as a battery as shown, power management circuits, and circuits for recharging the power supply. As described elsewhere, recharging may take place via a hard-wired connection (e.g. a mini-USB connector) or by means of an inductor, a solar panel input, and so forth.
The control system of the eyepiece or glasses may include a control algorithm for conserving power when a power source, such as a battery, indicates low power. This conservation algorithm may include shutting down power to energy-intensive applications, such as lighting, a camera, or sensors that require high levels of energy, for example any sensor requiring a heater. Other conservation steps may include slowing down the power used by a sensor or camera, e.g. slowing the sampling or frame rate, going to a slower sampling or frame rate when power is low, and shutting the sensor or camera down at an even lower level. Thus, there may be at least three operating modes depending on the available power: a normal mode; a conserve-power mode; and an emergency or shutdown mode.
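The three-mode power scheme above lends itself to a simple threshold policy. The sketch below is one possible reading of it; the battery thresholds and sampling rates are assumed, illustrative numbers, not values from the disclosure.

```python
def select_power_mode(battery_fraction):
    """Pick an operating mode from the remaining battery charge (0.0-1.0),
    per the three-mode scheme described above. Thresholds are illustrative."""
    if battery_fraction > 0.30:
        return "normal"
    if battery_fraction > 0.05:
        # conserve-power: slow sensor/camera sampling, dim illumination
        return "conserve_power"
    # emergency/shutdown: power down energy-intensive sensors entirely
    return "shutdown"

# illustrative sensor sampling rates per mode, in Hz
SENSOR_RATE_HZ = {"normal": 60, "conserve_power": 10, "shutdown": 0}
```

A control loop would re-evaluate the mode on each battery reading and reconfigure sampling rates accordingly, e.g. `SENSOR_RATE_HZ[select_power_mode(0.2)]` yields the reduced rate.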
Applications of the present disclosure may be controlled through movements and direct actions of the wearer, such as movement of his or her hand, fingers, feet, head, eyes, and the like, enabled through facilities of the eyepiece (e.g. accelerometers, gyros, cameras, optical sensors, GPS sensors) and/or through facilities worn or mounted on the wearer (e.g. body-mounted sensor control facilities). In this way, the wearer may directly control the eyepiece through movements and/or actions of their body without the use of a traditional hand-held remote control. For instance, the wearer may have a sense device, such as a position-sense device, mounted on one or both hands (such as on at least one finger, on the palm, on the back of the hand, and the like), where the position-sense device provides position data for the hand and provides wireless communication of the position data to the eyepiece as command information. In embodiments, the sense devices of the present disclosure may include a gyroscopic device (e.g. an electronic gyroscope, a MEMS gyroscope, a mechanical gyroscope, a quantum gyroscope, a ring laser gyroscope, a fiber optic gyroscope), an accelerometer, a MEMS accelerometer, a velocity sensor, a force sensor, a pressure sensor, an optical sensor, a proximity sensor, RFID, and the like, in the providing of position information. For example, a wearer may have a position-sense device mounted on their right index finger, where the device is able to sense motion of that finger. In this example, the user may activate the eyepiece either through some switching mechanism on the eyepiece or through some predetermined motion sequence of the finger, such as moving the finger quickly, tapping the finger against a hard surface, and the like. Note that tapping against a hard surface may be interpreted through sensing by accelerometers, force sensors, pressure sensors, and the like. The position-sense device may then transmit motions of the finger as command information, such as moving the finger in the air to move a cursor across the displayed or projected image, moving in a quick motion to indicate a selection, and the like. In embodiments, the position-sense device may send the sensed command information directly to the eyepiece for command processing, or the command-processing circuitry may be co-located with the position-sense device, such as, in this example, as part of an assembly mounted on the finger that includes the sensors of the position-sense device. The command information may be accompanied by a visual indicator. For instance, the cursor may change color when interacting with different content. For example, a visual indication of the command information may be displayed in the glasses so that the user knows where their finger is when controlling the glasses with a peripheral device.
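The "tap against a hard surface" activation gesture above is conventionally detected by thresholding the accelerometer magnitude. The sketch below is an illustrative assumption of how such a detector might look; the threshold and refractory window are made-up values, not parameters from the disclosure.

```python
def detect_taps(accel_magnitudes_g, threshold_g=2.5, refractory=3):
    """Count hard-surface taps in a sampled accelerometer magnitude trace.

    accel_magnitudes_g: sequence of |acceleration| samples in units of g.
    A tap is a sample exceeding threshold_g; a refractory window (in samples)
    prevents one physical impact from being counted twice. Values illustrative.
    """
    taps, skip = 0, 0
    for a in accel_magnitudes_g:
        if skip > 0:
            skip -= 1          # still inside the previous tap's window
        elif a > threshold_g:
            taps += 1          # impact spike detected
            skip = refractory
    return taps
```

A predetermined motion sequence such as a double tap could then be recognized by requiring `detect_taps(window) == 2` within a short time window.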
In embodiments, the wearer may have a plurality of position-sense devices mounted on their body. For instance, and continuing the example above, the wearer may have position-sense devices mounted at a plurality of points on the hand, such as with individual sensors on different fingers, or as a collection of sensors, such as in a glove. In this way, the aggregate sense command information from the collection of sensors at different locations on the hand may be used to provide more complex command information. For example, in the use of the present disclosure in the simulation and play of a simulated game, the user may play the game with a sensor-device glove, where the glove senses the grasp of the user's hand on a ball, bat, racket, and the like. In embodiments, the plurality of position-sense devices may be mounted on different parts of the body, allowing the wearer to transmit complex motions of the body to the eyepiece for use by an application.
In embodiments, the sense device may have a force sensor, pressure sensor, and the like, such as for detecting when the sense device comes into contact with an object. For instance, a sense device may include a pressure sensor at the tip of the wearer's finger. In this case, the wearer may tap, multiple-tap, swipe, touch, and the like, to generate commands to the eyepiece. Force sensors may also be used to indicate degrees of touch, grip, push, and the like, where predetermined or learned thresholds determine different command information. In this way, commands may be delivered as a series of continuous commands that constantly update the command information being used by an application through the eyepiece. In an example, the wearer may be running a simulation, such as a game application, a military application, a commercial application, and the like, where movements and contact with objects, such as through at least one of a plurality of sense devices, are fed to the eyepiece as commands that influence the simulation displayed through the eyepiece. For instance, a sense device may be included in a hand controller, where the hand controller may have force sensors, pressure sensors, an inertial measurement unit, and the like, and where the hand controller may be used to write virtually, to control a cursor associated with the display of the eyepiece, to act as a computer mouse, to provide control commands through physical motion and/or contact, and the like.
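The "predetermined or learned thresholds determine different command information" idea above can be sketched as a simple mapping from measured fingertip force to a command class. The force bands below are illustrative assumptions only.

```python
def force_command(force_newtons):
    """Map a fingertip force-sensor reading to command information using
    predetermined thresholds, as described above. Bands are illustrative."""
    if force_newtons < 0.5:
        return "touch"   # light contact: hover / move cursor
    if force_newtons < 2.0:
        return "grip"    # medium force: select / hold object in simulation
    return "push"        # hard press: actuate / confirm
```

A learned variant would replace the fixed 0.5 N / 2.0 N cut points with per-user thresholds calibrated from training presses.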
In embodiments, the sense device may include an optical sensor or an optical transmitter as a way for movements to be interpreted as commands. For instance, the sense device may include an optical sensor mounted on the hand of the wearer, and the eyepiece housing may include an optical transmitter, such that when the user moves their hand past the optical transmitter on the eyepiece, the motion may be interpreted as a command. A motion detected through the optical sensor may include swiping past at different speeds, repeated motions, combinations of dwelling and movement, and the like. In embodiments, optical sensors and/or transmitters may be located on the eyepiece, mounted on the wearer (e.g. on the hand, foot, in a glove, on a piece of clothing), used in combination between different areas on the wearer and on the eyepiece, and the like.
In one embodiment, a number of sensors useful for monitoring the condition of the wearer, or of someone in proximity to the wearer, are mounted within the augmented reality glasses. Sensors have become much smaller thanks to advances in electronics technology. Signal transducing and signal processing technologies have also made great progress in size reduction and digitization. Accordingly, it is possible to have not merely a temperature sensor in the AR glasses but an entire sensor array. As noted, these sensors may include a temperature sensor, and also sensors to detect: pulse rate; beat-to-beat heart variability; EKG or ECG; respiration rate; core body temperature; heat flow from the body; galvanic skin response or GSR; EMG; EEG; EOG; blood pressure; body fat; hydration level; activity level; oxygen consumption; glucose or blood sugar level; body position; and UV radiation exposure or absorption. In addition, there may also be a retinal sensor, a blood oxygenation sensor (such as an SpO2 sensor), and the like. Such sensors are available from a variety of manufacturers, including Vermed of Bellows Falls, Vermont, USA; VTI of Ventaa, Finland; and ServoFlow of Lexington, Massachusetts, USA.
In some embodiments, it may be more useful to mount sensors on the person, or on equipment of the person, rather than on the glasses themselves. For example, accelerometers, motion sensors, and vibration sensors may be usefully mounted on the person, on clothing of the person, or on equipment worn by the person. These sensors may maintain continuous or periodic contact with the controller of the AR glasses through a Bluetooth radio transmitter or other radio device adhering to the IEEE 802.11 specification. For example, if a physician wishes to monitor motion or shock experienced by a patient during a foot race, the sensors may be more useful if mounted directly on the person's skin, or even on a T-shirt worn by the person, rather than mounted on the glasses. In these cases, a more accurate reading may be obtained by a sensor placed on the person or on the clothing rather than on the glasses. Such sensors need not be as tiny as sensors suitable for mounting on the glasses themselves, and, as seen, may be more useful.
The AR glasses or goggles may also include environmental sensors or sensor arrays. These sensors are mounted on the glasses and sample the atmosphere or air in the vicinity of the wearer. These sensors or sensor arrays may be sensitive to certain substances or to certain concentrations of substances. For example, sensors and arrays are available to measure concentrations of carbon monoxide, oxides of nitrogen ("NOx"), temperature, relative humidity, noise level, volatile organic chemicals (VOCs), ozone, particulates, hydrogen sulfide, barometric pressure, and ultraviolet light and its intensity. Vendors and manufacturers include Sensares of Crolles, France; CairPol of Ales, France; Critical Environmental Technologies of Canada of Delta, British Columbia, Canada; Apollo Electronic Science and Technology Co., Ltd. of Shenzhen, China; and AV Technology Ltd. of Stockport, Cheshire, UK. Many other sensors are well known. Such sensors may also be useful if mounted on the person, or on clothing or equipment of the person. These environmental sensors may include radiation sensors, chemical sensors, poisonous gas sensors, and the like.
In one embodiment, environmental sensors, health-monitoring sensors, or both are mounted on the frames of the augmented reality glasses. In another embodiment, the sensors may be mounted on the person, or on clothing or equipment of the person. For example, a sensor for measuring the electrical activity of the heart of the wearer may be implanted, with suitable accessories for transducing and transmitting a signal indicative of the person's heart activity.
The signal may be transmitted over a very short distance via a Bluetooth radio transmitter or other radio device adhering to the IEEE 802.15.1 specification. Other frequencies or protocols may be used instead. The signal may then be processed by the signal-monitoring and processing equipment of the augmented reality glasses, and recorded and displayed on the virtual screen available to the wearer. In another embodiment, the signal may also be relayed via the AR glasses to a friend or squad leader of the wearer. Thus, the health and well-being of the person may be monitored by the person and by others, and may also be tracked over time.
In another embodiment, the environmental sensors may be mounted on the person or on equipment of the person. For example, radiation or chemical sensors may be more useful if worn on outer clothing or on a belt of the person, rather than mounted directly on the glasses. As noted above, signals from the sensors may be monitored locally by the person through the AR glasses. The sensor readings may also be transmitted elsewhere, either on demand or automatically, perhaps at set intervals, such as every quarter-hour or half-hour. Thus, a history of sensor readings, whether of the person's body readings or of the environment, may be made for tracking or trending purposes.
In an embodiment, an RF/micropower impulse radio (MIR) sensor may be associated with the eyepiece and serve as a short-range medical radar. The sensor may operate in an ultra-wide band. The sensor may include an RF/impulse generator, a receiver, and a signal processor, and may be useful for detecting and measuring cardiac signals by measuring the ion flow in cardiac cells within 3 mm of the skin. The receiver may be a phased-array antenna, enabling the location of a signal within a region of space to be determined. The sensor may be used to detect and identify cardiac signals through obstructions such as walls, water, concrete, dirt, metal, wood, and the like. For example, a user may use the sensor to determine how many people are located inside a concrete structure by how many heart rates are detected. In another embodiment, a detected heart rate may serve as a unique identifier for a person, so that they may be recognized in the future. In an embodiment, the RF/impulse generator may be embedded in one device, such as the eyepiece or some other device, while the receiver is embedded in a different device, such as another eyepiece or device. In this way, a virtual "tripwire" may be created when a heart rate is detected between the transmitter and receiver. In an embodiment, the sensor may be used as an in-field diagnostic or self-diagnosis tool. EKGs may be analyzed and stored for future use as a biometric identifier. The user may receive alerts of sensed heart-rate signals, and of how many heart rates are present, as displayed content in the eyepiece.
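The "how many people are inside" use case above amounts to clustering the distinct heart rates the MIR sensor returns. The sketch below is one illustrative way to do that; the clustering tolerance is an assumed value and the logic is not taken from the disclosure.

```python
def count_people(detected_rates_bpm, tolerance_bpm=3):
    """Estimate how many people are behind an obstruction by grouping the
    heart rates (beats per minute) detected by the MIR sensor into clusters.
    Rates within tolerance_bpm of each other are treated as one person."""
    clusters = []
    for rate in sorted(detected_rates_bpm):
        if not clusters or rate - clusters[-1][-1] > tolerance_bpm:
            clusters.append([rate])   # start a new person/cluster
        else:
            clusters[-1].append(rate)  # same person, slightly jittered reading
    return len(clusters)
```

Real returns would of course be noisier; a fielded system would cluster over time and range as well as rate.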
Figure 29 depicts an embodiment 2900 of an augmented reality eyepiece or glasses with a variety of sensors and communication equipment. One or more environmental or health sensors are connected to a sensor interface, locally or remotely, through a short-range radio circuit and an antenna, as shown. The sensor interface circuit includes all devices for detecting, amplifying, processing, and sending or transmitting the signals detected by the sensors. The remote sensors may include, for example, an implanted heart-rate monitor or other body sensor (not shown). The other sensors may include an accelerometer, an inclinometer, a temperature sensor, a sensor suitable for detecting one or more chemicals or gases, or any of the other health or environmental sensors discussed in this disclosure. The sensor interface is connected to the microprocessor or microcontroller of the augmented reality device, from which point the information gathered may be recorded in memory, such as random access memory (RAM) or permanent memory, read-only memory (ROM), as shown.
In an embodiment, a sense device enables simultaneous electric field sensing through the eyepiece. Electric field (EF) sensing is a proximity-sensing method that allows computers to detect, evaluate, and work with objects in their vicinity. Physical contact with the skin, such as a handshake with another person or some other physical contact with a conductive or non-conductive device or object, may be sensed as a change in an electric field, and may either enable data transfer to or from the eyepiece, or terminate data transfer. For instance, videos captured by the eyepiece may be stored on the eyepiece until the wearer of an eyepiece with an embedded electric field sensing transceiver touches an object and initiates data transfer from the eyepiece to a receiver. The transceiver may include a transmitter, which includes a transmitter circuit that induces electric fields toward the body, and a data-sense circuit, which distinguishes transmitting and receiving modes by detecting both transmitted and received data and outputs control signals corresponding to the two modes to enable two-way communication. An instantaneous private network between two people may be created with a contact such as a handshake. Data may be transferred between the eyepiece of one user and a data receiver or eyepiece of a second user. Additional security measures may be used to enhance this private network, such as facial or audio recognition, detection of eye contact, fingerprint detection, biometric entry, iris or retina tracking, and the like.
In embodiments, there may be an authentication facility associated with access to the functionality of the eyepiece, such as access to displayed or projected content, access to restricted projected content, enabling functionality of the eyepiece itself in whole or in part (e.g. access to eyepiece functions through a login), and the like. Authentication may be provided through recognition of the wearer's voice, iris, retina, fingerprint, and the like, or other biometric identifiers. For example, the eyepiece or an associated controller may have an IR, ultrasonic, or capacitive touch sensor for receiving control input relating to authentication or other eyepiece functions. A capacitive sensor may detect a fingerprint and launch an application or otherwise control some eyepiece function. Since each finger has a different fingerprint, each finger can be used to control different eyepiece functions, quick-launch different applications, or provide various levels of authentication. Capacitive sensing may not work through gloves, but an ultrasonic sensor will, and may be used in the same manner to provide biometric authentication or control. An ultrasonic sensor useful in the eyepiece or associated controller includes Sonavation's SonicSlide(TM) sensor using Sonavation's SonicTouch(TM) technology, which works by acoustically measuring the ridges and valleys of a fingerprint to image the fingerprint with 256 shades of gray, discerning the minute details of the fingerprint. The key imaging component of the SonicSlide(TM) sensor is a ceramic microelectromechanical system (MEMS) piezoelectric transducer array made from a ceramic composite.
The authentication system may provide a database of biometric inputs for a plurality of users, such that access control for use of the eyepiece may be provided based on policies and associated access privileges for each user entered into the database. The eyepiece may provide an authentication process. For instance, the authentication facility may sense when a user has taken off the eyepiece and require re-authentication when the user puts it back on. This better ensures that the eyepiece provides access only to those users who are authorized, and only to those privileges that the wearer is authorized for. In an example, the authentication facility may detect the presence of the user's eye or head as the eyepiece is put on. At a first level of access, the user may only be able to access low-sensitivity items until authentication is complete. During authentication, the authentication facility may identify the user and look up their access privileges. Once these privileges have been determined, the authentication facility may provide the appropriate access to the user. In the case where an unauthorized user is detected, the eyepiece may maintain access to low-sensitivity items, further restrict access, deny access entirely, and the like.
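The policy lookup described above can be sketched as a small function over the biometric database. This is an illustrative reading of the scheme; the user IDs, privilege names, and fallback level are all assumed, not taken from the disclosure.

```python
def access_level(user_id, biometric_db, authenticated):
    """Resolve an eyepiece access level from the per-user policy database.

    biometric_db maps a user ID (matched via biometric input) to a privilege
    level. Unauthenticated wearers, or users absent from the database, fall
    back to low-sensitivity access only. Names are illustrative."""
    if not authenticated or user_id not in biometric_db:
        return "low_sensitivity_only"
    return biometric_db[user_id]

# illustrative database of enrolled users and their privileges
policy_db = {"wearer_a": "full_access", "wearer_b": "restricted"}
```

A doff/don event (the eyepiece being removed and put back on) would reset `authenticated` to `False`, forcing re-authentication before full privileges are restored.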
In an embodiment, a receiver may be associated with an object to enable control of that object via touch by the wearer of the eyepiece, where the touch enables transmission or execution of a command signal in the object. For example, a receiver may be associated with a car door lock. When the wearer of the eyepiece touches the car, the car door may unlock. In another example, a receiver may be embedded in a medicine bottle. When the wearer of the eyepiece touches the medicine bottle, an alarm signal may be initiated. In another example, a receiver may be associated with a wall along a sidewalk. As the wearer of the eyepiece passes the wall or touches the wall, advertising may be launched in the eyepiece or on a video panel of the wall.
In an embodiment, when the wearer of the eyepiece initiates physical contact, a WiFi exchange of information with a receiver may provide an indication that the wearer is connected to an online activity, such as a game, or may provide verification of identity in an online environment. In this embodiment, the person's representation could change color or undergo some other visual indication in response to the contact.
In embodiments, the eyepiece may include a tactile interface as in Figure 14, such as to enable haptic control of the eyepiece, e.g. through a swipe, tap, touch, press, click, roll of a rollerball, and the like. For instance, the tactile interface 1402 may be mounted on the frame of the eyepiece 1400, such as on an arm, on both arms, on the nosepiece, at the top of the frame, at the bottom of the frame, and the like. In embodiments, the tactile interface 1402 may include controls and functions similar to a computer mouse, such as with left-right buttons, a 2D position-control pad as described herein, and the like. For example, the tactile interface may be mounted on the eyepiece near the user's temple and act as a "temple mouse" controller for content projected to the user, and may include a rotary selector and an enter button mounted at the temple. In another example, the tactile interface may be one or more vibratory temple motors capable of vibrating to alert or notify the user, such as of danger to the left, danger to the right, a medical condition, and so forth. The tactile interface may be mounted on a controller separate from the eyepiece, such as a worn controller, a carried controller, a controller worn on the hand, and the like. If there is an accelerometer in the controller, it may sense the user tapping, for example on a keyboard, on their hand (either with the hand wearing the controller, or tapping the hand wearing the controller with the other hand), and the like. The wearer may then touch the tactile interface in a plurality of ways that are interpreted as commands by the eyepiece, such as by tapping once or multiple times on the interface, by brushing a finger across the interface, by pressing and holding, by pressing more than one interface at a time, and the like. In embodiments, tactile interfaces may be attached to the wearer's body (e.g. their hand, arm, leg, torso, neck), to their clothing, as an attachment to their clothing, as a ring 1500, as a bracelet, as a necklace, and the like. For example, the interface may be attached to the body, such as on the back of the wrist, where touching different parts of the interface provides different command information (e.g. touching the front portion, the back portion, the center, holding for a period of time, tapping, swiping, and the like). In embodiments, the user's contact with the tactile interface may be interpreted through force, pressure, motion, and the like. For example, the tactile interface may incorporate resistive touch technology, capacitive touch technology, proportional-pressure touch technology, and the like. In one example, the tactile interface may utilize discrete resistive touch technology where the application of the interface calls for something simple, robust, low-power, and the like. In another example, the tactile interface may utilize capacitive touch technology where the application of the interface calls for greater versatility (e.g. through motion, swipe, multi-touch, and the like). In another example, the tactile interface may utilize pressure touch technology, such as when variable pressure commands are called for. In embodiments, any of these or similar touch technologies may be utilized in any tactile interface described herein.
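The "tap once or multiple times, brush a finger, press and hold" command vocabulary above is naturally expressed as a lookup from a recognized event sequence to a command. The sketch below assumes event recognition has already happened upstream; the gesture-to-command mapping is illustrative, not from the disclosure.

```python
# illustrative mapping from touch-event sequences on the temple-mounted
# tactile interface to eyepiece commands
GESTURE_COMMANDS = {
    ("tap",):          "select",
    ("tap", "tap"):    "open_menu",
    ("swipe",):        "scroll",
    ("press_hold",):   "back",
}

def interpret_touch(events):
    """Translate a recognized sequence of touch events into an eyepiece
    command; unrecognized sequences are ignored rather than misfired."""
    return GESTURE_COMMANDS.get(tuple(events), "ignored")
```

Per-location variants (front, back, center of a wrist-mounted interface) would simply add the touch location to the lookup key.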
In an embodiment, a hand-held accessory may be utilized to control a virtual keyboard and provide input to the eyepiece. For example, if the handheld device has a touch screen, the user may interact with the touch screen, which either presents an on-screen keyboard or is adapted to allow the user to interact with the device such that it provides coordinated input to the glasses via a virtual keyboard. For instance, the virtual keyboard may be presented in the glasses, but rather than it being an in-air option, the user may adapt the touch-screen device to accept input corresponding to the virtual keyboard. The device may track the finger as it slides over a capacitive module, with a click of the device providing the sensation of a keystroke. The device may have a touch surface on top and one or more action buttons on the back or front, allowing the user to click to select without lifting their finger from the touch surface. The letter the user has selected may be highlighted. The user may still slide their finger for text entry, lift their finger to end a word, insert a space, double-tap to insert a period, and so on. Figure 159 depicts a virtual keyboard 15902 presented in the user's field of view. Two keys on the keyboard are highlighted, 'D' and 'Enter'. In this figure, a touch-screen accessory device 15904 is used to provide the input to the keyboard, which is then communicated to the glasses as input. Thus, a visual indicator is provided for input or control commands made using a virtual interface or an actual touch screen on an external device.
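The coordination described above requires mapping a finger position on the handheld touch surface to the virtual key the glasses should highlight. The sketch below uses an assumed three-row QWERTY layout and normalized coordinates; both are illustrative choices, not details from the disclosure.

```python
# illustrative three-row layout for the virtual keyboard
KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x, y):
    """Map a normalized touch-surface coordinate (x, y in 0..1, origin at the
    top-left) to the virtual key the eyepiece should highlight."""
    row = min(int(y * len(KEY_ROWS)), len(KEY_ROWS) - 1)
    keys = KEY_ROWS[row]
    col = min(int(x * len(keys)), len(keys) - 1)
    return keys[col]
```

As the finger slides, the glasses would repeatedly highlight `key_at(x, y)`; a click of the device's action button would then commit the highlighted letter.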
In an embodiment, the eyepiece may comprise a tactile communication interface that uses a magnetic field to transmit and/or receive commands, telemetry, information, etc. between the eyepiece and an external facility, or to transmit to and receive from the user directly. For example, the user may have a patterned magnetic material applied directly to a location on their body (e.g., the skin, a fingernail, elsewhere on the body, etc.), where the patterned magnetic material responds physically (e.g., with vibration, force, motion, etc.) to an oscillating magnetic field generated by the tactile communication interface. The oscillating magnetic field may convey information through modulation of the field, such as through the amplitude of the signal, time-related differences in the signal, the frequency of the signal, etc. The information conveyed may be an alert, an incoming-call indication, for entertainment, for communication, an indication associated with an eyepiece application, an indication of the proximity of the user to the eyepiece, tactile feedback from the eyepiece to the user, etc. Different commands may cause different stimulation effects in the patterned magnetic material, for different commands or indicators. For example, different stimulation effects may be realized through different frequencies and/or sequence patterns for incoming calls from different people in the user's contact list, different intensities for different alert levels, interesting patterns for entertainment purposes, etc.
The tactile communication interface may include a coil that transmits and/or receives the oscillating magnetic-flux signal. The magnetic material may be a ferromagnetic material, a paramagnetic material, etc., and may be applied as a powder, ink, tattoo, decal, tape, transfer paper, spray, etc. In embodiments, the magnetic material may have capabilities such as demagnetizing when the eyepiece is not in use, remaining unmagnetized when the magnetic material is not in the eyepiece's magnetic field, etc. The magnetic material may be applied in a spatial pattern that serves a function, e.g., to modulate in response to a specific communication signal, to have a specific impedance, to respond to a specific frequency, etc. The applied magnetic material may be a visible image, an invisible image, a tattoo, a mark, a label, a symbol, etc. The applied magnetic material may comprise a pattern that uses the incoming magnetic signal to generate a transmission signal back to the eyepiece's tactile communication interface (e.g., carrying an identifier of the user), as a signal indicating the proximity between the eyepiece and the magnetic material, etc. For example, the identifier may be a user ID that is compared against an ID stored on the eyepiece to confirm that the user is an authorized user of the eyepiece. In another example, the magnetic material may only be able to generate the return transmission signal when the magnetic material is close to the eyepiece. For example, the user may apply the magnetic material to a fingernail, and the user may provide a command indication to the eyepiece by bringing their finger close to the user's haptic interface.
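One way to make the modulation scheme concrete is to represent each alert as a (frequency, intensity, pulse-pattern) triple driving the coil. Every numeric value and name below is an assumed encoding for illustration only; the patent does not specify any particular mapping.

```python
# Illustrative sketch of conveying different alerts through modulation of an
# oscillating magnetic field, per the amplitude/frequency/pattern scheme above.

from dataclasses import dataclass

@dataclass(frozen=True)
class FieldPattern:
    frequency_hz: float   # oscillation frequency of the field
    intensity: float      # relative drive amplitude, 0.0 - 1.0
    pulses: tuple         # on/off sequence pattern

# Assumed mapping: per-contact call patterns and per-level alert intensities.
CONTACT_PATTERNS = {
    "alice": FieldPattern(180.0, 0.5, (1, 0, 1, 0)),
    "bob":   FieldPattern(240.0, 0.5, (1, 1, 0, 0)),
}
ALERT_INTENSITY = {"low": 0.3, "medium": 0.6, "high": 1.0}

def incoming_call_pattern(contact: str) -> FieldPattern:
    """Select the field modulation announcing a call from a known contact."""
    return CONTACT_PATTERNS.get(contact, FieldPattern(200.0, 0.5, (1, 0)))

def alert_pattern(level: str) -> FieldPattern:
    """Scale the drive intensity with the alert level."""
    return FieldPattern(100.0, ALERT_INTENSITY[level], (1, 1, 1, 0))
```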
In another example, the wearer may have an interface mounted in a ring, a handpiece, etc. as shown in Figure 15, where the interface may have at least one of a plurality of command interface types wirelessly connected to the eyepiece, such as a haptic interface, a position sensor device, etc. In one embodiment, the ring 1500 may mirror computer-mouse controls, such as buttons 1504 (e.g., functioning as a one-button, multi-button, or mouse-like arrangement), a 2D position control 1502, a scroll wheel, etc. The buttons 1504 and the 2D position control 1502 may be arranged as shown in Figure 15, where the buttons are on the side facing the thumb and the 2D position controller is on top. Alternatively, the buttons and 2D position control may adopt other configurations, such as all facing the thumb side, all on the top surface, or any other combination. The 2D position control 1502 may be a 2D button position controller (such as the TrackPoint pointing device embedded in the keyboards of some laptop computers to control the position of the mouse), a joystick, an optical tracking pad, an opto-wheel, a touch screen, a touch pad, a track pad, a scrolling track pad, a trackball, any other position or pointing controller, etc. In embodiments, control signals from the haptic interface (such as the ring haptic interface 1500) may be provided to the eyepiece over a wired or wireless interface, where the user can conveniently supply control inputs with their hand, thumb, finger, etc. In embodiments, the ring may be able to expand to fit any finger, or to contract to sit closer to the hand. For example, the ring may have an adjustable band or a spring-loaded hinge. For example, the user may be able to articulate controls with their thumb, where the ring is worn on the user's index finger. In embodiments, a method or system may provide an interactive head-worn eyepiece that the user wears, where the eyepiece comprises an optical assembly through which the user views the surrounding environment and displayed content, a processor for handling the content to be displayed to the user, an integrated projector facility for projecting the content to the optical assembly, and a control device worn on the user's body (e.g., the user's hand), the control device comprising at least one control component actuated by the user, and providing control commands derived from the actuation of the at least one control component to the processor as command instructions. The command instructions may be directed to the manipulation of the content displayed to the user. The control device may be worn on a first finger of the user's hand, and the at least one control component may be actuated by a second finger of the user's hand. The first finger may be the index finger, the second finger may be the thumb, and the first and second fingers may be on the same hand of the user. The control device may have at least one control component mounted on the side of the index finger facing the thumb. The at least one control component may be a button. The at least one control component may be a 2D position controller. The control device may have at least one button-actuated control component mounted on the side of the index finger facing the thumb, and a 2D-position-controller-actuated control component mounted on the top side of the index finger. The control components may be mounted on at least two fingers of the user's hand. The control device may be worn as a glove on the user's hand. The control device may be worn on the user's wrist. The at least one control component may be worn on at least one finger of the hand, and a transmission facility may be worn separately on the hand. The transmission facility may be worn on the wrist. The transmission facility may be worn on the back of the hand. The control component may be at least one of a plurality of buttons. The at least one button may provide a function substantially similar to a conventional computer mouse button. Two of the plurality of buttons may function substantially similarly to the primary buttons of a conventional two-button computer mouse. The control component may be a scroll wheel. The control component may be a 2D position control component. The 2D position control component may be a button position controller, a TrackPoint, a joystick, an optical tracking pad, an opto-wheel, a touch pad, a track pad, a scrolling track pad, a trackball, a capacitive touch screen, etc. The 2D position control component may be controlled with the user's thumb. The control component may be a touch screen capable of touch controls including button-like functions and 2D manipulation functions. The control component may be actuated while the processor projects pointing-and-control content to the user. The ring controller may be powered by an on-board battery that is disposable, rechargeable, solar, etc.
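The ring's mouse-like behavior can be sketched as a small event loop that turns 2D-position-control deltas and thumb-side button presses into cursor updates on the eyepiece display. The event schema, display resolution, and gain factor are assumptions for illustration.

```python
# Minimal sketch of turning ring-controller events into cursor updates for
# the eyepiece display. Event names, display size, and gain are assumptions.

DISPLAY_W, DISPLAY_H = 1280, 720

class CursorState:
    def __init__(self):
        self.x, self.y = DISPLAY_W // 2, DISPLAY_H // 2
        self.clicked = False

    def apply(self, event: dict, gain: float = 4.0):
        """Apply one event from the ring's 2D position control or buttons."""
        if event["type"] == "move":          # 2D position control deltas
            self.x = max(0, min(DISPLAY_W - 1, self.x + int(event["dx"] * gain)))
            self.y = max(0, min(DISPLAY_H - 1, self.y + int(event["dy"] * gain)))
        elif event["type"] == "button":      # thumb-side button press
            self.clicked = event["pressed"]
```

Clamping to the display bounds keeps the cursor within the projected content area regardless of how far the thumb drives the position control.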
In embodiments, the wearer may have an interface mounted in a ring 1500AA that includes a camera 1502AA, as shown in Figure 15AA. In embodiments, the ring controller 1500AA may have control interface types as described herein, such as buttons 1504, 2D position control 1502, 3D position control (e.g., utilizing an accelerometer, gyroscope), etc. The ring controller 1500AA may then be used to control functions in the eyepiece, such as manipulating the displayed content projected to the wearer. In embodiments, the control interfaces 1502, 1504 may provide control aspects for the embedded camera 1502AA, such as on/off, zoom, pan, focus, still-photo capture, video recording, etc. Alternatively, all functions may be controlled through other control aspects of the eyepiece, such as through voice control, other tactile control interfaces, eye-gaze detection, etc. as described herein. The camera may also enable automatic control functions, such as auto-focus, timer functions, face detection and/or tracking, auto-zoom, etc. For example, a ring controller 1500AA with integrated camera 1502AA may be used to view the wearer 1508AA during a video conference initiated through the eyepiece, where the wearer extends the ring controller (e.g., mounted on their finger) so as to allow the camera 1502AA to obtain a view of their face for transmission to at least one other participant in the video conference. Alternatively, the wearer may take the ring controller 1500AA off and set it down on a surface 1510AA (e.g., a table top) so that the camera 1502AA can see the wearer. The wearer's image 1512AA may then be displayed in a display area 1518AA of the eyepiece, and transmitted to the other parties in the video conference, e.g., alongside the images 1514AA of the other conference-call participants. In embodiments, the camera 1502AA may provide manual or automatic FOV (field of view) 1504AA adjustment. For example, the wearer may set the ring controller 1500AA down on the surface 1510AA during a conference call, and the FOV 1504AA may be controlled manually (e.g., through the button controls 1502, 1504, voice, other haptic interfaces) or automatically (e.g., through face recognition) so that the camera's FOV 1504AA points at the wearer's face. The FOV 1504AA may move and change as the wearer moves, e.g., through tracking via face recognition. The FOV 1504AA may also zoom in/out to accommodate changes in the position of the wearer's face. In embodiments, the camera 1502AA may be used for a variety of still and/or video applications, where the camera's view is provided to the wearer in the display area 1518AA of the eyepiece, memory may be present in the eyepiece for storing images/video, and images/video may be transferred off the eyepiece to some external storage facility, user, web application, etc. In embodiments, the camera may be incorporated into a variety of different mobile devices, e.g., worn on the arm, hand, wrist, finger, etc., such as a watch 3202 with an embedded camera 3200 as shown in Figures 32 to 33. As with the ring controller 1502AA, any of these mobile devices may include the manual and/or automatic functions described for the ring controller 1502AA. In embodiments, the ring controller 1502AA may have additional sensors, embedded functions, control features, etc., such as a fingerprint scanner, haptic feedback, an LCD screen, an accelerometer, Bluetooth, etc. For example, the ring controller may provide synchronized monitoring between the eyepiece and other control components, as described herein.
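The automatic face-following FOV adjustment can be sketched as computing pan/tilt/zoom corrections from a detected face bounding box. The face detector itself, the frame dimensions, and the target face size are stand-in assumptions; only the geometry is shown.

```python
# Illustrative sketch of automatic FOV adjustment: keep a detected face
# centered in the ring camera's frame by nudging pan/tilt, and zoom so the
# face fills a target fraction of the view.

FRAME_W, FRAME_H = 640, 480
TARGET_FACE_FRACTION = 0.25   # assumed: face width ~ 1/4 of frame width

def fov_adjustment(face_box):
    """face_box = (x, y, w, h) in pixels; return (pan, tilt, zoom) corrections."""
    x, y, w, h = face_box
    cx, cy = x + w / 2, y + h / 2
    pan = (cx - FRAME_W / 2) / FRAME_W     # > 0: pan right
    tilt = (cy - FRAME_H / 2) / FRAME_H    # > 0: tilt down
    zoom = TARGET_FACE_FRACTION / (w / FRAME_W)  # > 1: zoom in
    return pan, tilt, zoom
```

Run per frame against the face tracker's output, this would keep the wearer's face centered and appropriately sized as they move during the call.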
In embodiments, the eyepiece may provide a system and method for supplying the wearer's image to video-conference participants through the use of an external mirror, where the wearer sees themselves in the mirror and their image is captured by the eyepiece's integrated camera. The captured image may be used directly, or the image may be flipped to correct the mirror's image reversal. In an example, the wearer may join a video conference with multiple other people, where the wearer can watch real-time video images of the others through the eyepiece. By using an ordinary mirror together with the camera integrated in the glasses, the user can see themselves in the mirror so that their image is captured by the integrated camera and provided to the others in the video conference. This image, like the images of the other parties involved in the video conference, may also be available to the wearer as a projected image in the eyepiece.
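Correcting the mirror's reversal is a horizontal flip of each frame before transmission. The toy row-major pixel grid below stands in for a real camera buffer; only the flip itself is shown.

```python
# Simple sketch of correcting the left-right reversal of a mirror-captured
# frame before sending it to video-conference participants.

def unmirror(frame):
    """Flip each row horizontally to undo the mirror's left-right reversal."""
    return [list(reversed(row)) for row in frame]
```

Applying the flip twice returns the original frame, which matches the text's point that the mirrored image may also be used directly when correction is not wanted.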
In embodiments, a control component for detecting motion across a surface may also be provided, where a surface-sensing component is included in the control device. The surface-sensing component may be disposed on the palmar side of the user's hand. The surface may be at least one of a hard surface, a soft surface, the user's skin surface, the user's garment surface, etc. Control commands may be transmitted wirelessly, through a wired connection, etc. The control device may control a pointing function associated with the displayed processor content. The pointing function may be control of a cursor position; selection of displayed content; moving selected displayed content; control of zoom, pan, field of view, size, position, etc. of the displayed content. The control device may control a pointing function associated with the viewed surrounding environment. The pointing function may be placing a cursor on a viewed object in the surrounding environment. The location of the viewed object may be determined by the processor in conjunction with the eyepiece's integrated camera. The identification of the viewed object may be determined by the processor in conjunction with the eyepiece's integrated camera. The control device may control a function of the eyepiece. The function may be associated with the displayed content. The function may be a mode control of the eyepiece. The control device may be foldable for easy storage when the user is not wearing it. In embodiments, the control device may be used with external equipment, e.g., to control the external equipment jointly with the eyepiece. The external equipment may be entertainment equipment, audio equipment, a portable electronic device, navigation equipment, a weapon, an automation controller, etc.
In embodiments, a body-worn control device (e.g., worn on a finger, strapped to the palm of the hand, on an arm, on a leg, on the torso, etc.) may provide 3D position-sensor information to the eyepiece. For example, the control device may serve as an 'air mouse', where a 3D position sensor (such as an accelerometer, gyroscope, etc.) provides position information upon a user command (such as through a button click, a voice command, a visually detected gesture, etc.). With this capability, the user may be able to navigate 2D or 3D images projected to them through the eyepiece projection system. Further, the eyepiece may provide external relay of the image, for display or projection to others, such as when giving a presentation. The user may be able to switch the control device's mode between 2D and 3D to suit different functions, applications, user interfaces, etc. In embodiments, multiple 3D control devices may be used for certain applications, such as in simulation applications.
In embodiments, a system may comprise: an interactive head-worn eyepiece that the user wears, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and where the optical assembly comprises a corrective element that corrects the user's view of the surrounding environment; an integrated processor for handling the content to be displayed to the user; an integrated image source for introducing the content to the optical assembly; and a tactile control interface mounted on the eyepiece that accepts control inputs from the user through at least one of the user touching the interface and the user being positioned near the interface.
In embodiments, control of the eyepiece, and especially control of a cursor associated with the displayed content, may be enabled through hand control, such as with the wearable device 1500 shown in Figure 15, a virtual computer mouse 1500A as shown in Figure 15A, etc. For instance, the wearable device 1500 may transmit commands through a physical interface (e.g., buttons 1502, scroll wheel 1504), while the virtual computer mouse 1500A may be able to interpret commands by detecting the movement and actions of the user's thumb, fist, hand, etc. In computing, a physical mouse is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. A physical mouse traditionally consists of an object held under one of the user's hands, with one or more buttons. It sometimes features other elements, such as 'wheels' that allow the user to perform various system-dependent operations, or extra buttons or features that can add more control or dimensional input. The motion of the mouse translates into the motion of a cursor on the display, which allows fine control of a graphical user interface. In the case of the eyepiece, the user may be able to utilize a physical mouse, a virtual mouse, or a combination of both. In embodiments, a virtual mouse may involve one or more sensors attached to the user's hand, such as on the thumb 1502A, finger 1504A, palm 1508A, wrist 1510A, etc., where the eyepiece receives signals from the sensors and translates the received signals into the motion of a cursor on the eyepiece display for the user. In embodiments, the signals may be received through an external interface such as the haptic interface 1402, through a receiver on the interior of the eyepiece, at an accessory communications interface, at an associated physical mouse or worn interface, etc. The virtual mouse may also include actuators or other output-type elements attached to the user's hand, such as for providing tactile feedback to the user through vibration, force, pressure, electrical impulse, temperature, etc. Sensors and actuators may be attached to the user's hand by way of a wrap, ring, pad, glove, etc. As such, the eyepiece virtual mouse may allow the user to translate motions of the hand into motion of the cursor on the eyepiece display, where 'motions' may include slow movements, rapid movements, jerky movements, position, changes in position, etc., and may allow the user to work in three dimensions, without the need for a physical surface, and including some or all of the six degrees of freedom. Note that because the 'virtual mouse' may be associated with multiple portions of the hand, the virtual mouse may be implemented as multiple 'virtual mouse' controllers, or as a distributed controller across multiple control members of the hand. In embodiments, the eyepiece may provide for the use of multiple virtual mice, such as one for each of the user's hands, for one or more of the user's feet, etc.
In embodiments, the eyepiece virtual mouse may need no physical surface to operate, detecting motion through sensors such as one of a plurality of accelerometer types (e.g., tuning fork, piezoelectric, shear mode, strain mode, capacitive, thermal, resistive, electromechanical, resonant, magnetic, optical, acoustic, laser, three-dimensional, etc.), and determining the translational or angular displacement of the hand, or some portion of the hand, from the sensors' output signals. For example, accelerometers may produce output signals of magnitude proportional to the translational acceleration of the hand in three directions. Pairs of accelerometers may be configured to detect rotational accelerations of the hand or portions of the hand. Translational velocity and displacement of the hand or portions of the hand may be determined by integrating the accelerometer output signals, and rotational velocity and displacement of the hand may be determined by integrating the difference between the accelerometer pairs' output signals. Alternatively, other sensors may be utilized, such as ultrasonic sensors, imagers, IR/RF, magnetometers, gyro magnetometers, etc. Because accelerometers or other sensors may be mounted on various portions of the hand, the eyepiece may be able to detect a plurality of movements of the hand, ranging from the simple motions normally associated with computer mouse motion to the interpretation of more complex hand motions, such as complex hand movements in a simulation application. In embodiments, the user may require only small translational or rotational actions for these actions to be translated into motions associated with the user's intended motions on the eyepiece projection to the user.
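The integration step the paragraph describes, recovering velocity and displacement from accelerometer samples, can be sketched with the trapezoidal rule. The sample rate and units are assumptions; real virtual-mouse firmware would also need bias and drift correction, which is omitted here.

```python
# Sketch of recovering translational velocity and displacement from sampled
# accelerometer output by numerical integration (trapezoidal rule).

def integrate(samples, dt):
    """Cumulative trapezoidal integral of samples taken every dt seconds."""
    out, total = [0.0], 0.0
    for a, b in zip(samples, samples[1:]):
        total += 0.5 * (a + b) * dt
        out.append(total)
    return out

def velocity_and_displacement(accel, dt):
    v = integrate(accel, dt)      # m/s from m/s^2
    x = integrate(v, dt)          # m from m/s
    return v, x
```

For a constant 1 m/s² acceleration over one second this yields the expected 1 m/s final velocity and 0.5 m displacement; the same double integration applied to the difference of a paired accelerometer's signals would give rotational velocity and displacement, per the text.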
In embodiments, the virtual mouse may have physical switches associated with it for controlling the device, such as an on/off switch mounted on the hand, on the eyepiece, or on another part of the body. The virtual mouse may also have on/off controls through predefined motions or actions of the hand, etc. For example, operation of the virtual mouse may be enabled through a rapid back-and-forth motion of the hand. In another example, the virtual mouse may be disabled through a motion of the hand across the eyepiece (e.g., in front of the eyepiece). In embodiments, the virtual mouse for the eyepiece may provide interpretation of a plurality of motions as the operations normally associated with physical mouse control, and as such be familiar to the user without training, e.g., single click with a finger, double click, triple click, right click, left click, click and drag, combination clicks, scroll-wheel motion, etc. In embodiments, the eyepiece may provide gesture recognition, such as interpreting gestures through mathematical algorithms.
In embodiments, gesture-control recognition may be provided through a technique that utilizes capacitive changes resulting from the changing distance of the user's hand from a conductor element that is part of the eyepiece's control system, and as such may require no device mounted on the user's hand. In embodiments, the conductor may be mounted as part of the eyepiece, such as on the temple or other portion of the frame, or as some external interface mounted on the user's body or clothing. For example, the conductor may be an antenna, where the control system behaves in a manner similar to the touchless musical instrument known as the theremin. The theremin uses the heterodyne principle to generate an audio signal, but in the case of the eyepiece the signal may be used to generate a control input signal. The control circuitry may include a number of radio-frequency oscillators, such as one oscillator operating at a fixed frequency and another controlled by the user's hand, where the distance from the hand varies the input at the control antenna. In this technique, the user's hand acts as the ground plate (the user's body being the connection to ground) of a variable capacitor in an L-C (inductor-capacitor) circuit that is part of the oscillator and determines its frequency. In another example, the circuitry may use a single oscillator, two pairs of heterodyne oscillators, etc. In embodiments, there may be a plurality of different conductors used as control inputs. In embodiments, this type of control interface may be ideal for control inputs that vary across a range, such as volume control, zoom control, etc. However, this type of control interface may also be used for more discrete control signals (e.g., on/off control), where a predetermined threshold determines the state change of the control input.
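The theremin-style principle can be illustrated numerically: the hand-to-antenna capacitance sets the variable L-C oscillator's resonant frequency, f = 1/(2π√(LC)), and the beat against the fixed oscillator becomes the ranged control value. The component values below are illustrative assumptions chosen so the beat is near zero at the assumed rest capacitance.

```python
# Numerical sketch of the heterodyne gesture-control principle described
# above. Component values are illustrative, not from the patent.

import math

L_HENRY = 1e-3            # assumed tank inductance
F_FIXED = 159_154.9       # fixed oscillator (Hz), near the rest frequency

def lc_frequency(c_farads: float) -> float:
    """Resonant frequency f = 1 / (2*pi*sqrt(L*C)) of the variable oscillator."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_HENRY * c_farads))

def control_value(hand_capacitance: float) -> float:
    """Beat frequency between the fixed and hand-tuned oscillators, in Hz.
    A larger hand capacitance (hand closer to the antenna) lowers the variable
    frequency and raises the beat; this could drive e.g. volume or zoom."""
    return abs(F_FIXED - lc_frequency(hand_capacitance))
```

For a discrete on/off control, as the text notes, one would simply compare `control_value` against a predetermined threshold.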
In embodiments, the eyepiece may interface with a wireless track-pad mouse, a hand-held remote control, a body-mounted remote control, a physical remote control mounted on the eyepiece, etc. The remote control facility may be mounted on external equipment, such as for personal, gaming, professional, military, etc. use. For example, the remote control may be mounted on a soldier's weapon, such as on the pistol grip, on the muzzle shield, on the fore grip, etc., providing remote control to the soldier without requiring them to move their hands away from the weapon. The remote control may be removably mounted to the eyepiece.
In embodiments, a remote control for the eyepiece may be activated and/or controlled through a proximity sensor. A proximity sensor is a sensor able to detect the presence of nearby objects without physical contact. For example, a proximity sensor may emit an electromagnetic or electrostatic field, or a beam of electromagnetic radiation (e.g., infrared), and look for changes in the field or a return signal. The object being sensed is often referred to as the proximity sensor's target. Different proximity-sensor targets may demand different sensors. For example, a capacitive or photoelectric sensor may be suitable for a plastic target; an inductive proximity sensor requires a metal target. Other examples of proximity-sensor technology include capacitive displacement sensing, eddy current, magnetic, photocell (reflective), laser, passive thermal infrared, passive optical reflection, CCD, ionizing radiation, etc. In embodiments, a proximity sensor may be integrated into any of the control embodiments described herein, including physical remote controls, the virtual mouse, interfaces mounted on the eyepiece, controls mounted on external equipment (e.g., a game controller, a weapon), etc.
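Deriving a discrete activation from an analog proximity reading is cleanest with hysteresis, so the control does not chatter when the target hovers near the threshold. The threshold values below are illustrative assumptions.

```python
# Sketch of deriving a discrete on/off activation from an analog proximity
# reading, using hysteresis to avoid chatter near the threshold.

ON_THRESHOLD = 0.7    # normalized proximity above which the control turns on
OFF_THRESHOLD = 0.4   # must fall below this before it turns off again

def update_state(active: bool, reading: float) -> bool:
    """Return the new on/off state given the previous state and a reading."""
    if not active and reading >= ON_THRESHOLD:
        return True
    if active and reading <= OFF_THRESHOLD:
        return False
    return active
```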
In embodiments, sensors for measuring the user's body motion may be used to control the eyepiece, or may serve as external inputs, such as by using an inertial measurement unit (IMU), a 3-axis magnetometer, a 3-axis gyroscope, a 3-axis accelerometer, etc. For example, a sensor may be mounted on the user's hand, allowing the signals from the sensor to be used to control the eyepiece, as described herein. In another example, sensor signals may be received and interpreted by the eyepiece to assess and/or utilize the user's body motion for purposes other than control. In one example, sensors mounted on each of the user's legs and arms may provide signals to the eyepiece that allow the eyepiece to measure the user's gait. The user's gait may then be monitored over time, such as for monitoring changes in physical behavior, progress during physical therapy, changes caused by head trauma, etc. In the head-trauma-monitoring example, the eyepiece may initially determine the user's baseline gait profile, and then monitor the user over time, such as before and after a physical event (e.g., a sports-related collision, an explosion, a car accident, etc.). For an athlete, or an individual in physical therapy, the eyepiece may be used to periodically measure the user's gait, maintaining the measurements in a database for analysis. A running gait-over-time profile may be produced, such as for monitoring the user's gait as an indicator of physical trauma, physical progress, etc.
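The baseline-versus-current gait comparison might be sketched as summarizing each session's stride intervals and flagging drift beyond a tolerance. The choice of features (stride intervals in seconds) and the 15% tolerance are assumptions; a real system would use richer gait metrics.

```python
# Sketch of comparing a user's current gait against a stored baseline, in the
# spirit of the head-trauma / physical-therapy monitoring example above.

import statistics

def gait_profile(stride_intervals):
    """Summarize a walking session as (mean, stdev) of stride intervals."""
    return (statistics.mean(stride_intervals),
            statistics.stdev(stride_intervals))

def gait_changed(baseline, current, tolerance=0.15):
    """Flag a session whose mean stride interval drifts more than `tolerance`
    (as a fraction) from the stored baseline profile."""
    base_mean, _ = baseline
    cur_mean, _ = gait_profile(current)
    return abs(cur_mean - base_mean) / base_mean > tolerance
```

Storing each session's profile in the database then yields the running gait-over-time record the text describes.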
In embodiments, control of the eyepiece, and especially control of a cursor associated with content displayed to the user, may be initiated through the motion of facial features of the user wearing the eyepiece, the tensing of facial muscles, the clicking of teeth, the motion of the jaw, etc., sensed through a facial actuation sensor 1502B. For example, as shown in Figure 15B, the eyepiece may have a facial actuation sensor as an extension from the eyepiece earphone assembly 1504B, from the temple 1508B of the eyepiece, etc., where the facial actuation sensor may sense a force, a vibration, etc. associated with the motion of a facial feature. The facial actuation sensor may also be mounted separately from the eyepiece assembly, such as part of a standalone earpiece, where the sensor output of the earpiece and the facial actuation sensor may be transferred to the eyepiece by either wired or wireless communication (e.g., Bluetooth or other communications protocols known to the art). The facial actuation sensor may also be attachable around the ear, in the mouth, on the face, on the neck, etc. The facial actuation sensor may also be comprised of a plurality of sensors, such as to optimize the sensed motion of different facial or interior motions or actions. In embodiments, the facial actuation sensor may detect motions and interpret them as commands, or the raw signals may be sent to the eyepiece for interpretation. Commands may be commands for the control of eyepiece functions, controls associated with a cursor or pointer provided as part of the display of content to the user, etc. For example, the user may click their teeth once or twice to indicate a single or double click, such as is normally associated with the click of a computer mouse. In another example, the user may tense a facial muscle to indicate a command, such as a selection associated with the projected image. In embodiments, the facial actuation sensor may utilize noise-reduction processing to minimize background motions of the face, head, etc., such as through adaptive signal processing technologies. A voice-activity sensor may also be utilized to reduce interference, such as from the user, from other individuals nearby, from surrounding environmental noise, etc. In an example, the facial actuation sensor may also improve communications and eliminate noise by detecting vibrations in the user's cheek during speech, such as by using multiple microphones to identify the background noise and eliminate it through noise cancellation, volume augmentation, etc.
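Interpreting the single versus double tooth click as mouse-style events reduces to grouping detected pulses by their spacing in time. The 0.4 s double-click window is an assumed value, analogous to a desktop double-click timeout.

```python
# Sketch of interpreting tooth clicks sensed by a facial actuation sensor as
# mouse-style commands: two clicks within a short window become a double
# click, otherwise a single click.

DOUBLE_CLICK_WINDOW_S = 0.4   # assumed pairing window

def classify_clicks(timestamps):
    """Group detected click pulses (seconds) into 'click'/'double_click' events."""
    events, i = [], 0
    while i < len(timestamps):
        if (i + 1 < len(timestamps)
                and timestamps[i + 1] - timestamps[i] <= DOUBLE_CLICK_WINDOW_S):
            events.append("double_click")
            i += 2
        else:
            events.append("click")
            i += 1
    return events
```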
In embodiments, the user of the eyepiece may be able to obtain information about an environmental feature, location, object, etc. viewed through the eyepiece by raising their hand into the eyepiece's field of view and pointing at the object or position. For example, the user's pointing finger may indicate an environmental feature, where the finger is not only in the field of view of the eyepiece, but also in the field of view of the embedded camera. The system may then be able to correlate the position of the pointing finger with the location of the environmental feature as seen by the camera. Additionally, the eyepiece may have position and orientation sensors, such as GPS and a magnetometer, that allow the system to know the user's location and line of sight. From this, the system may be able to extrapolate the position information of the environmental feature, such as to provide the location information to the user, to overlay the position of the environmental information onto a 2D or 3D map, to further correlate the established position information with secondary information about the location (e.g., an address, the names of individuals at the address, the name of a business at the location, the coordinates of the location), etc. Referring to Figure 15C, in an example, the user is looking through the eyepiece 1502C and pointing with their hand 1504C at a house 1508C in their field of view, where the embedded camera 1510C has both the pointing hand 1504C and the house 1508C in its field of view. In this instance, the system is able to determine the location of the house 1508C and provide location information 1514C and a 3D map superimposed onto the user's view of the environment. In embodiments, the information associated with an environmental feature may be provided by an external facility (e.g., connected for communication over a wireless connection), stored internal to the eyepiece (e.g., downloaded to the eyepiece for the current location), etc. In embodiments, the information provided to the wearer of the eyepiece may include any of a great deal of information related to the scene the wearer is viewing, such as geographic information, point-of-interest information, social networking information (e.g., Twitter, Facebook, and similar information related to a person standing in front of the wearer, such as information 'hovering' around that person), profile information (e.g., as stored in the wearer's contact list), historical information, consumer information, product information, retail information, safety information, advertisements, business information, security information, game-related information, humorous annotations, news-related information, etc.
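The extrapolation of the feature's position from the eyepiece's GPS fix and magnetometer bearing can be sketched geometrically with a flat-earth (short-distance) approximation. The distance to the feature would come from a map or ranging source; all numeric inputs below are illustrative.

```python
# Geometric sketch of extrapolating a pointed-at feature's position from the
# user's GPS fix and line-of-sight bearing (flat-earth approximation).

import math

EARTH_RADIUS_M = 6_371_000.0

def feature_position(lat_deg, lon_deg, bearing_deg, distance_m):
    """Project (distance, bearing) from the user's fix to the feature's lat/lon."""
    d_north = distance_m * math.cos(math.radians(bearing_deg))
    d_east = distance_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

The resulting coordinates are what would be looked up against address, business-name, or point-of-interest databases and overlaid on the 2D or 3D map.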
In embodiments, the user may be able to control their view perspective relative to a 3D projected image, such as a 3D projected image associated with the external environment, a 3D projected image that has been stored and retrieved, a 3D displayed movie (such as downloaded for viewing), and the like. For instance, and referring again to Figure 15C, the user may be able to change the view perspective of the 3D displayed image 1512C, such as by turning their head, where the live external environment and the 3D displayed image stay together even as the user turns their head, moves their position, and the like. In this way, the eyepiece may be able to provide an augmented reality by overlaying information onto the user's viewed external environment, such as the overlaid 3D displayed map 1512C, location information 1514C, and the like, where the displayed map, information, and the like may change as the user's view changes. In another instance, with 3D movies or movies converted to 3D, the viewing perspective may be controlled to some degree by the viewer, such that the viewer may "step into" the movie environment, where the user may be able to turn their head and have the view change in correspondence with the changed head position, where the user may be able to "walk into" the image when they physically walk forward, where the perspective changes as the user moves the gaze of their eyes, and the like. In addition, additional image information may be provided, such as at the sides of the user's view, which may be accessed by turning the head.
In embodiments, the user of an eyepiece may be able to synchronize their view of a projected image or video with the view of at least one other user of an eyepiece or other video display device. For instance, two separate eyepiece users may wish to watch the same 3D map, game projection, point-of-interest projection, video, and the like, where the two viewers not only see the same projected content, but where the views of the projected content are synchronized between them. In an example, two users may want to jointly view a 3D map of an area, with the image synchronized so that one user may be able to point to a position on the 3D map that the other user can see and interact with. The two users may be able to move around the 3D map and share virtual-physical interactions with the 3D map between them, and so on. Further, a group of eyepiece wearers may be able to interact in common with a group projection. In this way, two or more users may be able to have a unified augmented reality experience through the coordinated synchronization of their eyepieces. Synchronization of two or more eyepieces may be provided by communicating position information between the eyepieces, such as absolute position information, relative position information, translational and rotational position information, and the like, such as from position sensors as described herein (e.g. gyroscopes, IMUs, GPS, and the like). Communications between the eyepieces may be channeled through the Internet, a cellular network, a satellite network, and the like. Processing of position information contributing to synchronization may be performed in the master processor of a single eyepiece, shared among a group of eyepieces, performed in a remote server system, and the like, or any combination thereof. In embodiments, the coordinated, synchronized views of projected content among a plurality of eyepieces may extend the augmented reality experience from the individual to a plurality of individuals, where the plurality of individuals benefit from the group augmented reality experience. For example, a group of people attending a concert may synchronize their eyepieces with a feed from the concert producers, so that visual effects or audio may be pushed to the people with eyepieces by the concert producers, the performers, other audience members, and the like. In an example, a performer may have an eyepiece, and may control the sending of content to the audience members. In an embodiment, the content may be the performer's view of the surroundings. The performer may also use the eyepiece for various applications, such as controlling an external lighting system, interacting with an augmented reality drum kit or sound board, recalling lyrics, and the like.
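One way to communicate position information for view synchronization is for each eyepiece to broadcast a small pose message that peers fold into a shared scene state; the message schema and names below are illustrative assumptions, not a protocol from the disclosure.

```python
import json

def make_pose_message(eyepiece_id, lat, lon, yaw, pitch, roll):
    """Serialize one eyepiece's absolute position and head orientation so
    peer eyepieces can render shared 3D content from synchronized
    viewpoints. Field names are invented for illustration."""
    return json.dumps({
        "id": eyepiece_id,
        "position": {"lat": lat, "lon": lon},
        "orientation": {"yaw": yaw, "pitch": pitch, "roll": roll},
    })

def apply_pose_message(shared_scene, message):
    """Update the shared scene's record of a peer's viewpoint."""
    pose = json.loads(message)
    shared_scene[pose["id"]] = pose
    return shared_scene

# One peer reports facing due east; the scene now carries its viewpoint.
scene = {}
msg = make_pose_message("eyepiece-1", 47.6, -122.3, 90.0, 0.0, 0.0)
apply_pose_message(scene, msg)
```

In a deployed system the transport would be the Internet, cellular, or satellite link mentioned above, and the merge step could run on a single master eyepiece or a remote server.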
In embodiments, an image or video displayed on the eyepiece may be synchronized with an image or video feed displayed on or captured by a connected device (one having a communications link with the eyepiece), or with a direct feed from a remote camera. The feed may be selected, or other actions initiated, by a sensor input or control signal received by one of the connected devices, by metadata sent by another connected device, and the like. The other video display devices may be other eyepieces, desktop computers, laptop computers, smart phones, tablet computers, televisions, and the like. The eyepiece, devices, and remote cameras may be connected through wide area, local area, metropolitan area, personal area, and cloud network communications links. The sensor input may be an audio sensor input, a video sensor input, and the like. Other actions initiated by the received sensor input or control signal may include initiating target tracking, sending an action message, initiating video synchronization as described elsewhere herein, and the like. For example, video captured by the eyepiece of a guard posted at a remote checkpoint or inspection area may be automatically selected for display on a supervisor's eyepiece when a facial recognition application identifies a person of interest in the video feed from the guard's eyepiece.
In embodiments, the eyepiece may utilize sound projection techniques to implement audio directionality for the wearer of the eyepiece, such as by utilizing surround sound techniques. Implementing audio directionality for the wearer may include generating sound from the direction of the source (either in real time or as playback). It may include visual or audio indicators providing the direction of a sound source. Sound projection techniques may be useful for individuals with a hearing deficiency, whether caused or obstructed by a hearing impairment of the user, the user wearing headphones, the user wearing hearing protection, and the like. In such instances, the eyepiece may provide an enhanced 3D audible reproduction. In an example, the wearer may have headphones on when a gunshot is fired. In this instance, the eyepiece may be able to reproduce a 3D sound profile of the gunshot, allowing the wearer to react to the shot, knowing where the sound came from. In another example, a wearer with headphones, with hearing loss, in a noisy environment, and the like, may not be able to tell what has been said and/or the direction of the person speaking, but may be provided a 3D sound enhancement from the eyepiece (e.g. the wearer is listening to other nearby individuals through headphones, and so has no directionality information). In another example, the wearer may be in a noisy ambient environment, or in an environment that generates periodic loud noises. In this instance, the eyepiece may have the ability to cut off the loud sound and protect the wearer's hearing, but the sound may be so loud that the wearer cannot tell where it came from, or their ears may be ringing such that they now hear nothing at all. To aid in these situations, the eyepiece may provide visual, auditory, vibratory, and similar cues to indicate the direction of the sound source to the wearer. In embodiments, where the wearer's ears are plugged to protect them from loud noises, the eyepiece may provide "enhanced" hearing, substituting the sounds that would otherwise be missed with an earbud-generated reproduction of the sound from the natural world. This artificial sound may then be used to give directionality to wirelessly transmitted communications that the operator would not naturally hear.
In embodiments, an example configuration for establishing the directionality of a source sound may be separate microphones pointed in different directions. For example, at least one microphone may be used for the wearer's voice, at least one for the ambient environment, and at least one pointed downward at the ground, possibly in a plurality of different discrete directions. In this example, the downward-pointing microphone may be subtracted out to isolate the other sounds, and this may be combined with the 3D surround sound and enhanced hearing techniques described herein.
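The subtraction of the downward-pointing microphone from the ambient microphone could be sketched per-sample as below; this is a toy illustration under the assumption of aligned, equal-gain channels, where a real system would use adaptive filtering rather than direct subtraction.

```python
def isolate_ambient(mic_ambient, mic_down, ground_gain=1.0):
    """Remove ground/footstep noise (captured by the downward-facing
    microphone) from the ambient microphone, sample by sample.
    `ground_gain` models the relative sensitivity of the two channels."""
    return [a - ground_gain * d for a, d in zip(mic_ambient, mic_down)]

# Ambient channel = speech (constant 0.2) plus ground noise seen by both mics.
ground = [0.5, -0.5, 0.5]
ambient = [0.2 + g for g in ground]
clean = isolate_ambient(ambient, ground)  # speech component remains
```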
In an example of an audio enhancement system as part of the eyepiece, there may be a number of users with eyepieces, such as in a noisy environment, where all of the users have their ears "plugged", such as with ambient noise rejection implemented through eyepiece earbuds. One of the wearers may shout out that they need a certain piece of equipment. Because of all the ambient noise and the hearing protection the eyepieces create, no one may be able to hear the request. Here, the wearer making the verbal request has a filtered microphone near their mouth and may wirelessly transmit the request to the others, where their eyepieces can relay the voice signal to the other users' eyepieces and to the ear on the correct side, so the others will know to look to the right or to the left to see who made the request. The system may be further enhanced with the geolocation of all the wearers and a "virtual" surround sound system that gives a 3D spatial perception with two earbuds (such as a true SRS surround technique).
In embodiments, the auditory cue may also be computer generated, so that the communicating user does not need to speak their communication, but instead may select from a list of common commands, have the computer generate the communication based on preconfigured conditions, and the like. In an example, the wearer may be in a situation where they do not want the display in front of their eyes but do want the earbuds placed in their ears. In this case, if they want to notify someone in the group to get up and follow them, they may simply click a controller a specific number of times, or provide a visual gesture via the camera, an IMU, and the like. The system may select a "follow me" command and send it to the other users together with a 3D placement of the commanding user's communication, to entice the others to listen toward the location where the commanding user actually is, beyond their line of sight. In embodiments, direction information may be determined and/or provided from the position information of the eyepieces' users.
The eyepiece may include facilities for providing vibratory sensations to the user, such as through vibratory actuators in the frame or temples of the eyewear structure (e.g. mechanical vibration motors, piezoelectric vibration actuators, ultrasonic vibration actuators, and the like). Vibrations may be provided to indicate a message indicator to the user, as an indicator for a visually impaired user (e.g. impaired due to darkness, smoke, clouds, blindness), as part of a game, as part of a simulation, and the like. Vibratory actuators may be used in the side temples of the eyepiece, alone or together with speakers, to help create a 3D visual-audio-vibration virtual reality environment, such as for gaming, simulation, and the like. For example, a vibratory actuator may be mounted in each of the side temples of the eyepiece, so that when an application presents a projectile flying past the left side of the user's head, the left-side vibratory actuator is commanded to vibrate in a manner that simulates the sensation of a projectile actually flying past the user. In addition, the speaker on that side may synchronously apply a sound that mimics the sound such a projectile would make flying past the user's head. The vibratory actuators and/or speakers may be mounted on the eyepiece in a manner that provides a 3D vibration-audio experience to the user, such as to augment the visual experience provided through visually displayed content, such as 3D visually displayed content. In this way, the user may be immersed in a multi-sensory virtual 3D environment. In embodiments, the disclosure may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece comprises: an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source adapted for introducing the content to the optical assembly, and a processing facility adapted for managing functions of the eyepiece, wherein the head-mounted eyepiece has a structure comprising a frame through which the user views the surrounding environment, left and right temples for supporting the frame on the user's head, and a vibratory actuator in each of the left and right temples, each vibratory actuator independently responsive to vibration commands from the processing facility. The vibration commands may initiate a vibration in one of the vibratory actuators in response to a virtual projectile, a virtual explosion, a message indicator, a visual cue, an alert, and the like, presented as part of the displayed content. The displayed content may be provided as part of a simulation the user is engaged in, a gaming application, a utility application, and the like. The application invoking the vibration commands may run locally on the eyepiece, run partially or wholly on an external platform with which the eyepiece has a communications interconnection, and the like. In addition, the eyepiece may include integrated speakers as described herein, such as in each of the left and right temples, where the vibration command initiates a vibration in one of the vibratory actuators that is synchronized in time with an audio command initiating a sound in the speaker on the same side temple when the vibration command is received.
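The side-matched, synchronized vibration and audio commands described above could be sketched as a small dispatcher; the command tuples and names are illustrative assumptions standing in for actuator and speaker driver calls.

```python
def dispatch_projectile_cue(side, commands):
    """Queue a synchronized vibration and audio command on the temple
    matching the side of the head a simulated projectile passes.
    `commands` collects (device, side, action) tuples; in a real
    eyepiece these would drive the actuator and speaker hardware."""
    if side not in ("left", "right"):
        raise ValueError("side must be 'left' or 'right'")
    commands.append(("vibrator", side, "pulse"))   # haptic cue
    commands.append(("speaker", side, "whoosh"))   # matching audio cue
    return commands

# A projectile passes the left side: both cues go to the left temple.
cmds = dispatch_projectile_cue("left", [])
```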
In embodiments, the eyepiece may provide aspects of signals intelligence (SIGINT), such as using existing communications signals (e.g. WiFi, 3G, Bluetooth) to gather signals intelligence on devices and users in proximity to the eyepiece wearer. These signals may come from other eyepieces, such as for gathering information about other known friendly users; from other eyepieces picked up from unauthorized individuals, such as through signals generated when an unauthorized user attempts to use an eyepiece; from other communications devices (e.g. radios, cell phones, pagers, walkie-talkies, and the like); from electrical signals emanating from devices not directly intended for communication; and the like. The information gathered by the eyepiece may be direction information, position information, movement information, the number and/or rate of communications, and the like. Further, information may be gathered through the coordinated operation of multiple eyepieces, such as triangulation of a signal to determine the position of its source.
Referring to Figure 15D, in embodiments, the user of the eyepiece 1502D may be able to define the field of view (FOV) 1508D of a see-through camera 1510D with multiple hand/finger points from their hand 1504D, such as for augmented reality applications. For instance, in the example shown, the user is utilizing their forefinger and thumb to adjust the FOV 1508D of the camera 1510D of the eyepiece 1502D. The user may utilize other combinations to adjust the FOV 1508D, such as combinations of fingers, combinations of a finger and thumb, combinations from both hands, a finger-and-thumb combination from each of two hands, use of the palm, a cupped hand, and the like. The use of multiple hand/finger points may enable the user to alter the FOV 1508D of the camera 1510D in much the same way a touch-screen user would, where different points of the hand/fingers establish the points of the FOV that define the desired view. In this instance, however, no physical contact is made between the user's hands and the eyepiece. Here, the camera may be commanded to associate portions of the user's hand with establishing or changing the FOV of the camera. The command may be any command type described herein, including, but not limited to, hand motions in the FOV of the camera, commands associated with a physical interface on the eyepiece, commands associated with sensed motion near the eyepiece, commands received from a command interface on some portion of the user, and the like. The eyepiece may be able to recognize the finger/hand motion as a command, such as through some repetitive motion. In embodiments, the user may also utilize this technique to adjust some portion of the projected image, where the eyepiece relates some aspect of the viewed camera image to the projected image, such as relating the hand/finger points in view to the projected image for the user. For example, the user may be simultaneously viewing the external environment and a projected image, and may utilize this technique to change the projected viewing area, region, magnification, and the like. In embodiments, the user may change the FOV for a plurality of reasons, including zooming in or out on a viewed scene in the live environment, zooming in or out on a viewed portion of the projected image, changing the viewing area allocated to the projected image, changing the perspective view of the environment or the projected image, and the like.
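The mapping from a tracked thumb/forefinger pair to a camera FOV rectangle could be sketched as below; the coordinate convention, sensor dimensions, and names are illustrative assumptions rather than parameters from the disclosure.

```python
def fov_from_pinch(thumb_xy, finger_xy, full_fov=(640, 480)):
    """Derive a camera field-of-view rectangle (left, top, right, bottom)
    from the tracked thumb and forefinger positions, clamped to the
    sensor's full FOV. Coordinates are in camera pixels."""
    (x0, y0), (x1, y1) = thumb_xy, finger_xy
    left, right = sorted((x0, x1))
    top, bottom = sorted((y0, y1))
    w, h = full_fov

    def clamp(v, hi):
        return max(0, min(v, hi))

    return (clamp(left, w), clamp(top, h), clamp(right, w), clamp(bottom, h))

# Thumb at lower-left, forefinger at upper-right of the desired region.
roi = fov_from_pinch((100, 400), (500, 80))
```

Successive frames of hand tracking would re-run this mapping, giving the touch-screen-like, contactless resizing described above.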
In embodiments, the eyepiece may allow simultaneous FOVs. For example, wide, medium, and narrow camera FOVs may be used at the same time, where the user may have the different FOVs presented in view (perhaps a wide one showing the whole location, which remains static, and a narrow one for focusing on a particular target, perhaps moving with the eyes or a cursor).
In embodiments, the eyepiece may be able to determine where the user is gazing, or the motion of the user's eye, by tracking the eye through light reflected off the user's eye. This information may then be used to help correlate the user's line of sight with the projected image, a camera view, the external environment, and the like, and may be used in the control techniques described herein. For instance, the user may gaze at a location on the projected image and make a selection, such as with an external remote control or with some detected eye movement (e.g. blinking). In an example of this technique, and referring to Figure 15E, transmitted light 1508E, such as infrared light, may be reflected 1510E from the eye 1504E and sensed at the optical display 502 (e.g. with a camera or other optical sensor). The information may then be analyzed to extract eye rotation from changes in the reflection. In embodiments, an eye-tracking facility may use the corneal reflection and the center of the pupil as features to track over time; use reflections from the front of the cornea and the back of the lens as features to track; image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates; and the like. Alternatively, the eyepiece may use other techniques to track the motion of the eye, such as with components surrounding the eye, components mounted in contact lenses on the eye, and the like. For instance, a special contact lens with an embedded optical component, such as a mirror, a magnetic field sensor, and the like, may be provided to the user for measuring the motion of the eye. In another instance, electric potentials may be measured and monitored with electrodes placed around the eyes, utilizing the steady electric potential field of the eye as a dipole, such as with its positive pole at the cornea and its negative pole at the retina. In this instance, the electric signal may be derived using contact electrodes placed on the skin around the eye, on the frame of the eyepiece, and the like. If the eye moves from the center position toward the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in the orientation of the dipole, and the resulting change in the electric potential field, produces a change in the measured signal. By analyzing these changes, eye movement can be tracked.
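The pupil-center/corneal-reflection technique above reduces, to first order, to converting the pixel offset between pupil and glint into gaze angles; the linear gain below is an illustrative assumption that a real system would fit through per-user calibration.

```python
def gaze_vector(pupil_center, glint_center, sensitivity=0.05):
    """Estimate (horizontal, vertical) gaze angles in degrees from the
    offset between the pupil centre and the corneal reflection (glint)
    of an IR source. `sensitivity` (degrees per pixel) is a simplified
    first-order model stand-in for a full calibration map."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    return (dx * sensitivity, dy * sensitivity)

# Pupil centre 40 px to the right of the glint: gaze a few degrees right.
angles = gaze_vector((140, 100), (100, 100))
```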
Another example of how a control associated with the user's eye gaze direction may be applied relates to the placement (by the eyepiece), and optional selection (by the user), of a visual indicator in the user's peripheral vision, such as to reduce clutter in the narrow area around the gaze direction where the eye's visual input is most acute. Because the brain is limited in how much information it can process at one time, and attends most to visual content close to the gaze direction, the eyepiece may provide a projected visual indicator in the visual periphery as a cue to the user. In this way, the brain may only need to process the detection of the indicator, rather than the information associated with it, reducing the chance of overloading the user with information. The indicator may be an icon, a photo, a color, a symbol, a flashing object, and the like, and may indicate an alert, the arrival of an email, an incoming call, a calendar event, an internal or external processing facility requiring the user's attention, and the like. With the visual indicator in the periphery, the user may become aware of it without being distracted by it. The user may then optionally decide to bring up the content associated with the visual cue to see more information, such as by gazing at the visual indicator and by doing so opening up its content. For example, an icon representing an incoming email may indicate that an email has been received. The user may notice the icon and choose to ignore it (such that the icon disappears if not activated for a period of time, as determined by gaze or some other control facility). Alternatively, the user may notice the visual indicator and choose to "activate" it by gazing in its direction. In the case of the email, when the eyepiece detects that the user's eye gaze coincides with the position of the icon, the eyepiece may open the email and display its content. In this way, the user maintains command over what information is being attended to, and, as a result, minimizes distraction and maximizes the efficiency of content usage.
In embodiments, and in association with certain optical configurations described herein (e.g. a front-lit LCoS), feedback between two or more displays may ensure that the displays have the same brightness and contrast. In embodiments, a camera in each display may be utilized. The current to the LEDs may be controlled and color balance obtained, such as by selecting LEDs of similar quality, output, and/or color (e.g. from similar frequency bins), providing right and left pulse-width modulation (PWM) values, and performing periodic calibration. In embodiments, calibration across the power spectrum may be achieved. If the displays are turned down due to high outside brightness, the user may be aware of the calibration of each display. In embodiments, equal brightness, color saturation, color balance, hue, and the like may be created between the two displays. This may prevent the user's brain from ignoring one of the displays. In embodiments, a feedback system from the displays may be created that allows the user or another person to adjust brightness and the like so that each display has a constant and/or consistent brightness, color saturation, balance, hue, and the like. In embodiments, there may be a luminance sensor on each display, which may be a color, RGB, white, or full light sensor, and the like. In embodiments, the sensor may be a power sensor monitoring or checking the power delivered to or consumed by the LEDs. The user or another person may adjust one or more displays by increasing or decreasing the power supplied to the LEDs. This may be done during manufacturing, and/or over the life of the eyepiece, and/or periodically. In embodiments, there may be a dynamic range aspect. As the LEDs and/or power supply dim over time, there may be a power algorithm by which fine adjustment keeps the brightness of both displays consistent. In embodiments, the user, the manufacturer, and/or the eyepiece may adjust the LEDs to follow the same brightness curve as the power supply changes. There may be RGB LEDs, and the LED curves may be matched between the two displays. Accordingly, brightness, color saturation, color balance, hue, and the like may be controlled over a dynamic range. In embodiments, these may be measured and controlled during manufacturing, over a dynamic range, over the life of the glasses, and the like. In embodiments, equal brightness, color saturation, color balance, hue, and the like between the two displays may be actually created, or may be created so as to be perceived as equal by the user based on differences between the user's eyes. In embodiments, adjustments of brightness, color saturation, color balance, hue, and the like may be performed by the user or the manufacturer, and/or may be performed automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may result in automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, hue, and the like.
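One step of the brightness-matching feedback loop could be sketched as below, with per-display luminance sensors driving a proportional correction to each side's PWM value; the names, the proportional law, and the tolerance are illustrative assumptions, since a shipping system would also match colour channels and rate-limit changes.

```python
def balance_displays(left_lum, right_lum, left_pwm, right_pwm, tolerance=0.02):
    """Nudge each display's LED PWM value toward the mean luminance
    reported by its sensor, so the two displays stay within `tolerance`
    (as a fraction of the target) of each other."""
    target = (left_lum + right_lum) / 2.0

    def step(lum, pwm):
        if abs(lum - target) <= tolerance * target:
            return pwm  # already within tolerance, leave untouched
        return pwm * target / lum  # proportional correction

    return step(left_lum, left_pwm), step(right_lum, right_pwm)

# Left display reads 10% brighter: its PWM is scaled down, the right's up.
lp, rp = balance_displays(1.10, 1.00, 0.80, 0.80)
```

Run periodically over the life of the eyepiece, such a loop would also track the gradual dimming of the LEDs noted above.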
In embodiments, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, wherein the optical assembly comprises two or more displays, and wherein at least one of brightness, color saturation, color balance, and hue is adjusted in at least one of the displays such that at least one of the brightness, color saturation, color balance, and hue of the two or more displays is balanced relative to one another within a predetermined range. In embodiments, the adjustment may comprise bringing at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays within a predetermined range relative to one another. In embodiments, the adjustment of at least one of brightness, color saturation, color balance, hue, and the like may be made based on detection of the power delivered to the integrated image source. In embodiments, the adjustment may be based on a power algorithm, so that at least one of brightness, color saturation, color balance, hue, and the like is consistent between the two or more displays. In further embodiments, the adjustment may be based on sensor feedback from a full light sensor. In embodiments, at least one of brightness, color saturation, color balance, hue, and the like may be adjusted during manufacturing, within the dynamic output range produced by the integrated image source, and the like. In embodiments, the system may be adapted to periodically self-check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another over the life of the eyepiece. In embodiments, the system may be adapted to self-check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another, and optionally set at least one of these to a predetermined value. Further, an embodiment of the system may be adapted to self-check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another and, based on sensor feedback measurements, optionally set at least one of these to a predetermined value.
In embodiments, and in association with certain optical configurations described herein (e.g. a front-lit LCoS), the contrast between the two or more displays may be adjusted to be equal, or to be perceived as equal by the user. In embodiments, the contrast may be checked and adjusted accordingly for each display, may be adjusted during the manufacturing process to calibrate and adjust the displays, and may be measured during manufacturing, over a dynamic range, over the life of the glasses, and the like. In embodiments, the contrast of the system may be automatically calibrated between the two displays and with respect to the outside world. In embodiments, the user may compensate for differences between their eyes. Contrast may be adjusted as needed to compensate for the user's vision and/or sensing deficiencies. In embodiments, the contrast ratio may vary based on how the optical modules are assembled. As described herein, stray light reduction may be addressed through assembly techniques to provide a high contrast ratio. In embodiments, various types of single-pixel luminance and/or multi-pixel color detectors may be inserted into the optics train, sampling some or all of the light that does not enter the eyebox of the display. In embodiments, and depending on where in the optical path the detector is placed, real-time feedback may be provided to compensate for assembly errors, LED and LCoS panel output, binning errors, and hot/cold panel behavior, and/or to maintain individual user calibration of the system. In embodiments, the brightness and contrast of the displays may be managed through good manufacturing practices. Further, during manufacturing, quality analysis testing may be performed, and the displays calibrated and compensated as needed. Further, over the life of the system, as components wear or the system heats and cools during use, the calibration may be revised using look-up tables of correction values. In embodiments, adjustments of brightness, color saturation, color balance, hue, contrast, and the like may be performed by the user or the manufacturer, and/or may be performed automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may result in automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, hue, contrast, and the like.
In embodiments, a system may comprise an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, wherein the optical assembly comprises two or more displays, and wherein the contrast of at least one of the displays is adjusted so that the contrast of the two or more displays is balanced relative to one another within a predetermined range. In a further embodiment, the adjusted contrast may be equal across the two or more displays. In embodiments, contrast may be adjusted during the manufacturing process, within the dynamic output range produced by the integrated image source, and so on. In embodiments, the system may be adapted to periodically and automatically check the contrast of the two or more displays relative to one another over the lifetime of the eyepiece. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another and optionally set the contrast of the displays to a predetermined value. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another and, based on sensor feedback measurements, optionally set the contrast of the displays to a predetermined value. In embodiments, contrast may be adjusted to compensate for a deficiency of the user. In embodiments, contrast may be adjusted as a function of at least one of stray light and light produced by the integrated image source. In embodiments, contrast may be adjusted based on feedback from a detector in the optical path of the system. Further, the detector may comprise at least one of a single-pixel brightness detector and a multi-pixel color detector. In embodiments, real-time feedback may be provided to the system to compensate for at least one of assembly error, LED and LCoS panel output, binning error, hot and cold panel compensation, and maintaining individual user calibration. In embodiments, contrast calibration may be adjusted via a lookup table based on one or more offsets.
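As a rough illustration of the lookup-table calibration described above, the sketch below applies per-display offsets and then pulls each display's contrast into a tolerance band around their mean. All names, offsets, and thresholds are assumptions for demonstration, not values from this disclosure.

```python
# Sketch (assumed design): balance the contrast of two displays to within a
# tolerance band, applying per-unit offsets from a calibration lookup table.
# Offsets and values here are illustrative only.

CALIBRATION_OFFSETS = {          # per-display offsets, e.g. measured at manufacture
    "left": -2.0,
    "right": +3.5,
}

def balanced_contrast(raw, tolerance=1.0):
    """Apply stored offsets, then clamp each display toward the mean so the
    displays are balanced relative to one another within `tolerance`."""
    corrected = {d: raw[d] + CALIBRATION_OFFSETS[d] for d in raw}
    mean = sum(corrected.values()) / len(corrected)
    out = {}
    for d, value in corrected.items():
        if abs(value - mean) > tolerance:
            value = mean          # force into the preset range around the mean
        out[d] = value
    return out

settings = balanced_contrast({"left": 52.0, "right": 46.0})
# corrected values 50.0 and 49.5 already lie within 1.0 of their mean
```

In a real system the raw values would come from the in-path detector described above, and the offsets would be refreshed by the periodic self-check over the eyepiece's lifetime.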
In one embodiment, particular optical configurations described herein (such as the front-lit LCoS) allow a camera to be inserted at any of a number of positions along the optical train, such that the camera is placed directly on the optical axis. For example, a camera sensor may be placed near the LCoS, as with camera 10232 in Figure 102B. This in turn allows measurement of pupil position, diameter, velocity, and direction, and direct imaging of the iris. These measurements and images may be used for secure login or loading of user settings, detecting health conditions through the measured size and/or thickness of capillaries, setting a placeholder/bookmark based on the last viewed area of a book, and the like. The data the camera collects about the various components of the eye may be used to control the user interface, determine stress levels, monitor alertness, detect reactions to external or projected stimuli, and so on. Because the front-lighting optics are sharp and compact, a camera with very small pixels can be placed in the optical train, keeping the overall size of the optics small while ensuring a high-resolution image. In embodiments, the camera may be placed at many points in the optical path by inserting a beam splitter as in Figure 185, but the camera may also be placed on the LCoS PCB, embedded directly in the LCoS silicon backplane, or placed in other optical trains.
In embodiments, when the camera is placed directly on the optical axis, the camera may be able to see or detect the eye, or directly see or detect the interior of the eye. In embodiments, the system may track eye movement, detect pupil dilation, measure pupil position, diameter, velocity, and direction, and image the iris directly. In embodiments, the camera may determine whether the user is looking around or is controlling the eyepiece. By way of example only, the camera may transmit a signal upon sensing, by tracking eye movement, an eye-movement pattern corresponding to a predetermined control command the user may perform with his eyes. As an example, the camera may recognize, from the pattern of the user's eye movements, that the user's eyes are reading something in the user interface. In such cases, the camera initiates detection of a certain set of eye commands and sends them to the eyepiece to perform a certain function, such as opening an email. In embodiments, the camera may detect that the user is focusing on an object in a predetermined manner to control the eyepiece, such as by focusing on something for an extended period of time, focusing on something, moving the eyes away quickly and then refocusing on it, and the like. As the camera detects such a movement pattern, it may transmit a signal to the eyepiece to perform a certain function. By way of example only, a focus, a look away, and a refocus may cause the camera to signal the eyepiece to perform a "double-click" on the item in the display the user intends. Of course, any such pattern and/or algorithm may be used to control a device via the user's eyes. In embodiments, the camera may detect a certain movement pattern, and when that movement is detected while a particular application is in use, the camera may send a specific signal to the eyepiece based on the combination. As an example, if an email program is open and the user's eyes exhibit a pattern consistent with reading, the camera may signal the eyepiece to open the specific email on which the user's eyes are focused. In embodiments, commands for controlling the eyepiece may be initiated based on the camera's detections.
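The focus/look-away/refocus "double-click" pattern described above can be sketched as a small classifier over a stream of gaze targets. The target names, frame representation, and dwell threshold below are illustrative assumptions, not part of this disclosure.

```python
# Illustrative sketch of interpreting a gaze sequence as a "double-click":
# a dwell on a target, a quick look away, and a refocus on the same target.

def detect_double_click(gaze_samples, min_dwell=3):
    """gaze_samples: list of per-frame target ids (or None for no target).
    Returns the target id if focus -> away -> refocus is seen, else None."""
    runs = []                           # collapse samples into [target, length] runs
    for t in gaze_samples:
        if runs and runs[-1][0] == t:
            runs[-1][1] += 1
        else:
            runs.append([t, 1])
    for i in range(len(runs) - 2):
        a, away, b = runs[i], runs[i + 1], runs[i + 2]
        if (a[0] is not None and a[0] == b[0] and a[0] != away[0]
                and a[1] >= min_dwell and b[1] >= min_dwell):
            return a[0]                 # signal the eyepiece to "double-click" it
    return None

# Focus on an email icon, glance away, refocus -> treated as a double-click.
samples = ["email_icon"] * 4 + [None] * 2 + ["email_icon"] * 3
assert detect_double_click(samples) == "email_icon"
```

A steady gaze with no look-away produces no command, which is one simple way to keep ordinary reading from triggering selections.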
In embodiments, the camera's detection of pupil position, diameter, velocity, and direction, direct imaging of the retina and/or iris, and the like, may serve as a security measure. As an example, when the user puts on the eyepiece, the camera may perform a retina scan and identify the user by checking the scan against a database stored on the eyepiece or remotely. In embodiments, if the user is identified as the owner of the glasses or an authorized user, the glasses may open applications and provide the user access. If the glasses do not identify the user, they may lock or block all or some functions. In embodiments, the user may not need a password for this, as the eyepiece may perform the function automatically. In embodiments, when the user is not recognized, the camera may capture identifying information about the wearer in the event the wearer has stolen the eyepiece.
In embodiments, the eyepiece may perform a diagnosis of the user based on detection of eye movement, detection of pupil position, diameter, velocity, and direction, direct imaging of the retina and/or iris, and the like. For example, a diagnosis may be based on pupil dilation. For instance, if the user's pupils dilate in a manner consistent with lying, the camera and/or eyepiece may detect that the user is lying. Further, if the user has sustained a concussion, the pupils may change size even though a given amount of light is entering the eye. The eyepiece may alert the user as to whether he has sustained a concussion. In embodiments, eyepieces may be given to soldiers, athletes, and the like as they exit physical activity, and the eyepiece may be used, for example, to diagnose the user with a concussion. The eyepiece may have a user database, on-board or separate from the eyepiece, that stores information associated with each user. In one embodiment, when an athlete leaves the field for the sideline, he may put on the glasses, which perform a retina scan to identify the user against the database, then detect the user's pupil size and compare it with the pupil size expected under the given lighting conditions to diagnose or screen the user. If the user's data falls outside the expected range, the glasses may tell the user that his pupils are consistent with having sustained a concussion. Similar uses may be employed, such as detecting possible drug intoxication, detecting retinal damage, detecting eye conditions, and so on.
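The sideline concussion screen described above reduces to comparing a measured pupil diameter against the range expected for the ambient light level. The illumination-to-diameter table below is purely illustrative and not clinical data from this disclosure.

```python
# Hedged sketch of a pupil-size screen: compare a measured pupil diameter
# against the range expected under the current illumination. The table of
# expected ranges is an assumption for demonstration, not medical guidance.

EXPECTED_PUPIL_MM = {           # illumination level -> (min, max) expected diameter
    "bright": (2.0, 4.0),
    "indoor": (3.0, 5.0),
    "dim": (4.0, 8.0),
}

def pupil_check(measured_mm, illumination):
    low, high = EXPECTED_PUPIL_MM[illumination]
    if low <= measured_mm <= high:
        return "within expected range"
    return "outside expected range - consistent with possible concussion"

assert pupil_check(3.1, "bright") == "within expected range"
assert "concussion" in pupil_check(6.5, "bright")
```

In the flow described above, the retina scan would first select the per-user record (and any per-user baseline) from the database before this comparison runs.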
In embodiments, organic light-emitting diodes (OLEDs) may be used in the micro-display and/or sensor applications herein, and may be used with Fraunhofer systems such as the OLED-Cam (OLED camera), or otherwise used for detecting eye movement or for illuminating the user's eyes in conjunction with the eyepiece. In embodiments, the device for detecting eye movement may be placed on the user's optical axis along the optical train. In embodiments, micro-scale optical emitters and receivers may be integrated on the same chip. Via an array-type structure, they may be implemented as bidirectional or unidirectional micro-displays. In embodiments, the device may present and/or capture images simultaneously. The micro-display may serve as the basis for a personal information system, presenting information to the user and recognizing interactions the user makes. With an eyepiece equipped with a bidirectional display, the user can perceive the environment as usual while additional information is presented. The visual information can adapt to the operating context of the system, and the user can interact through eye movements or actions. In embodiments, a CMOS chip may include a micro-display and a camera on one substrate, the central element of the substrate being a nested active matrix composed of OLED pixels and photodiodes. In embodiments, pixel cells may consist of red-green-blue-white and red-green-blue-photodiode pixel units, and the like.
In embodiments, a system may comprise an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source adapted to introduce the displayed content to the optical assembly, and a camera disposed in the optical assembly along the optical axis such that the camera can view at least a portion of the user's eye. In embodiments, the camera may be adapted to capture images of the eye, pupil, retina, eyelid, and/or eyelashes. In embodiments, a command for controlling the eyepiece may be initiated based on at least one image captured by the camera. In embodiments, a diagnosis of the user may be based on at least one image captured by the camera. An identification of the user may likewise be based on at least one image captured by the camera. As an example, the diagnosis may include a diagnosis of concussion. In embodiments of this system, identification of the user may be deployed as a security feature of the eyepiece. In embodiments, the integrated image source may illuminate the eye while the camera performs image capture. Further, light from the image source may be modulated while the camera performs image capture. In embodiments, the camera may comprise one or more organic light-emitting diodes (OLEDs). In embodiments, the user's eyes or other features listed herein, including the iris, pupil, eyelids, eyelashes, and the like, may be illuminated by various lights, LEDs, OLEDs, and so on. In embodiments, illumination of the user's eyes may be used for imaging techniques, capturing eye data, identification, and the like.
In one embodiment, the system may comprise an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source adapted to introduce the displayed content to the optical assembly, and a device for detecting eye movement. In embodiments, the device for detecting eye movement may comprise micro-scale optical emitters and receivers integrated on the same chip. In embodiments, the device may comprise a CMOS chip that includes a micro-display and a camera on one substrate. In embodiments, the device for detecting eye movement may be placed on the user's optical axis along the optical train.
In embodiments, the camera is disposed in the optical assembly along the optical axis such that the camera views at least a portion of the user's eye and can image one or more of the eye, pupil, retina, eyelid, and eyelashes. The integrated processor and camera are adapted to track the user's eye movements; to measure at least one of pupil dilation, pupil position, pupil diameter, pupil velocity, and pupil direction; to distinguish user eye movements intended as control or commands from user eye movements for reading or gazing; to interpret the user's eye movements to the processor as commands for controlling the integrated processor or the interactive head-worn eyepiece; and to interpret the user's eye movements as commands for controlling devices external to the user and external to the head-worn eyepiece. A diagnosis or identification of the user, such as concussion, may be based on at least one image captured by the camera. Identification of the user may be deployed as a security feature of the eyepiece. The system may include a user input interface for controlling or signaling an external device based on the user's eye movements. The camera may be adapted to capture images of the eye, where the images are compared with a database, including other images of eyes, to indicate a diagnosis. The optical axis of the integrated image source and the optical axis of the camera may be different. At least a portion of the optical axis of the integrated image source and the optical axis of the camera may be the same.
In an augmented reality eyepiece, devices such as a camera, micro-scale optical emitters and receivers integrated on the same chip or substrate, or a CMOS chip including a micro-display and a camera may detect the user's eye movements. The integrated image source may be adapted to do at least one of the following: modulate the light from the image source while the camera performs image capture, and illuminate the eye. The camera may comprise one or more organic light-emitting diodes (OLEDs). The device for detecting eye movement may be positioned on the user's optical axis along the optical train, or positioned on an axis different from that of the user's eye. The integrated processor may be adapted to interpret the user's eye movements as commands for operating a device in the interactive head-worn eyepiece or an external device.
A method of detecting a user's eye movements may comprise wearing a head-worn eyepiece, the head-worn eyepiece including an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor and integrated image source adapted to introduce the displayed content to the optical assembly, and a camera; detecting the user's eye movements with the camera and integrated processor; and controlling a device via the eye movements with the integrated processor, wherein the camera detects a movement of at least one of the user's eyes and interprets the movement as a command. The integrated processor may distinguish between eye movements intended as commands and eye movements of gazing. The method may include interpreting a predetermined eye movement as a command to perform a certain function. The method may include scanning at least one of the user's eyes to determine the user's identity. The method may include scanning at least one of the user's eyes to diagnose the user's health. The camera may comprise at least one organic light-emitting diode (OLED). Particular eye movements may be interpreted as particular commands. The eye movements may be selected from the group consisting of: a blink, multiple blinks, a blink count, a blink rate, an eye open-close (slow blink), gaze tracking, eye movement to one side, up-and-down eye movement, side-to-side eye movement, eye movement through a sequence of positions, eye movement to a specific position, dwell time at a position, gazing toward a fixed object, and gazing through a specific portion of a lens of the head-worn eyepiece. The method may include controlling the device via the eye movements through a user input interface. The method may include displaying to the user a view of the surrounding environment captured with the camera or a second camera.
In embodiments, the eyepiece may utilize aspects of subconscious control, such as presenting images of the wearer's surroundings, presenting images of a scene at a rate below the viewer's conscious perception for subconscious perception, and so on. For example, the eyepiece may present images to the wearer at a rate the wearer cannot consciously perceive, but such that the wearer's subconscious registers the presented content — such as reminders, alerts (e.g., an alert asking the wearer to raise their level of attention to something, but not so strongly that a fully conscious reminder is required), cues related to the wearer's immediate environment (e.g., the eyepiece detects something in the wearer's field of view that the wearer may find interesting, and the cue draws the wearer's interest to that thing), and the like. In another example, the eyepiece may provide an indicator to the wearer through a brain-activity monitoring interface, where neural electrical signals fire before an individual consciously recognizes an image. For example, the brain-activity monitoring interface may include electroencephalogram (EEG) sensors or the like to monitor brain activity as the wearer observes the current environment. When the eyepiece, via the brain-activity monitoring interface, senses that the wearer has begun to "notice" a certain element of the surrounding environment, the eyepiece may provide conscious-level feedback to the wearer to make the wearer more aware of that element. For example, the wearer may subconsciously begin to perceive a familiar face in a crowd (e.g., a friend, a suspect, a celebrity), and the eyepiece may provide a visual or audio indication so that the wearer consciously notices that individual. In another example, the wearer may view a product that draws their attention at some subconscious level, and the eyepiece may provide the wearer a conscious indication, more information about the product, an enhanced view of the product, a link to more information about the product, and so on. In embodiments, the eyepiece's ability to extend the wearer's reality to the subconscious level may allow the eyepiece to provide the wearer an augmented reality beyond the wearer's normal conscious experience of the world around them.
In embodiments, the eyepiece may have multiple operating modes, where control of the eyepiece is governed at least in part by the position, shape, motion, and the like of the hand. To provide this control, the eyepiece may utilize hand-recognition algorithms to detect the shape of the hand/fingers, and then associate those hand configurations (possibly combined with hand motion) with commands. Realistically, because only a limited number of hand configurations and motions may be available to command the eyepiece, these hand configurations may need to be reused depending on the operating mode of the eyepiece. In embodiments, particular hand configurations or motions may be assigned to transition the eyepiece from one mode to the next, thereby allowing hand motions to be reused. For example, and referring to Figure 15F, the user's hand 1504F may be moved into the field of view of a camera on the eyepiece, and depending on the mode, the movement may then be interpreted as different commands, such as a circular motion 1508F, a motion across the field of view 1510F, a back-and-forth motion 1512F, and the like. In a simplified example, suppose there are two operating modes: mode one for panning a view of a projected image, and mode two for zooming the projected image. In this example, the user may want to command a pan to the right with a left-to-right, finger-pointing hand motion. But the user may also want to command the image to zoom larger with a left-to-right, finger-pointing hand motion. To allow this dual use of the same hand motion for two command types, the eyepiece may be configured to interpret the hand motion differently depending on the eyepiece's current mode, with particular hand motions assigned for mode transitions. For example, a clockwise rotation motion may indicate a transition from pan to zoom mode, and a counter-clockwise rotation motion may indicate a transition from zoom to pan mode. This example is intended to be illustrative and not limiting in any way; one skilled in the art will recognize how this general technique may be used to realize a variety of command/mode configurations using the hand and fingers, such as hand-finger configuration-motion, two-hand configuration-motion, and so on.
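The pan/zoom example above is essentially a small state machine: the same hand motion maps to different commands depending on the current mode, and two rotation gestures are reserved for mode changes. The gesture and command names below are illustrative stand-ins for whatever the hand-recognition algorithm emits.

```python
# Sketch of mode-dependent gesture interpretation: a left-to-right swipe means
# "pan right" in pan mode and "zoom in" in zoom mode, while clockwise /
# counter-clockwise rotations switch modes. All names are illustrative.

class GestureInterpreter:
    COMMANDS = {
        "pan":  {"swipe_right": "pan right", "swipe_left": "pan left"},
        "zoom": {"swipe_right": "zoom in",   "swipe_left": "zoom out"},
    }

    def __init__(self):
        self.mode = "pan"

    def handle(self, gesture):
        if gesture == "rotate_cw":          # mode-change gestures are reserved
            self.mode = "zoom"
            return "mode: zoom"
        if gesture == "rotate_ccw":
            self.mode = "pan"
            return "mode: pan"
        return self.COMMANDS[self.mode].get(gesture, "ignored")

g = GestureInterpreter()
assert g.handle("swipe_right") == "pan right"    # pan mode
assert g.handle("rotate_cw") == "mode: zoom"
assert g.handle("swipe_right") == "zoom in"      # same motion, new meaning
```

Reserving the mode-change gestures globally (checked before the per-mode table) is what keeps them unambiguous in every mode.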
In embodiments, a system may comprise an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and wherein the optical assembly comprises a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly; and an integrated camera facility that images gestures, wherein the integrated processor identifies and interprets the gestures as command instructions. The control instructions may provide manipulation of the content for display, a command conveyed to an external device, and the like.
In embodiments, control of the eyepiece may be enabled through eye movement, actions of the eye, and the like. For example, the eyepiece may have a camera that looks back at the wearer's eye, where eye movements or actions may be interpreted as command information, such as through blinking, repeated blinking, a blink count, blink rate, eye open-close, gaze tracking, eye movement to one side, up-and-down eye movements, side-to-side eye movements, eye movement through a sequence of positions, eye movement to a specific position, dwell time at a position, gazing toward a fixed object (e.g., a corner of the eyepiece's lens), gazing through a specific portion of the lens, gazing at a real-world object, and the like. In addition, eye control may enable the viewer to focus on a specified point of the image displayed from the eyepiece, and because the camera may be able to correlate the viewing direction of the eye to a point on the display, the eyepiece may be able to interpret commands via a combination of where the wearer is looking and the wearer's actions (e.g., blinking, touching an interface device, movement of a position-sensing device, etc.). For example, the viewer may be able to look at an object on the display and, via the motion of a finger enabled through a position-sensing device, select that object.
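The gaze-plus-action selection just described can be sketched as a hit test on the gaze point combined with a separate activating event. The object names, bounding boxes, and action names below are assumptions for demonstration.

```python
# Illustrative sketch: combine the gaze position on the display with a
# separate action (a blink or a finger press) to select the object being
# looked at. Geometry and names here are assumed, not from the patent.

OBJECTS = {                      # display-space bounding boxes: (x0, y0, x1, y1)
    "save_button": (10, 10, 60, 30),
    "photo": (100, 40, 220, 160),
}

def object_at(gaze_xy):
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in OBJECTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def select(gaze_xy, action):
    """Select the gazed-at object only when an activating action arrives."""
    target = object_at(gaze_xy)
    if target and action in ("blink", "finger_press"):
        return f"selected {target}"
    return "no selection"

assert select((30, 20), "finger_press") == "selected save_button"
assert select((30, 20), None) == "no selection"
```

Requiring the explicit action avoids the "Midas touch" problem of every glance counting as a selection.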
In certain embodiments, the glasses may be equipped with eye-tracking devices for tracking movement of the user's eye (or preferably both eyes); alternatively, the glasses may be equipped with sensors for six-degree-of-freedom movement tracking, i.e., head-movement tracking. These devices or sensors are available from Chronos Vision GmbH of Berlin, Germany, and from ISCAN of Woburn, Massachusetts, USA. Retinal scanners may also be used to track eye movement. Retinal scanners may likewise be mounted in the augmented reality glasses and are available from a variety of companies, such as Tobii of Stockholm, Sweden, SMI of Teltow, Germany, and the aforementioned ISCAN.
The augmented reality eyepiece also includes a user input interface, as shown, to allow the user to control the device. Inputs for controlling the device may include any of the sensors discussed above, and may also include a trackpad, one or more function keys, and any other suitable local or remote device. For example, an eye-tracking device may be used to control another device, such as a video game or an external tracking device. As an example, Figure 29A depicts a user with an augmented reality eyepiece equipped with an eye-tracking device 2900A, discussed elsewhere in this document. The eye-tracking device allows the eyepiece to track the direction of the user's eye (or preferably both eyes) and send the movements to the eyepiece's controller. The control system comprises the augmented reality eyepiece and a control device for a weapon. The movements may then be transmitted to the control device for the weapon, which controls the weapon and which is within the user's line of sight. The movement of the user's eyes is then converted by suitable software into signals for controlling movement of the weapon, such as quadrant (range) and azimuth (direction). Additional controls may be used in conjunction with eye tracking, such as with the user's trackpad or function keys. The weapon may be a large-caliber weapon such as a howitzer or mortar, or may be a small-caliber weapon such as a machine gun.
The movement of the user's eyes is then converted by suitable software into signals for controlling movement of the weapon, such as the weapon's quadrant (range) and azimuth (direction). Additional controls may be used for single or continuous firing of the weapon, such as with the user's trackpad or function keys. Alternatively, the weapon may be stationary and non-directional, such as an emplaced mine or a shaped-charge weapon, and may be protected by safety devices, such as by requiring specifically coded commands. The user of the augmented reality device may activate the weapon by transmitting the appropriate code and command, without using the eye-tracking features.
In embodiments, control of the eyepiece may be enabled through the wearer's gestures. For example, the eyepiece may have a camera that looks outward (e.g., forward, to the side, downward) and interprets gestures or movements of the wearer's hand as control signals. Hand signals may include passing the hand in front of the camera, changing hand positions or sign language in front of the camera, pointing to a real-world object (such as to activate augmentation of the object), and the like. Hand motions may also be used to manipulate objects displayed on the inside of the translucent lens, such as moving an object, rotating an object, deleting an object, opening-closing a screen or window in the image, and so on. Although hand motions are used in the examples above, any portion of the body, or objects held or worn by the wearer, may also be utilized by the eyepiece for gesture recognition.
In embodiments, head-motion control may be used to send commands to the eyepiece, where motion sensors such as accelerometers, gyroscopes, or any other sensor described herein may be mounted on the wearer's head, on the eyepiece, in a cap, in a helmet, and so on. Referring to Figure 14A, head motions may include quick movements of the head, such as jerking the head forward and/or backward 1412, jerking the head up and/or down 1410, quickly swinging the head from side to side, stopping the head in a certain position, such as moving to one side and holding in place, and the like. Motion sensors may be integrated into the eyepiece, mounted on the user's head or in a head covering (e.g., cap, helmet) connected to the eyepiece by wire or wirelessly, and so on. In embodiments, the user may wear the interactive head-worn eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. At least one of a plurality of head-motion-sensing control devices may be integrated with or associated with the eyepiece, providing control commands to the processor as command instructions based on the sensing of predefined head-motion characteristics. The head-motion characteristic may be a nod of the user's head, such that the nod is an overt motion dissimilar to ordinary head movements. The overt motion may be a jerking motion of the head. The control instructions may provide manipulation of the content for display, a command conveyed to an external device, and the like. Head-motion control may be used in combination with other control mechanisms, such as using another control mechanism as discussed herein to activate a command and using head motion to execute it. For example, the wearer may want to move an object to the right and, through eye control as discussed herein, select the object and activate head-motion control. Then, by tilting their head to the right, the object may be commanded to move to the right, and the command may be terminated through eye control.
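One simple way to separate the overt "jerk" motions described above from ordinary head movement is to threshold the sample-to-sample change in an accelerometer reading. The axis representation, units, and threshold below are assumptions for illustration.

```python
# Sketch of detecting overt head "jerk" motions from accelerometer samples:
# a jerk is a sample-to-sample change well above the background of ordinary
# head movement. The threshold value is an assumption.

def detect_jerks(samples, threshold=5.0):
    """samples: accelerometer readings along one axis, arbitrary units.
    Returns the indices where the reading jumps by more than `threshold`."""
    jerks = []
    for i in range(1, len(samples)):
        if abs(samples[i] - samples[i - 1]) > threshold:
            jerks.append(i)
    return jerks

# Ordinary drift stays below threshold; the spike at index 4 exceeds it on
# both its rising and falling edge, marking an overt nod.
readings = [0.1, 0.3, 0.2, 0.4, 9.0, 0.5]
assert detect_jerks(readings) == [4, 5]
```

A production version would likely filter the signal and require a characteristic rise-fall profile rather than a single threshold crossing, but the thresholding idea is the core of distinguishing a deliberate nod from normal movement.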
In embodiments, the eyepiece may be controlled through audio, such as through a microphone. Audio signals may include speech recognition, voice recognition, sound recognition, sound detection, and the like. Audio may be detected through a microphone on the eyepiece, a throat microphone, a jawbone microphone, a boom microphone, a headphone, an ear bud with microphone, and so on.
In embodiments, command inputs may provide a plurality of control functions, such as turning the eyepiece projector on/off, turning audio on/off, turning the camera on/off, turning augmented reality projection on/off, turning GPS on/off, interaction with the display (e.g., selecting/accepting a function shown, replaying a captured image or video, etc.), interaction with the real world (e.g., capturing an image or video, turning a page of a displayed book, etc.), performing actions with an embedded or external mobile device (e.g., mobile phone, navigation device, music device, VoIP, etc.), browser controls for the Internet (e.g., submit, next result, etc.), email controls (e.g., read email, display text, text-to-speech, compose, select, etc.), GPS and navigation controls (e.g., save position, recall a saved position, show directions, view position on a map), and the like. In embodiments, the eyepiece or its components may be turned on and/or off automatically based on sensor indications, such as from an IR sensor, accelerometer, force sensor, micro-switch, capacitive sensor, an eye-tracking detection facility, and so on. For example, the eyepiece may turn off automatically when the user takes the eyepiece off their head, through a capacitive sensor sensing that the eyepiece no longer has physical contact with the user's skin (such as at the nose-bridge location on the user's nose). One skilled in the art will appreciate other similar configurations for sensing when the eyepiece has been taken off. In embodiments, the eyepiece may sense when detachable components are attached to or removed from the eyepiece, and may utilize this sensing to turn aspects of the eyepiece on/off. For example, a portion of the optics may be detachable, and when that portion of the optics is removed, power to that half of the eyepiece system is shut off, conserving battery power. The disclosure may comprise a power management facility, wherein the power management facility controls the power provided to selected eyepiece components in response to sensors. The eyepiece may be mounted in a frame with a nose bridge and foldable temples, where a hinge of the frame attaches to a foldable temple, and where sensors may be mounted in the nose bridge of the frame, in a temple, in a hinge, and so on. The selected component may be an image source, a processor, and the like. When the user is not wearing the eyepiece, the power management facility may be in a sleep mode, where the sleep mode may include periodically reading the sensors, and where, when the power management facility detects that the user is wearing the eyepiece, it transitions to a wake mode and powers the eyepiece. The power management facility may reduce power to components based on eyepiece function usage, power remaining in the integrated battery, network availability, rate of power consumption, and the like. The power reduction may be based on user preferences. The user may override the power reduction with a command. When power is being reduced, an indication may be provided to the user through the eyepiece's user interface. If the brightness level of the image source is reduced due to reduced power to the image source, the electrochromic density in the optical assembly may be increased.
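The sleep/wake behavior of the power management facility described above can be sketched as a tiny state machine driven by the nose-bridge capacitive sensor. The sensor and component names are illustrative assumptions, and a real facility would also weigh battery level, usage, and user preferences as described.

```python
# Hedged sketch of the power-management facility: a capacitive nose-bridge
# sensor decides worn/not-worn; in sleep mode the sensors are polled
# periodically, and power to selected components follows the worn state.

class PowerManager:
    def __init__(self):
        self.mode = "sleep"
        self.powered = {"image_source": False, "processor": False}

    def poll(self, nose_contact):
        """Periodic sensor read; switch modes on a worn/removed transition."""
        if nose_contact and self.mode == "sleep":
            self.mode = "awake"
            self.powered = {k: True for k in self.powered}
        elif not nose_contact and self.mode == "awake":
            self.mode = "sleep"                       # taken off: power down
            self.powered = {k: False for k in self.powered}
        return self.mode

pm = PowerManager()
assert pm.poll(nose_contact=True) == "awake"      # user puts eyepiece on
assert pm.powered["image_source"] is True
assert pm.poll(nose_contact=False) == "sleep"     # user takes it off
```

Keeping the periodic poll cheap in sleep mode (one sensor read, no display power) is what makes the always-available wake-on-wear behavior affordable on battery.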
In embodiments, the eyepiece may provide 3D display imaging to the user, such as through conveyed stereoscopic, auto-stereoscopic, computer-generated hologram, volumetric display image, stereogram/stereoscope, view-sequential display, electro-holographic display, parallax "two-view" display and parallax panoramagram, and re-imaging systems, and the like, thereby creating a perception of 3D depth for the viewer. Displaying 3D images to the user may employ different images presented to the user's left and right eyes, such as with some optical component that differentiates the image in the left and right optical paths, or with a projector facility projecting different images to the user's left and right eyes, and so on. The optical path, including the optical path from the projector facility to the user's eye, may include a graphical display device that forms a visual representation of an object in three physical dimensions. A processor, such as the integrated processor in the eyepiece or a processor in an external facility, may provide 3D image processing as at least one step in generating the 3D image for the user.
In embodiments, holographic projection technologies may be used to present a 3D imaging effect to the user, such as computer-generated holography (CGH), a method of digitally generating holographic interference patterns. For example, a holographic image may be projected by a holographic 3D display, such as a display that operates on the basis of coherent-light interference. Computer-generated holograms have the advantage that the objects one wants to show need not possess any physical reality at all; that is, they may be generated entirely as "synthetic holograms". There are a plurality of different methods for calculating the interference pattern for a CGH, including from the fields of holographic information and computational reduction, as well as in computational and quantization techniques. For instance, the Fourier-transform method and point-source holograms are two examples of computational techniques. The Fourier-transform method may be used to simulate the propagation of each plane of depth of the object to the hologram plane, where the reconstruction of the image may occur in the far field. In an example process, there may be two steps, where first the light field in the far observer plane is calculated, and then the field is Fourier-transformed back to the lens plane, where the wavefront to be reconstructed by the hologram is the superposition of the Fourier transforms of each object plane in depth. In another example, a target image may be multiplied by a phase pattern to which an inverse Fourier transform has been applied. Intermediate holograms may then be generated by shifting this image product, and combined to create a final set. The final set of holograms may then be approximated to form kinoforms for sequential display to the user, where a kinoform is a phase hologram in which the phase modulation of the object wavefront is recorded as a surface-relief profile. In the point-source hologram method, the object is broken down into self-luminous points, where an elementary hologram is calculated for each point source and the final hologram is synthesized by superimposing all the elementary holograms.
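The Fourier-transform method above can be sketched numerically: because the far-field reconstruction of a hologram is approximately the Fourier transform of the hologram plane, a phase-only hologram (kinoform) for a target image can be estimated by transforming the target back to the hologram plane and keeping only the phase. This is a simplified single-plane illustration, not the disclosure's implementation.

```python
import numpy as np

# Illustrative sketch of the Fourier-transform CGH method: estimate a
# phase-only hologram ("kinoform") whose far-field intensity approximates
# a target image. Single depth plane only; assumed, simplified model.

def kinoform(target):
    """Phase-only hologram (radians) whose far field approximates `target`."""
    rng = np.random.default_rng(0)
    # Random initial phase spreads energy away from the DC term.
    field = np.sqrt(target) * np.exp(1j * 2 * np.pi * rng.random(target.shape))
    holo = np.fft.ifft2(np.fft.ifftshift(field))   # propagate back to hologram plane
    return np.angle(holo)                          # record only the phase relief

def reconstruct(phase):
    """Simulate the far-field intensity produced by the phase hologram."""
    far = np.fft.fftshift(np.fft.fft2(np.exp(1j * phase)))
    return np.abs(far) ** 2

target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0          # a bright square as the object
img = reconstruct(kinoform(target))
```

Discarding the amplitude makes the reconstruction approximate, which is why iterative refinements (and the shifting/superposition described above) are used in practice.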
In one embodiment, 3D or holographic imaging may be enabled by a dual-projector system, where two projectors are stacked on top of one another for a 3D image output. Holographic projection mode may be entered by a control mechanism described herein, or by the capture of an image or signal, such as an outstretched hand with the palm up, an SKU, an RFID read, and the like. For example, a wearer of the eyepiece may view a letter "X" on a piece of cardboard, which causes the eyepiece to enter holographic mode and turn on the second, stacked projector. Selecting which hologram to display may be done with a control technique. The projector may project the hologram onto the letter "X" on the cardboard. Associated software may track the position of the letter "X" and move the projected image as the letter "X" moves. In another example, the eyepiece may scan an SKU, such as the SKU on a toy construction kit, and a 3D image of the completed toy may be accessed from an online source or a non-volatile memory. Interaction with the hologram, such as rotating it, zooming in/out, and the like, may be done using the control mechanisms described herein. Scanning may be enabled by associated bar code/SKU scanning software. In another example, a keyboard may be projected in space or onto a surface. The holographic keyboard may be used in or to control any of the associated applications/functions.
In embodiments, the eyepiece facility may be used to lock the position of a virtual keyboard down relative to a real environmental object (e.g. a table, a wall, a vehicle dashboard, and the like), where the virtual keyboard then does not move as the wearer moves their head. In an example, and referring to Figure 24, the user may be sitting at a table and wearing the eyepiece 2402, and may wish to input text into an application, such as a word-processing application, a web browser, a communications application, and the like. The user may be able to generate a virtual keyboard 2408, or other interactive control element (e.g. a virtual mouse, calculator, touch screen, and the like), to use for input. The user may provide a command for generating the virtual keyboard 2408, and use a gesture 2404 to indicate the fixed location of the virtual keyboard 2408. The virtual keyboard 2408 may then remain fixed in space relative to the outside environment, such as fixed to a location on the table 2410, where the eyepiece facility keeps the location of the virtual keyboard 2408 on the table 2410 even when the user turns their head. That is, the eyepiece 2402 may compensate for the user's head motion in order to keep the user's view of the virtual keyboard 2408 located on the table 2410. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. An integrated camera facility may be provided that images the surrounding environment and identifies a user gesture as an interactive-control-element location command (such as a hand-finger configuration moved in a certain way, positioned in a certain way, and the like). In response to the interactive-control-element location command, the location of the interactive control element may then remain fixed in position relative to an object in the surrounding environment, regardless of changes in the user's viewing direction. In this way, the user may be able to utilize a virtual keyboard in much the same way as a physical keyboard, where the virtual keyboard remains in the same location. However, in the case of the virtual keyboard there are no "physical limitations", such as gravity, restricting where the user may locate the keyboard. For instance, the user could be standing next to a wall and set the keyboard location on the wall, and the like. One skilled in the art will appreciate that the "virtual keyboard" technique may be applied to any controller, such as a virtual mouse, virtual touch pad, virtual game interface, virtual phone, virtual calculator, virtual paintbrush, virtual drawing tablet, and the like. For example, a virtual touch pad may be visualized to the user through the eyepiece, positioned by the user through gestures, and used in place of a physical touch pad.
In embodiments, the eyepiece facility may use visualization techniques, such as applying deformations similar to parallax, keystone distortion, and the like, to present the projection of an object (e.g. a virtual keyboard, keypad, calculator, notebook, joystick, control panel, book, and the like) as lying on a surface. For example, the appearance of a keyboard projected with the proper perspective onto the tabletop in front of the user may be aided by applying a keystone-distortion effect, where the projection provided to the user through the eyepiece is deformed so that it appears to be lying on the surface of the table. Further, these techniques may be applied dynamically, so that the proper perspective is maintained even as the user moves around the surface.
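A keystone-style deformation like the one described above can be modeled as a projective (homography) warp of the rendered keyboard. The four corner correspondences below are assumed example values; a real system would derive them from head pose and surface geometry.

```python
import numpy as np

# Hedged sketch: making a rendered keyboard appear to "lie on the desk" by
# applying a projective (keystone) warp. Corner correspondences are assumed
# illustrative values, not derived from this disclosure.

def homography(src, dst):
    """Solve the 3x3 projective transform mapping 4 src points to 4 dst points
    (direct linear transform via SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, x, y):
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Rectangle -> trapezoid: the far edge of the keyboard is drawn narrower,
# which the eye reads as the keyboard receding along the desk surface.
src = [(0, 0), (100, 0), (100, 50), (0, 50)]
dst = [(20, 0), (80, 0), (100, 50), (0, 50)]
H = homography(src, dst)
```

Re-solving the homography each frame as the head moves is what would keep the perspective correct dynamically.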
In embodiments, the eyepiece facility may provide gesture recognition, which may be used to provide a keyboard and mouse experience with the eyepiece. For example, with an image of a keyboard, a mouse, and the user's fingers overlaying the lower portion of the display, the system may be able to track finger positions in real time and enable a virtual desktop. With gesture recognition, tracking may be accomplished without wired and externally powered equipment. In another example, gesture recognition through the eyepiece may track fingertip positions without wires or an external power source, such as with gloves that have a passive RFID chip in each fingertip. In this case, each RFID chip may have its own response characteristics, so that multiple fingers may be read simultaneously. The RFID chips may be paired with the glasses so that they can be distinguished from other working RFIDs in the vicinity. The glasses may provide the signal to activate the RFID chips, and may have two or more receive antennas. Each receive antenna may be connected to a phase-measurement circuit element, which in turn provides input to a position-determination algorithm. The position-determination algorithm may also provide velocity and acceleration information, and may ultimately provide keyboard and mouse information to the eyepiece operating system. In embodiments, with two receive antennas, the azimuthal position of each fingertip may be determined from the phase difference between the receive antennas. The relative phase differences between the RFID chips may then be used to determine the radial positions of the fingertips.
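The two-antenna azimuth step above can be sketched with the standard phase-interferometry relation: the path-length difference between the antennas is `SPACING * sin(theta)`, which appears as a phase difference of `2*pi` per wavelength. The wavelength and antenna spacing below are assumed values for illustration.

```python
import math

# Sketch of the two-receive-antenna bearing estimate described above.
# Wavelength (~900 MHz UHF RFID) and antenna spacing are assumed values.

WAVELENGTH = 0.33   # metres (assumed)
SPACING = 0.15      # metres between the two receive antennas (assumed)

def bearing_from_phase(delta_phi):
    """Bearing in radians from broadside, given the inter-antenna phase
    difference: path difference = SPACING*sin(theta) = delta_phi/(2*pi)*lambda."""
    s = delta_phi * WAVELENGTH / (2 * math.pi * SPACING)
    return math.asin(max(-1.0, min(1.0, s)))

# A fingertip chip directly in front (equal path lengths) shows zero
# phase difference, hence zero bearing.
assert bearing_from_phase(0.0) == 0.0
```

Differentiating successive bearing/range estimates over time would yield the velocity and acceleration information mentioned above.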
In embodiments, the eyepiece facility may use visualization techniques to present the projection of a previously taken medical scan (e.g. an X-ray, ultrasound, MRI, PET scan, and the like) onto the wearer's body. For example, and referring to Figure 24A, the eyepiece may have access to an X-ray image taken of the wearer's hand. The eyepiece may then view the wearer's hand 2402A with its integrated camera and superimpose the projected image 2404A of the X-ray onto the hand. Further, the eyepiece may be able to keep the image and the viewed hand registered relative to one another as the wearer moves their hand. In embodiments, this technique may also be implemented while the wearer is looking into a mirror, where the eyepiece superimposes an image onto the reflected image. This technique may be used as part of a diagnostic procedure, for rehabilitation during physical therapy, as encouragement to exercise and diet, to explain a diagnosis or condition to a patient, and the like. The image may be an image of the wearer, a generic image from a database of medical-condition imagery, and the like. A generic overlay may show what is typical for a body with a certain type of internal condition, how the body would look after following a particular regimen for a period of time, and the like. In embodiments, external control devices, such as a pointer controller, may allow manipulation of the image. Further, the overlay of images may be synchronized between multiple people, each wearing an eyepiece as described herein. For example, a patient and a doctor may both have an image projected onto the patient's hand, where the doctor may explain a bodily condition while the patient views the synchronized image of the projected scan along with the doctor's explanation.
In embodiments, the eyepiece facility may be used to remove portions of a projected virtual keyboard where an intervening obstruction appears (e.g. the user's hand is in the way, where it is not desired to project the keyboard onto the user's hand). In an example, and referring to Figure 30, the eyepiece 3002 may provide a projected virtual keyboard 3008 to the wearer, such as onto a tabletop. The wearer may then reach "over" the virtual keyboard 3008 to type. Since the keyboard is only a projected virtual keyboard rather than a physical one, without some compensation of the projected image the virtual keyboard would be projected "onto" the back of the user's hand. However, as in this example, the eyepiece may compensate the projected image so that the portion of the wearer's hand 3004 that is obstructing the intended projection of the virtual keyboard onto the table is removed from the projection. That is, it may not be desirable for portions of the keyboard projection 3008 to be visualized onto the user's hand, so the eyepiece subtracts the portion of the virtual-keyboard projection that is co-located with the wearer's hand 3004. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed content may include an interactive control element (e.g. a virtual keyboard, virtual mouse, calculator, touch screen, and the like). An integrated camera facility may image a user body part as it interacts with the interactive control element, where the processor, based on the user's view, removes a portion of the interactive control element by subtracting the portion determined to be co-located with the imaged user body part. In embodiments, this projected-image-removal technique may be applied to other projected images and obstructions, and is not meant to be limited to this example of a hand over a virtual keyboard.
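The subtraction step above reduces, in the simplest case, to masking: wherever the camera's hand-segmentation mask overlaps the keyboard projection, the projected pixels are blanked. The mask here is synthetic; in practice it would come from the integrated camera facility.

```python
import numpy as np

# Minimal sketch of the occlusion handling described above: blank the
# projected content wherever the (assumed) hand-segmentation mask is set,
# so the keyboard never appears "on" the hand.

def subtract_occlusion(projection, hand_mask):
    """Zero out projected content wherever the hand mask is True."""
    out = projection.copy()
    out[hand_mask] = 0
    return out

keyboard = np.full((4, 8), 255, dtype=np.uint8)   # toy projected keyboard
hand = np.zeros((4, 8), dtype=bool)
hand[1:3, 2:5] = True                             # hand covers some keys
visible = subtract_occlusion(keyboard, hand)
```

Updating the mask per frame is what keeps the projection correct as the hand moves over the keyboard.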
In embodiments, the eyepiece facility may provide intervening-obstruction handling for any virtual content displayed over "real-world" content. If a frame of reference is determined to be placed at a certain distance in, any object passing between the virtual image and the viewer may be removed from the displayed content, so as not to break the intended user perception that the displayed information resides at the specified distance. In embodiments, the perception of the distance to the viewed content may also be increased through various adjustable-focus techniques.
In embodiments, the eyepiece facility may be used to determine an intended text input from a sequence of character contacts swiped across a virtual keyboard, such as with a finger, a stylus, the whole hand, and the like. For example, and referring to Figure 37, the eyepiece may be projecting a virtual keyboard 3700, where the user wishes to input the word "wind". Normally, the user would discretely press the key positions corresponding to "w", then "i", then "n", and finally "d", and a facility associated with the eyepiece (a camera, accelerometer, and the like, as described herein) would interpret each position as the letter corresponding to that position. However, the system may also be able to monitor the movement, or swipe, of the user's finger or other pointing device across the virtual keyboard and determine the best-fit match for the pointer movement. In the figure, the pointer has started at the character "w" and swept a path 3704 through the characters e, r, t, y, u, i, k, n, b, v, f, and d, stopping at d. The eyepiece may observe this sequence, such as determining the sequence through an input-path analyzer, feed the sensed sequence to a word-match search facility, and output a best-fit word, in this case "wind" as the text 3708. In embodiments, the eyepiece may also determine the word more directly while monitoring the motion of the pointing device across the keyboard, such as through whole-word matching, pattern recognition, object recognition, and the like, where some "delimiter" indicates the space between words, such as a pause in the motion of the pointing device, a tap of the pointing device, a circling motion of the pointing device, and the like. For example, a whole swipe path may be used together with pattern- or object-recognition algorithms, in a mode where the user's finger moving through each character in turn forms the word and the pauses between movements act as the delimiters between words. The eyepiece may provide a best-fit word, a list of best-fit words, and the like. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed image may include an interactive keyboard control element (e.g. a virtual keyboard, calculator, touch screen, and the like), where the keyboard control element is associated with an input-path analyzer, a word-match search facility, and a keyboard input interface. The user may input text by sliding a pointing device (e.g. a finger, a stylus, and the like) across the character keys of the keyboard input interface in the approximate sequence of a word the user wants to input as text, where the input-path analyzer determines the characters contacted in the input path, and the word-match facility finds the best word match for the sequence of characters contacted and inputs that best word match as the input text. In embodiments, the referenced display content could be something other than a keyboard, such as a sketch pad for handwritten text, a joystick-pad or other interface for controlling a game or a real robot or aircraft, and the like. Another example could be a virtual drum kit, such as with colored pads that the user "taps" to make sounds. The eyepiece's ability to interpret patterns of motion across a surface may allow projected reference content to give the user something to point at, and to provide visual and/or audio feedback to the user. In embodiments, the "motion" that the eyepiece detects may be the motion of the user's eyes as the user looks at a surface. For instance, the eyepiece may have a facility for tracking the motion of the user's eyes, and by having both the displayed position of the projected virtual keyboard and the gaze direction of the user's eyes, the eyepiece may be able to detect the line of sight of the user's eyes moving across the keyboard, and then interpret the motions as words as described herein.
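The input-path-analyzer/word-match pairing above can be sketched as follows: given the sequence of keys a swipe passed through, score dictionary words by whether their letters occur in order within that key sequence, anchored at the first and last keys touched. The tiny dictionary is an assumption for illustration.

```python
# Hedged sketch of the word-match search facility described above. The swipe
# path is taken from the "wind" example in the text; the dictionary is assumed.

def is_subsequence(word, path):
    """True if the letters of `word` appear in order within `path`."""
    it = iter(path)
    return all(ch in it for ch in word)

def best_match(path, dictionary):
    """Pick the word whose letters fit the swipe path, anchored at the
    first and last keys touched; prefer longer candidates."""
    candidates = [w for w in dictionary
                  if w and w[0] == path[0] and w[-1] == path[-1]
                  and is_subsequence(w, path)]
    return max(candidates, key=len) if candidates else None

# Path from the example: starts at w, sweeps e r t y u i k n b v f, stops at d.
path = "wertyuiknbvfd"
assert best_match(path, ["wind", "word", "wand", "we"]) == "wind"
```

Real swipe keyboards score many more features (path shape, key geometry, language models), but the anchor-and-subsequence filter captures the "best-fit word" idea in the text.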
In embodiments, the eyepiece may provide the ability to command the eyepiece through "air writing" gestures, such as the wearer using their finger to swipe out letters, words, and the like in the air within the field of view of the embedded eyepiece camera, where the eyepiece interprets the finger motions as letters, words, symbols, and the like for commanding, signing, writing, emailing, texting, and so on. For example, the wearer may use this technique to "air sign" a document. The wearer may compose text through this technique, such as in an email, a text message, a document, and the like. The wearer's eyepiece may recognize symbols made through hand motions as control commands. In embodiments, as described herein, air writing may be implemented through gesture recognition interpreted from images captured by the eyepiece camera, or through other input control devices (e.g. an inertial measurement unit (IMU) mounted in a device on the user's finger, hand, and the like).
In embodiments, the eyepiece facility may be used to present displayed content corresponding to an identified marker that indicates the intention to display that content. That is, the eyepiece may be commanded to display certain content based on sensing a predetermined external visual cue. The visual cue may be an image, an icon, a photograph, face recognition, a hand configuration, a body configuration, and the like. The displayed content may be an interface device brought up for use, navigation aids to help the user find a location once they arrive at some travel destination, an advertisement when the eyepiece views a target image, an information-rich profile, and the like. In embodiments, visual-marker cues and their associated content for display may be stored in memory on the eyepiece, stored in an external computer storage facility and imported as needed (such as by geographic location, proximity to a trigger target, a command by the user, and the like), generated by a third party, and the like. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. An integrated camera facility may be provided that images an external visual cue, where the integrated processor identifies and interprets the external visual cue as a command to display content associated with the visual cue. Referring to Figure 38, in embodiments, the visual cue 3812 may be included in a sign 3814 in the surrounding environment, where the projected content is associated with an advertisement. The sign may be a billboard, and the advertisement may be a personalized advertisement based on a preferences profile of the user. The visual cue 3802, 3808 may be a hand gesture, and the projected content may be a projected virtual keyboard 3804, 3810. For example, the hand gesture may be a thumb-and-index-finger gesture 3802 from a first user hand, with the virtual keyboard 3804 projected on the palm of the first user hand, where the user may type on the virtual keyboard with a second user hand. The hand gesture 3808 may be a combination of thumb-and-index-finger gestures from both user hands, with the virtual keyboard 3810 projected between the user's hands as configured in the gesture, where the user may type on the virtual keyboard using the thumbs of both hands. Visual cues may provide the wearer of the eyepiece with an automated resource that associates a predetermined external visual cue with a desired outcome in the form of projected content, thus freeing the wearer from having to search for the cues themselves.
In embodiments, the eyepiece may include a visual-recognition language-translation facility for providing a translation of visually presented content, such as for road signs, menus, billboards, shop signs, books, magazines, and the like. The visual-recognition language-translation facility may utilize optical character recognition to identify letters from the content, and match strings of letters to words and phrases through a translation database. This capability may be included entirely within the eyepiece, such as in an offline mode, or at least in part in an external computing facility, such as on an external server. For example, a user may be in a foreign country where the wearer of the eyepiece does not understand the signs, menus, and the like, but the eyepiece can provide translations for them. These translations may be shown to the user as annotations, may replace the foreign-language word with the translation (such as on a sign), may be provided to the user as audio translations, and the like. In this way, the wearer need not make the effort of looking up word translations, but may be provided with them automatically. In an example, the user of the eyepiece may be Italian and visiting the United States, where they need to interpret a great number of road signs in order to drive safely. Referring to Figure 38A, the Italian user of the eyepiece is viewing a US "STOP" sign 3802A. In this instance, the eyepiece may identify the letters on the sign, translate the word "stop" to the Italian "arresto", and make the stop sign 3804A appear to read "arresto" rather than "stop". In embodiments, the eyepiece may also provide the wearer with simple interpreted messages, provide audio translations, provide a translation dictionary to the wearer, and the like. The disclosure may comprise an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source adapted to introduce the content to the optical assembly; an integrated camera for imaging text viewed in the surrounding environment; and an optical-character-recognition facility for correlating one or more characters from the viewed text with one or more characters of a first language, and for correlating the one or more characters of the first language with one or more characters of a second language, where the one or more characters of the second language are rendered as displayed content through the integrated image source, and where the displayed content is locked in position relative to the one or more characters of the viewed text. The presentation of the one or more characters of the second language may appear as an annotation to the user, placed as displayed content relative to the originally viewed text. The presentation of the one or more characters of the second language may be superimposed over the viewing location of the originally viewed text, such as with the one or more characters of the second language superimposed over the originally viewed text in a presentation that matches the font characteristics of the originally viewed text. The viewed text may be located on a sign, a printed document, a book, a road sign, a billboard, a menu, and the like. The optical-character-recognition facility may be incorporated in the eyepiece, provided external to the eyepiece, or provided as a combination of internal and external facilities. The one or more characters may be a word, a phrase, an alphanumeric string, and the like. The one or more characters of the second language may be stored in an external facility and tagged for use when a second eyepiece views the same text, such as tagged with a geographic-location indication, an object identifier, and the like. In addition, when the view of the text moves outside the view of the eyepiece, the presentation of the one or more characters of the second language may be stored such that when the text moves back within the view of the eyepiece, it is recalled for presentation.
In one example, the eyepiece may be used in an adaptive display, such as for blind users. In embodiments, the results of facial recognition or object identification may be processed into audible results and presented to the wearer of the glasses as audio through associated earbuds/headphones. In other embodiments, the results of facial recognition or object identification may be converted into tactile vibrations of the glasses or of an associated controller. In an example, if a person stands in front of the user of the adaptive glasses, the camera may image the person and send the image to the integrated processor for processing by facial-recognition software, or send it to facial-recognition software operating on a server or in the cloud. For some individuals, the results of facial recognition may be presented as written text in the display of the glasses, but for a blind or visually impaired user, the results may be processed into audio. In other examples, object identification may determine that the user is approaching a curb, a doorway, or another object, and the glasses or controller may audibly or tactilely alert the user. For a visually impaired user, the text on the display may be magnified, or the contrast may be enhanced.
In embodiments, a GPS sensor may be used to determine the location of the user wearing the adaptive display. A navigation application may access the GPS sensor and audibly notify the user as the user approaches or arrives at various points of interest. In embodiments, the navigation application audibly directs the user to a destination.
The eyepiece may be useful for various applications and markets. It should be understood that the control mechanisms described herein may be used to control the functions of the applications described herein. The eyepiece may run a single application at a time, or may run multiple applications at once. Switching between applications may be done with the control mechanisms described herein. The eyepiece may be used in military applications, gaming, image-recognition applications, viewing/ordering e-books, GPS navigation (position, direction, speed, and estimated time of arrival (ETA)), mobile TV, athletics (viewing pacing, ranking, and competition times; receiving coaching), telemedicine, industrial inspection, aviation, shopping, inventory-management tracking, firefighting (enabled by VIS/NIR/SWIR sensors that see through smoke, haze, and darkness), outdoor/adventure, custom advertising, and the like. In an embodiment, the eyepiece may be used with e-mail such as GMAIL in Figure 7, the Internet, web browsing, viewing sports scores, video chat, and the like. In an embodiment, the eyepiece may be used for educational/training purposes, such as by displaying step-by-step guides (e.g. hands-free, wireless maintenance and repair instructions). For example, a video manual and/or instructions may be displayed in the field of view. In an embodiment, the eyepiece may be used in fashion, health, and beauty. For example, potential outfits, hairstyles, or makeup may be projected onto a mirror image of the user. In an embodiment, the eyepiece may be used in business intelligence, meetings, and conferences. For example, a user's name tag may be scanned, their face run through a facial-recognition system, or their spoken name searched in a database to obtain biographical information. Scanned name tags, faces, and conversations may be recorded for subsequent viewing or filing.
In one embodiment, " pattern " can be entered by eyepiece.In this pattern, application-specific may be available.For example, the eyepiece of consumer's version can have visitor's pattern, educational pattern, internet mode, TV pattern, game mode, motor pattern, designer's pattern, personal assistant pattern etc.
A user of the augmented-reality glasses may wish to participate in video calling or video conferencing while wearing the glasses. Many computers, both desktop and laptop, have integrated cameras to facilitate video calling and conferencing. Typically, software applications are used to integrate the use of the camera with the calling or conferencing features. With the augmented-reality glasses providing much of the functionality of laptops and other computing devices, many users may wish to utilize video calling and video conferencing while on the move wearing the augmented-reality glasses.
In an embodiment, a video-calling or video-conferencing application may work with a WiFi connection, or may be part of a 3G or 4G calling network associated with the user's cell phone. The camera for video calling or conferencing is placed on a device controller, such as a watch or another separate electronic computing device. Placing the video-calling or video-conferencing camera on the augmented-reality glasses is not practical, since such placement would provide the user with a view only of themselves, and would not display the other participants in the conference or call. However, the user may choose to use the forward-facing camera to display their surroundings or another individual in the video call.
Figure 32 depicts a typical camera 3200 for use in video calling or conferencing. Such cameras are typically small and could be mounted on a watch 3202, as shown in Figure 32, on a cell phone, or on another portable computing device, including a laptop computer. Video calling works by connecting the device controller with the cell phone or other communications device. The devices utilize software compatible with the operating system of the glasses and of the communications or computing device. In an embodiment, the screen of the augmented-reality glasses may display a list of options for making the call, and the user may gesture using a pointing control device, or use any other control technique described herein, to select the video-calling option on the screen of the augmented-reality glasses.
Figure 33 illustrates an embodiment 3300 of a block diagram of a video-calling camera. The camera incorporates a lens 3302, a CCD/CMOS sensor 3304, an analog-to-digital converter 3306 for video signals, and an analog-to-digital converter 3314 for audio signals. A microphone 3312 collects the audio input. Both analog-to-digital converters 3306 and 3314 send their output signals to a signal-enhancement module 3308. The signal-enhancement module 3308 forwards the enhanced signal, a composite of both the video and audio signals, to interface 3310. Interface 3310 is connected to an IEEE 1394 standard bus interface, together with a control module 3316.
In operation, the video-calling camera depends on signal capture, which transforms incident light, as well as incident sound, into electrons. For light, this process is performed by the CCD or CMOS chip 3304. The microphone transforms sound into electrical impulses.
The first step in the process of generating an image for a video call is to digitize the image. The CCD or CMOS chip 3304 dissects the image and converts it into pixels. If a pixel has collected many photons, the voltage will be high. If the pixel has collected few photons, the voltage will be low. This voltage is an analog value. During the second step of digitization, the voltage is transformed into a digital value by the analog-to-digital converter 3306, which handles image processing. At this point, a raw digital image is available.
The audio captured by the microphone 3312 is also transformed into a voltage. This voltage is sent to the analog-to-digital converter 3314, where the analog values are transformed into digital values.
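The analog-to-digital step described for both the image sensor and the microphone can be sketched as clamping a continuous voltage to the converter's input range and mapping it to an N-bit integer code. The 8-bit, 0–1 V range below is an assumed example, not a value from this disclosure.

```python
# Sketch of the ADC step shared by the video converter 3306 and the audio
# converter 3314: quantize a voltage into an unsigned N-bit digital value.
# The input range and bit depth are assumed example values.

def adc(voltage, v_min=0.0, v_max=1.0, bits=8):
    """Quantize a voltage into an unsigned N-bit digital code."""
    v = min(max(voltage, v_min), v_max)   # clamp to the converter's input range
    levels = (1 << bits) - 1              # e.g. 255 codes above zero for 8 bits
    return round((v - v_min) / (v_max - v_min) * levels)

# A pixel that collected many photons (high voltage) yields a high code;
# one that collected few photons yields a low code.
assert adc(1.0) == 255 and adc(0.0) == 0
```

The same function applies to audio samples, just run at the audio sample rate with a bit depth suited to sound.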
The next step is to enhance the signal so that it may be sent to viewers of the video call or conference. Signal enhancement includes creating color in the image, using a color filter located in front of the CCD or CMOS chip 3304. This filter is red, green, or blue and changes its color from pixel to pixel; in an embodiment it may be a color filter array, or Bayer filter. The raw digital images are then enhanced by the filter to meet aesthetic requirements. The audio data may also be enhanced for a better calling experience.
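The color-creation step above can be sketched with a crude demosaic: each sensor pixel records only one of red, green, or blue under the Bayer mosaic, and full-color pixels are reconstructed by combining neighbors. This toy version averages each 2×2 RGGB block into one RGB pixel; real demosaicing interpolates at full resolution, and the RGGB layout is an assumption.

```python
# Hedged sketch of Bayer-filter color reconstruction. Assumes an RGGB layout;
# averages each 2x2 block into one RGB pixel (half resolution) for simplicity.

def demosaic_rggb(raw):
    """raw: 2D list (even dimensions) of samples under an RGGB Bayer mosaic.
    Returns a half-resolution image of (r, g, b) tuples."""
    h, w = len(raw), len(raw[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2   # average the two greens
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

raw = [[10, 20],
       [30, 40]]          # one RGGB block: R=10, G=20, G=30, B=40
assert demosaic_rggb(raw) == [[(10, 25.0, 40)]]
```

Half the mosaic's samples are green, matching the eye's greater sensitivity to green, which is why the RGGB block carries two green samples.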
In the final step before transmission, the image and audio data are compressed and output as a digital video stream, in an embodiment using a digital video camera. If a photo camera is used, single images may be output, and in a further embodiment, voice comments may be appended to the files. The enhancement of the raw digital data takes place away from the camera, and in an embodiment may occur in the device controller or computing device that the augmented-reality glasses communicate with during a video call or conference.
Further embodiments may provide portable cameras for industrial, medical, astronomical, microscopy, and other fields that require specialized camera applications. These cameras often forgo signal enhancement and output raw digital images. They may be mounted on other electronic devices, or on the user's hand, for convenient use.
The camera interfaces with the augmented reality glasses and the device controller or computing device using an IEEE 1394 interface bus. This interface bus carries time-critical data, such as video, as well as data whose integrity is essential, including parameters or files for manipulating data or transferring images.
In addition to the interface bus, protocols define the behavior of the devices involved in the video call or conference. In various embodiments, a camera used with the augmented reality glasses may employ one of the following protocols: AV/C, DCAM, or SBP-2.
AV/C is a protocol for audio/video control and defines the behavior of digital video devices, including video cameras and video recorders.
DCAM refers to the 1394-based Digital Camera Specification and defines the behavior of cameras that output uncompressed image data without audio.
SBP-2 refers to the Serial Bus Protocol and defines the behavior of mass-storage devices such as hard drives or discs. Devices that use the same protocol are able to communicate with one another. Thus, for a video call conducted with the augmented reality glasses, the device controller and the video camera on the glasses may use the same protocol. Because the augmented reality glasses, device controller, and camera use the same protocol, data can be exchanged among these devices. Files that may be transferred between devices include image and audio files, image and audio data streams, parameters for controlling the camera, and the like.
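The compatibility rule stated above can be captured in a tiny sketch: two IEEE 1394 devices can exchange data only if they speak the same protocol. The device/protocol descriptions below are drawn from the text; the function and dictionary names are illustrative assumptions.

```python
# Hypothetical protocol-compatibility check for devices on the 1394 bus.
PROTOCOLS = {
    "AV/C": "audio/video control (video cameras, video recorders)",
    "DCAM": "uncompressed image data, no audio",
    "SBP-2": "mass storage (hard drives, discs)",
}

def can_exchange(protocol_a: str, protocol_b: str) -> bool:
    """Devices using the same (known) protocol can communicate with one another."""
    return protocol_a == protocol_b and protocol_a in PROTOCOLS

# The video camera on the glasses and the device controller sharing AV/C:
print(can_exchange("AV/C", "AV/C"))   # True
print(can_exchange("AV/C", "SBP-2"))  # False
```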
In one embodiment, a user wishing to initiate a video call can select a video call option from a screen presented when the call is initiated. The user selects by gesturing with a pointing device, or by gesturing to signal selection of the video call option. The user then positions a camera located on the device controller, a watch, or another separable electronic device so that it captures the user's image. The image is processed through the process described above and then streamed to the augmented reality glasses and the other participants for display.
In various embodiments, the camera may be mounted on a cell phone, personal digital assistant, watch, pendant, or other small portable device that can be carried, worn, or mounted. Images or video captured by the camera can be streamed to the eyepiece. For example, when a camera is mounted on a rifle, a wearer may be able to image targets not in the line of sight and wirelessly receive the stream of images as displayed content on the eyepiece.
In various embodiments, the disclosure may provide the wearer with GPS-based content reception, as shown in Fig. 6. As described, the augmented reality glasses of the disclosure may include memory, a GPS unit, a compass or other orienting device, and a camera. GPS-based computer programs available to the wearer may include a number of applications typically available from the Apple App Store for iPhone use. Corresponding versions of these programs are available for smart phones of other brands and may be applied to embodiments of the present disclosure. These programs include, for example, SREngine (Scene Recognition Engine), NearestTube, TAT Augmented ID, Yelp, Layar, and TwittARound, as well as other more specialized applications such as RealSki.
SREngine is a scene recognition engine able to identify objects viewed by the user's camera. It is a software engine that can recognize static scenes, such as buildings, structures, photographs, objects, and rooms. It can then automatically apply a virtual "label" to the structures or objects it recognizes. For example, the program may be invoked by a user of the present disclosure when viewing a street scene, as in Fig. 6. Using the camera of the augmented reality glasses, the engine would recognize the Fontaine de la Concorde (Place de la Concorde fountain) in Paris. The program would then call up a virtual label, shown in Fig. 6 as part of the virtual image 618 projected onto the lens 602. The label may be text only, as seen at the bottom of image 618. Other labels applicable to this scene might include "fountain", "museum", "hotel", or the name of the columned building in the background. Other programs of this kind may include Wikitude AR Travel Guide, Yelp, and many others.
NearestTube, for example, uses the same technology to direct the user to the closest subway station in London, and other programs may perform the same or similar functions in other cities. Layar is another application that uses the camera, a compass or direction sensor, and GPS data to identify the user's location and field of view. With this information, an overlay or label may appear virtually to help orient and guide the user. Yelp and Monocle perform similar functions, but their databases are somewhat more detailed and help direct the user to restaurants or other service providers in a similar manner.
The user may control the glasses and invoke these functions using any of the controls described in this patent. For example, the glasses may be equipped with a microphone that picks up voice commands from the user, which are processed with software contained in the memory of the glasses. The user may then respond to prompts from small speakers or earbuds, also contained within the eyeglass frame. The glasses may also be equipped with a small track pad, similar to those found on smart phones. The track pad allows the user to move a pointer or indicator on the virtual screen within the AR glasses, similar to a touch screen. When the user reaches a desired point on the screen, the user depresses the track pad to indicate his or her selection. Thus, the user might call up a program, such as a travel guide, and then navigate through several menus, perhaps selecting a country and then a city, followed by a category. The category selections might include, for example, hotels, shopping, museums, restaurants, and so forth. The user makes his or her selection and is then guided by the AR program. In one embodiment, the glasses also include a GPS locator, and the current country and city provide default locations that may be overridden.
In one embodiment, the eyepiece's object recognition software may process the images received by the eyepiece's forward-facing camera in order to determine what is in the field of view. In other embodiments, the GPS coordinates of the position, as determined by the eyepiece's GPS, may be sufficient to determine what is in the field of view. In other embodiments, RFID or other beacons in the environment may broadcast location. Any one or a combination of the above may be used by the eyepiece to identify the location and the identity of what is in the field of view.
When an object is recognized, the resolution at which that object is imaged may be increased, or the object's image or video may be captured with less compression. Additionally, the resolution of other objects in the user's view may be lowered, or those objects may be captured at a higher compression rate, in order to reduce the required bandwidth.
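The bandwidth-saving scheme above can be sketched as a per-region quality map: high quality inside the identified object's region, heavier compression everywhere else. The quality values and the rectangular region model are illustrative assumptions.

```python
# Hypothetical region-of-interest quality map for variable-rate capture.
def quality_map(width, height, roi, hi=90, lo=30):
    """Per-pixel quality: hi inside the recognized object's bounding box
    (x0, y0, x1, y1), lo (i.e. more compression) elsewhere."""
    x0, y0, x1, y1 = roi
    return [[hi if (x0 <= x < x1 and y0 <= y < y1) else lo
             for x in range(width)]
            for y in range(height)]

qm = quality_map(4, 3, roi=(1, 1, 3, 2))
for row in qm:
    print(row)
# [30, 30, 30, 30]
# [30, 90, 90, 30]
# [30, 30, 30, 30]
```

An encoder would then spend most of its bit budget on the high-quality cells, reducing total bandwidth while preserving detail on the recognized object.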
Once determined, content relating to points of interest in the field of view may be overlaid on the real-world image, such as social networking content, interactive tours, local information, and the like. Information and content relating to movies, local information, weather, restaurants, restaurant availability, local events, local taxis, music, and so on may be accessed by the eyepiece and projected onto its lens for the user to view and interact with. For example, as the user looks at the Eiffel Tower, the forward-facing camera may take an image and send it for processing to the eyepiece's associated processor. Object recognition software may determine that the structure in the wearer's view is the Eiffel Tower. Alternatively, the GPS coordinates determined by the eyepiece's GPS may be searched in a database to determine that the coordinates match those of the Eiffel Tower. In either case, content may then be searched relating to Eiffel Tower visitor information, restaurants in the vicinity and in the Tower itself, local weather, local subway information, local hotel information, other nearby tourist attractions, and the like. Interaction with the content may be enabled by the control mechanisms described herein. In one embodiment, GPS-based content reception may be enabled when a Tourist Mode of the eyepiece is entered.
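The alternative GPS lookup described above can be sketched as matching the eyepiece's fix against a database of points of interest by distance. The sample coordinates and the 200 m threshold are illustrative assumptions.

```python
# Hypothetical GPS-coordinate match against a point-of-interest database.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

POI_DB = {  # assumed sample database
    "Eiffel Tower": (48.8584, 2.2945),
    "Place de la Concorde": (48.8656, 2.3212),
}

def nearest_poi(lat, lon, max_m=200.0):
    """Name of the closest known point of interest within max_m, else None."""
    name, (plat, plon) = min(POI_DB.items(),
                             key=lambda kv: haversine_m(lat, lon, *kv[1]))
    return name if haversine_m(lat, lon, plat, plon) <= max_m else None

print(nearest_poi(48.8585, 2.2946))  # Eiffel Tower
```

A match then drives the content search ("Eiffel Tower" visitor information, nearby restaurants, and so on) without needing the camera-based recognition path.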
In one embodiment, the eyepiece may be used to view streaming video. For example, videos may be identified by a search via GPS location, by a search via object recognition of an object in the field of view, by a voice search, by a holographic keyboard search, and the like. Continuing with the example of the Eiffel Tower, a video database may be searched via the Tower's GPS coordinates, or by the term "Eiffel Tower" once the structure in view has been determined to be the Tower. Search results may include geo-tagged videos or videos associated with the Eiffel Tower. The videos may be scrolled or flipped through using the control techniques described herein, and videos of interest may be played using those same techniques. A video may be overlaid on the real-world scene or may be displayed on the lens out of view. In one embodiment, the eyepiece may be darkened via the mechanisms described herein to enable viewing at higher contrast. In another example, the eyepiece may be able to utilize a camera and network connectivity, such as described herein, to provide the wearer with streaming video conferencing capabilities. The streaming video may be video of at least one other video conference participant, a visual presentation, and the like. The streaming video may be automatically uploaded to a video storage location as it is captured, without interaction by the eyepiece user. The streaming video may be uploaded to a physical or virtual storage location; a virtual storage location may reside at a single physical location or in cloud storage. The video of the streaming video conference may also be modified by the eyepiece, where the modification may be based on sensor input. The sensor input may be a visual sensor input or an audio sensor input: the visual sensor input may be an image of another participant in the video conference, a visual presentation, and the like, and the audio sensor input may be the voice of a participant in the video conference.
In various embodiments, the eyepiece may provide an interface for accepting wireless streaming media (such as video, audio, text messages, phone calls, and calendar alerts) from external devices such as a smart phone, tablet, personal computer, entertainment device, portable audio-video device, home audio-video system, home entertainment system, another eyepiece, and the like. The wireless streaming media may be delivered by any wireless communication system and protocol known in the art, such as Bluetooth, WiFi, a wireless home network connection, a wireless local area network (WLAN), Wireless Home Digital Interface (WHDI), cellular mobile communications, and the like. The eyepiece may also use multiple wireless communication systems, such as one for streaming high-data-rate media (e.g., video), one for low-data-rate media (e.g., text messages), one for command data between the external device and the eyepiece, and so on. For example, high-data-rate video may be streamed over a WiFi DLNA (Digital Living Network Alliance) interface, while Bluetooth serves low-data-rate applications such as text messaging. In various embodiments, the external device may be provided with an application that supports interfacing with the eyepiece. For example, a mobile application may be made available for users' smart phones for interfacing with the eyepiece. In various embodiments, the external device may be provided with a transmission device for interfacing with the eyepiece; for example, a transmitter dongle may be provided to interface a user's smart phone with the eyepiece. Since streaming media from an external device may place many of the processing requirements on the external device, the eyepiece may require less on-board processing capability to accommodate the streaming media. For example, an embodiment of the eyepiece for accommodating streaming media may include an interface for accepting the streaming media, data buffering to accommodate the streaming media, an optical assembly through which the user views the surrounding environment and the displayed content and through which the streaming media is presented to the user, and so on. That is, an embodiment of the eyepiece for accepting streaming media may be a simplified version of the other eyepiece embodiments described herein, acting as a display for the external device. In one example, a user may be able to stream video from their smart phone to such a "simplified" eyepiece. However, one skilled in the art will appreciate that any other functionality described herein may also be included to create eyepieces of various embodiment versions, from a simplified eyepiece serving only as a display interface for an external device to a version including the full range of capabilities described herein, where the wireless streaming interface is only one of a plurality of functions and capabilities the eyepiece provides. For example, even in a more simplified version of the eyepiece, it may be useful in conjunction with streaming media to include the control techniques described herein, power-saving techniques, applications, driving one or both displays, displaying in 3D mode, and the like, such as to aid command modes for the streaming media, to increase battery life through battery management, to provide optional media viewing modes, and so on. Alternatively, an ultra-simplified version of the eyepiece may provide an embodiment that minimizes the cost and complexity of the eyepiece, such as where the interface between the external device and the eyepiece is a wired interface. For example, one embodiment of the eyepiece may provide a wired interface between the user's smart phone or tablet and the eyepiece, where the processing capability of the eyepiece may be limited to the processing required to present the streamed media to the optical assembly for viewing on the eyepiece's lens.
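The multi-transport idea above (high-data-rate media over one wireless link, low-rate traffic over another) can be sketched as a routing rule. The link names, capacities, and per-stream bandwidth figures are illustrative assumptions, not specified values.

```python
# Hypothetical transport selection by stream bandwidth requirement.
LINKS = {  # assumed available transports and rough capacities, bits/s
    "WiFi (DLNA)": 100_000_000,
    "Bluetooth": 2_000_000,
}

def pick_link(stream_kind: str) -> str:
    """Choose a transport for a media stream by its bandwidth needs."""
    needs = {  # assumed per-kind bandwidth requirements, bits/s
        "video": 10_000_000,
        "audio": 256_000,
        "text": 1_000,
        "command": 1_000,
    }[stream_kind]
    # Smallest link that still satisfies the requirement, keeping WiFi free
    # for the high-rate streams:
    ok = [name for name, cap in LINKS.items() if cap >= needs]
    return min(ok, key=lambda n: LINKS[n])

print(pick_link("video"))  # WiFi (DLNA)
print(pick_link("text"))   # Bluetooth
```

This matches the example in the text: video over the WiFi DLNA interface, text messaging over Bluetooth.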
In other embodiments, an application running on a smart phone may serve as a remote-control input device for the glasses. For example, a user interface such as a keyboard may allow the user to enter characters through the smart phone. The application makes the phone appear to be a Bluetooth keyboard. The application may simply be a full-screen blank application that sends touches to a pseudo-touch-screen driver running on the glasses, so that the user can use the smart phone to perform two-finger pinch-zooms and drags as the actual physical place where these motions are completed, obtaining tactile feedback from the hand and visual feedback in the glasses. Thus, more typical applications running on the glasses that make use of these types of input gestures can work well with the user using the smart phone touch screen. Command information may be accompanied by a visual indicator. For example, so that the user knows where his or her finger is while controlling the glasses or an eyewear application with the external device, a visual indication of the command information, such as a highlighted trace of the finger's motion, may be displayed in the glasses. The disclosure may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, an integrated image source adapted to introduce the content to the optical assembly, an integrated processor, and an external device, the external device having a physical user interface and an application that turns the external device into a user interface operable through the integrated processor of the eyepiece, wherein physical interaction with the external device is indicated in the displayed content. In various embodiments, the external device may be a smart phone, tablet, mobile navigation device, and the like. The physical user interface may be a keypad, touch pad, control interface, and the like. For example, the physical interface may be an iPhone, and the displayed content a virtual keyboard that shows the user's actions on the iPhone keypad as actions on a virtual keypad, such as showing highlighted keys, key-press indications, and the like on the virtual keypad as the user's finger motions physically interact with the iPhone's physical keypad. The finger motion may be one of a selection of content and a movement of the displayed content. The manipulation may be a plurality of finger motions across the touch pad, such as a two-finger pinch-zoom manipulation resizing the displayed content on the eyepiece.
As mentioned above, the user of augmented reality may receive content from a great number of sources. A visitor or tourist may wish to limit the choices to local businesses or institutions; conversely, businesses seeking visitors or tourists may wish to limit their offers or solicitations to persons who are visiting their area or location rather than to locals. Thus, in one embodiment, a visitor or tourist may confine his or her searches to local businesses, i.e., those within a particular geographic boundary. These limits may be set via GPS criteria or by manually indicating a geographic restriction. For example, a person may require that the sources of streaming content or advertisements be limited to those within a certain radius (a set number of kilometers or miles) of the person. Alternatively, the criteria may require that the sources be limited to those within the same city or province. These limits may be set by the augmented reality user just as a user of a computer at home or in an office would limit his or her searches using a keyboard or mouse; the inputs of the augmented reality user are simply made by voice, by hand motion, or by the other modes described in the portions of this disclosure discussing controls.
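The radius restriction described above can be sketched as a simple geographic filter over candidate content sources. The flat-earth distance approximation, the sample sources, and the 10 km radius are illustrative assumptions.

```python
# Hypothetical radius filter for streaming-content or advertisement sources.
import math

def approx_km(lat1, lon1, lat2, lon2):
    """Fast flat-earth distance in km, adequate for city-scale radii."""
    kx = 111.32 * math.cos(math.radians((lat1 + lat2) / 2))  # km per deg lon
    ky = 110.57                                              # km per deg lat
    return math.hypot((lon2 - lon1) * kx, (lat2 - lat1) * ky)

def local_sources(sources, here, radius_km):
    """Keep only sources within radius_km of the wearer's position `here`."""
    lat, lon = here
    return [s for s in sources
            if approx_km(lat, lon, s["lat"], s["lon"]) <= radius_km]

ads = [
    {"name": "nearby cafe", "lat": 48.86, "lon": 2.35},
    {"name": "out-of-town outlet", "lat": 49.50, "lon": 2.35},
]
print([s["name"] for s in local_sources(ads, (48.8566, 2.3522), 10.0)])
# ['nearby cafe']
```

A city- or province-level restriction would simply substitute a membership test on a region field for the distance comparison.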
In addition, the available content chosen by a user may be restricted or limited by the type of provider. For example, a user may restrict choices to those with websites operated by government institutions (.gov) or by non-profit institutions or organizations (.org). In this way, a tourist or visitor who may be more interested in visiting government offices, museums, historical sites, and the like may find his or her choices less cluttered. The person may be more easily able to make decisions once the available choices have been pared down to a more reasonable number. The ability to quickly cut down the available choices is desirable in more urban areas with many choices, such as Paris or Washington, D.C.
The user controls the glasses in any of the manners or modes described elsewhere in this patent. For example, the user may call up a desired program or application by voice, or by indicating a choice on the virtual screen of the augmented reality glasses. The augmented reality glasses may respond to a track pad mounted on the frame of the glasses, as described above. Alternatively, the glasses may respond to one or more motion or position sensors mounted on the frame. The signals from the sensors are then sent to a microprocessor or microcontroller within the glasses, the glasses also providing any needed signal conversion or processing. Once the program of choice has begun, the user makes selections and enters responses by any of the methods described herein, such as signaling "yes" or "no" with a head movement, a hand gesture, a track pad depression, or a voice command.
At the same time, content providers, that is, advertisers, may also wish to restrict their offerings to persons within a certain geographic area, e.g., their city limits. Meanwhile, an advertiser, perhaps a museum, may not wish to offer content to local persons but may wish to reach visitors or out-of-towners. In another example, advertising may not be presented when the user is at home, but may be presented when the user is traveling or away from home. The augmented reality devices discussed herein are desirably equipped with both GPS capability and telecommunications capability, and with an integrated processor for implementing geography-based rules for the presentation of advertising. It would be a simple matter for the museum to provide streaming content only within a limited area by limiting its broadcast power. The museum, however, may provide content over the Internet, and its content may then be available worldwide. In this instance, a user may receive content through an augmented reality device advising that the museum is open today and available for touring.
The user may respond to this content by the augmented reality equivalent of clicking on a link for the museum. The augmented reality equivalent may be a voice indication, a hand or eye movement, another sensory indication of the user's selection, or use of an associated body-mounted controller. The museum then receives a cookie indicating at least the identity of the user or the user's Internet service provider (ISP). If the cookie indicates or suggests an ISP other than local providers, the museum server may respond with advertisements or offers tailored to visitors. The cookie may also include an indication of a telecommunications link, such as a telephone number. If the telephone number is not a local number, this is an additional clue that the person responding is a visitor. The museum or other institution may then follow up with whatever content its marketing department desires or suggests.
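The visitor-detection heuristic above can be sketched as a server-side check of the ISP and phone-prefix clues carried in the cookie. The local-ISP and local-prefix values, and all names, are illustrative assumptions, not real data.

```python
# Hypothetical visitor/local classification from cookie clues.
LOCAL_ISPS = {"paris-telecom"}   # assumed local Internet service providers
LOCAL_PREFIXES = ("+33 1",)      # assumed local telephone prefixes

def classify_requester(cookie: dict) -> str:
    """Return 'visitor' or 'local' based on ISP and phone-number clues."""
    if cookie.get("isp") not in LOCAL_ISPS:
        return "visitor"  # non-local provider suggests a visitor
    phone = cookie.get("phone", "")
    if phone and not phone.startswith(LOCAL_PREFIXES):
        return "visitor"  # non-local number is an additional clue
    return "local"

print(classify_requester({"isp": "overseas-net", "phone": "+1 212 555 0100"}))
# visitor
print(classify_requester({"isp": "paris-telecom", "phone": "+33 1 4000 0000"}))
# local
```

A server could then branch on the result to serve visitor-specific advertisements or offers, as described above.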
Another application of the augmented reality eyepiece takes advantage of the user's ability to control the eyepiece and its tools with minimal use of the hands, using instead voice commands, gestures, or motions. As noted above, a user may call upon the augmented reality eyepiece to retrieve information. This information may already be stored in the memory of the eyepiece, but may instead be located remotely, such as in a database accessible over the Internet, or perhaps via an intranet accessible only to employees of a particular company or organization. The eyepiece may thus be compared to a computer, or to a display screen, that can be viewed and heard at extremely close range and generally controlled with minimal use of a person's hands.
Applications may thus include providing field data to a mechanic or electronics technician. The technician can don the glasses when searching for information about a particular structure or problem encountered, for example, when repairing an engine or a power supply. Using voice commands, he or she may then access a database and search within it for particular information, such as manuals or other repair and maintenance documents. The needed information may thus be accessed immediately and applied with minimal effort, allowing the technician to perform the required repair or maintenance more quickly and return the equipment to service. For mission-critical equipment, such time savings may also save lives, in addition to saving repair or maintenance costs.
The information imparted may include repair manuals and the like, but may also include a full range of audio-visual information; that is, the eyepiece screen may display to the technician or mechanic a video of how to perform a particular task while the person is attempting to perform it. The augmented reality device also includes telecommunications capability, so the technician also has the ability to call on others to assist if the task involves some complication or unexpected difficulty. This educational aspect of the present disclosure is not limited to maintenance and repair, but may be applied to any educational endeavor, such as secondary or post-secondary classes, continuing education courses or topics, seminars, and the like.
In one embodiment, a WiFi-enabled eyepiece may run a location-based application for geo-locating users who have opted in. Users may opt in by logging into the application on their phone and allowing their location to be broadcast, or by enabling geo-location on their own eyepiece. As the wearer of the eyepiece scans people, and hence their opted-in devices, the application may identify opted-in users and send an instruction to the projector to project an augmented reality indicator onto an opted-in user within the wearer's field of view. For example, green rings may be placed around people who have opted in to have their location seen. In another example, yellow rings may indicate people who have opted in but do not meet some criterion, such as not having a FACEBOOK account, or, if they have a FACEBOOK account, not having any mutual friends.
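The indicator logic above can be sketched directly: opted-in users earn a green ring, opted-in users who fail a criterion earn a yellow ring, and users who have not opted in are never highlighted. The criterion fields (`has_account`, `mutual_friends`) are illustrative assumptions.

```python
# Hypothetical ring-color assignment for scanned users.
def ring_color(user):
    """AR indicator color for a scanned user, or None if not opted in."""
    if not user.get("opted_in"):
        return None  # not opted in: never highlighted
    if user.get("has_account") and user.get("mutual_friends", 0) > 0:
        return "green"   # opted in and meets the criteria
    return "yellow"      # opted in but fails a criterion

print(ring_color({"opted_in": True, "has_account": True, "mutual_friends": 3}))
# green
print(ring_color({"opted_in": True, "has_account": False}))
# yellow
print(ring_color({"opted_in": False}))
# None
```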
Some social networking, career networking, and dating applications may work in concert with the location-based application. Software resident on the eyepiece may coordinate data from the networking and dating sites with the location-based application. For example, TwittARound is one such program, utilizing a mounted camera to detect and label location-stamped tweets from other tweeters nearby. This enables a person using the present disclosure to locate other nearby Twitter users. Alternatively, users may have to set up their devices to coordinate information from the various networking and dating sites. For example, the wearer of the eyepiece may want to see all E-HARMONY users who are broadcasting their location. If an opted-in user is identified by the eyepiece, an augmented reality indicator may be overlaid on that user. The indicator may take on a different appearance if the user has something in common with the wearer, many things in common with the wearer, and the like. For example, and referring to Fig. 16, two people are being viewed by the wearer. Both are identified as E-HARMONY users by the rings placed around them. However, the woman shown with a solid-line ring has at least one item in common with the wearer, while the woman shown with a dotted-line ring has nothing in common with the wearer. Any available profile information may be accessed and displayed to the user.
In one embodiment, when the wearer points the eyepiece in the direction of a user with a networking account, such as FACEBOOK, TWITTER, BLIPPY, LINKEDIN, GOOGLE, WIKIPEDIA, and the like, that user's recent posts or profile information may be displayed to the wearer. For example, recent status updates, "tweets", "blips" (shopping shares), and the like may be displayed, as described above for TwittARound. In one embodiment, when the wearer points the eyepiece in a target user's direction for an extended duration, and/or a gesture, head, eye, or audio control is activated, this may indicate interest in that user. The target user may receive an indication of interest on their phone or in their glasses. If the target user had marked the wearer as interesting but was waiting for the wearer to show interest first, an indication of the target user's interest may immediately pop up in the eyepiece. A control mechanism may be used to capture an image and store the target user's information in associated nonvolatile memory or in an online account.
In other applications for social networking, a facial recognition program may be used, such as TAT Augmented ID from TAT – The Astonishing Tribe of Malmö, Sweden. Such a program may be used to identify a person by his or her facial characteristics. The software applies facial recognition software to identify the individual. Using other applications, such as photo-tagging software from Flickr, one can then identify the particular person nearby, and one can then download information about that person from social networking websites. This information may include the person's name and the profile the person has made available on sites such as Facebook, Twitter, and the like. This application may be used to refresh the user's memory of a person, to identify a person nearby, and to gather information about the person.
In other applications for social networking, the wearer may be able to utilize the location-based facilities of the eyepiece to leave notes, comments, reviews, and the like at locations, in association with people, particular products, and so on. For example, a person may be able to post a comment on a place he has visited, where the post may then be made available to others through a social network. In another example, a person may be able to post the comment at the location of the place, so that the comment is available to another person when that person comes to the location. In this way, a wearer may be able to access comments left by others when they arrive at the location. For instance, a wearer may come to the entrance of a restaurant and be able to access reviews of the restaurant, such as sorted by some criterion (e.g., most recent review, age of reviewer, and the like).
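The location-pinned comments above can be sketched as a store keyed by place, with retrieval sorted by the requested criterion. The comment records and field names are illustrative assumptions.

```python
# Hypothetical store of comments pinned to places, with sorted retrieval.
COMMENTS = {  # assumed sample data keyed by place
    "Cafe X": [
        {"text": "great soup", "timestamp": 1_700_000_000, "reviewer_age": 22},
        {"text": "slow service", "timestamp": 1_710_000_000, "reviewer_age": 34},
    ],
}

def comments_at(place, sort_by="newest"):
    """Comments left at `place`, ordered by the requested criterion."""
    notes = list(COMMENTS.get(place, []))
    if sort_by == "newest":
        notes.sort(key=lambda n: n["timestamp"], reverse=True)
    elif sort_by == "reviewer_age":
        notes.sort(key=lambda n: n["reviewer_age"])
    return notes

print([n["text"] for n in comments_at("Cafe X")])
# ['slow service', 'great soup']
```

Arriving at a place (e.g., a restaurant entrance) would trigger a lookup keyed by that location, with the wearer choosing the sort criterion through the eyepiece controls.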
A user may initiate a desired program by voice, by making a selection from a virtual touch screen as described above, by using a track pad to select and choose the desired program, or by any of the control techniques described herein. Menu selections may then be made in a similar or complementary fashion. Sensors or input devices mounted at convenient locations on the user's body may also be used, e.g., sensors and a track pad mounted on a wristband or on a glove, or even a separate device, perhaps of the size of a smart phone or personal digital assistant.
Applications of the present disclosure may provide the wearer with Internet access, such as for browsing, searching, shopping, entertainment, and the like, for example through a wireless communications interface to the eyepiece. For instance, the wearer may initiate a web search with a control gesture, such as through a control device worn on some portion of the wearer's body (e.g., on the hand, head, or foot), on some component the wearer is using (e.g., a personal computer, smart phone, music player), on a piece of furniture near the wearer (e.g., a chair, desk, table, lamp), and the like, where the image of the web search is projected for viewing by the wearer through the eyepiece. The wearer may then view the search through the eyepiece and control web interaction through the control device.
In an example, a user may be wearing an embodiment configured as a pair of glasses, with a projected image of an Internet web browser provided through the glasses while retaining the ability to simultaneously view at least portions of the surrounding real environment. In this instance, the user may be wearing a motion-sensitive control device on their hand, where the control device may transmit the relative motion of the user's hand to the eyepiece as control commands for web control, such as similar to a mouse in a conventional personal-computer configuration. It will be appreciated that the user would be enabled to perform web actions in a manner similar to a conventional personal-computer configuration. In this case, the image of the web search is provided through the eyepiece, while control of the selection of actions to carry out the search is provided through motions of the hand. For instance, the overall motion of the hand may move a cursor within the projected image of the web search, a flick of a finger may provide a selection action, and so forth. In this way, through an embodiment connected to the Internet, the wearer may be enabled to perform a desired web search, or any other Internet-browser-enabled function. In one example, the user may have downloaded the computer programs Yelp or Monocle from the App Store, or a similar application such as NRU ("near you"), an application from Zagat to locate nearby restaurants or other stores, Google Earth, Wikipedia, or similar products. The person may initiate a search, for example, for restaurants, or for other goods or service providers, such as hotels, repairmen, and the like, or for information. When the desired information is found, locations are displayed, or a distance and direction to a desired location are displayed. The display may take the form of a virtual label co-located with a real-world object in the user's view.
Other applications from Layar (Amsterdam, the Netherlands) include a variety of "layers" tailored to specific information desired by a user. A layer may include restaurant information, information about a specific company, real estate listings, gas stations, and so forth. Using the information provided in a mobile software application, such as a layer, together with the user's global positioning system (GPS), the information may be presented on a screen of the glasses with tags carrying the desired information. Using the haptic controls or other controls discussed elsewhere in this disclosure, the user may pivot or otherwise rotate his or her body and view buildings tagged with virtual tags containing information. If the user seeks restaurants, the screen will display restaurant information, such as name and location. If the user seeks a particular address, virtual tags will appear on buildings within the wearer's field of view. The user may then make selections or choices by voice, by trackpad, by virtual touch screen, and so forth.
Applications of the present disclosure may provide a way for advertisements to be delivered to the wearer. For example, advertisements may be displayed to the viewer through the eyepiece as the viewer goes about his or her day, browses the Internet, conducts a web search, walks through a store, and the like. For instance, the user may be performing a web search, and through the web search the user becomes targeted with an advertisement. In this example, the advertisement may be projected in the same space as the projected web search, floating off to a side, above, or below the wearer's view angle. In another example, an advertisement may be triggered for delivery to the eyepiece when some advertisement-providing facility, perhaps one in proximity to the wearer, senses the presence of the eyepiece (e.g. through a wireless connection, RFID, and the like) and directs the advertisement to the eyepiece. In embodiments, the eyepiece may be used to track advertisement interactions, such as the user viewing or interacting with billboards, promotions, advertisements, and the like. For instance, the user's behavior with respect to advertisements may be tracked, such as for providing incentives, rewards, and the like to the user. In one example, the user may be paid five dollars of virtual currency each time the user views a billboard. The eyepiece may provide impression tracking, such as based on viewing a brand image (e.g. based on time, geography), and the like. As a result, offers may be targeted based on location and on events associated with the eyepiece (e.g. what the user sees, hears, interacts with, and the like). In embodiments, advertisement targeting may be based on historical behavior, such as based on the user's past interactions, what they have interacted with, and the like.
For example, the wearer may be window-shopping in Manhattan, where stores are equipped with such advertisement-providing facilities. As the wearer walks by the stores, the advertisement-providing facility may trigger the delivery of an advertisement to the wearer based on a known location of the user, such as determined by an integrated location sensor of the eyepiece, such as a GPS. In an embodiment, the location of the user may be further refined via other integrated sensors, such as a magnetometer, to enable hyper-local augmented reality advertising. For example, a user on the ground floor of a mall may receive specific advertisements if the magnetometer and GPS readings place the user in front of a particular store. When the user goes up a floor in the mall, the GPS location may remain the same, but the magnetometer readings may indicate a change in the user's elevation and a new placement of the user in front of a different store. In embodiments, personal profile information may be stored such that the advertisement-providing facility is able to better match advertisements to the wearer's needs, the wearer may provide preferences for advertisements, the wearer may block at least some advertisements, and the like. The wearer may also be able to pass advertisements, and associated discounts, on to friends: the wearer may communicate them directly to nearby friends who are enabled with their own eyepieces; they may also communicate them through a wireless Internet connection, such as to a social network of friends, through email, SMS, and the like. The wearer may be connected to facilities and/or infrastructure that enable the communication of advertisements from a sponsor to the wearer; of feedback from the wearer to an advertisement facility, the sponsor of the advertisement, and the like; to other users, such as friends and family members, or to someone in proximity to the wearer; to a store, such as locally on the eyepiece or at a remote site (on the Internet or on the user's home computer); and the like. These interconnectivity facilities may include integrated facilities of the eyepiece for providing the user's location and gaze direction, such as through the use of GPS, 3-axis sensors, magnetometer, gyroscope, accelerometers, and the like, for determining the direction, speed, attitude (e.g. gaze direction) of the wearer. Interconnectivity facilities may provide telecommunications facilities, such as a cellular link, a WiFi/MiFi bridge, and the like. For instance, the wearer may be able to communicate through an available WiFi link, through an integrated MiFi (or any other personal or group cellular link) to the cellular system, and the like. There may be facilities for the wearer to store advertisements for later use. There may be facilities, integrated with the wearer's eyepiece or located in a local computing facility, that enable the caching of advertisements, such as within a local area, where the cached advertisements enable the delivery of an advertisement as the wearer nears the location associated with that advertisement. For example, local advertisements may be stored on a server containing geo-located local advertisements and specials; these advertisements may be delivered to the wearer individually as the wearer approaches a particular location, or a set of advertisements may be delivered to the wearer in bulk when the wearer enters a geographic area associated with the advertisements, so that the advertisements are available when the user nears a particular location. The geographic area may be a city, a portion of a city, a number of blocks, a single block, a street, a portion of a street, a sidewalk, and the like, representing regional, local, and hyper-local areas. Note that the preceding discussion uses the term advertisement, but one skilled in the art will appreciate that this can also mean an announcement, a broadcast, a circular, a commercial, a sponsored communication, an endorsement, a notice, a promotion, a bulletin, a message, and the like.
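The batched, geo-cached delivery described above, with floor-level refinement, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the ad records, field names, and 50 m geofence radius are assumptions, and the integer floor index is presumed to be derived elsewhere (e.g. from magnetometer or altitude readings).

```python
import math

# Hypothetical geo-located ad cache; all fields are illustrative assumptions.
ADS = [
    {"text": "2-for-1 drinks", "lat": 40.7581, "lon": -73.9855, "floor": 0, "radius_m": 50},
    {"text": "Shoe sale, 2nd floor", "lat": 40.7581, "lon": -73.9855, "floor": 1, "radius_m": 50},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ads_for_position(lat, lon, floor, ads=ADS):
    """Select cached ads whose geofence contains the wearer, refined by floor.

    GPS alone cannot distinguish floors of a mall; the floor index refines the
    GPS fix, standing in for the magnetometer-based elevation cue in the text.
    """
    return [ad["text"] for ad in ads
            if ad["floor"] == floor
            and haversine_m(lat, lon, ad["lat"], ad["lon"]) <= ad["radius_m"]]
```

A store-side trigger would call `ads_for_position` on each location update and deliver any newly matching ads.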
Figures 18-20A depict ways of delivering custom messages to persons within a short distance of a facility that wishes to send messages, such as a retail store. Referring now to Figure 18, embodiments may provide for a way to view custom billboards, such as through the applications described above for searching for providers of goods and services while the wearer of the eyepiece is walking or driving. As depicted in Figure 18, a billboard 1800 shows an exemplary augmented-reality-based advertisement displayed by a seller or service provider. As depicted, the exemplary advertisement may relate to a drinks offer provided by a bar; for example, two drinks may be provided for the cost of just one. With such augmented-reality-based advertisements and offers, the wearer's attention may be easily directed to the billboard. The billboard may also provide details about the location of the bar, such as street address, floor number, phone number, and the like. In accordance with other embodiments, several devices other than the eyepiece may be utilized to view the billboard. These devices may include, without limitation, smartphones, IPHONEs, IPADs, car windshields, user glasses, helmets, wristwatches, headphones, vehicle mounts, and the like. In accordance with an embodiment, the user (the wearer, in the case where the augmented reality is embedded in the eyepiece) may automatically receive the offer or view the scene of the billboard as and when the user passes or drives through that section of road. In accordance with another embodiment, the user may receive the offer or view the scene of the billboard based on his request.
Figure 19 illustrates two exemplary roadside billboards 1900 containing offers and advertisements from sellers or service providers that may be viewed in an augmented reality manner. The augmented advertisement may provide a live, near-to-reality perception to the user or wearer.
As illustrated in Figure 20, an augmented-reality-enabled device, such as the camera lens provided in the eyepiece, may be utilized to receive and/or view graffiti 2000, posters, drawings, and the like that may be displayed on a roadside or on the top, side, or front of buildings and shops. The roadside billboards and graffiti may have a visual indicator (for example, a code or a shape) or a wireless indicator that links the advertisement, or an advertisement database, to the billboard. When the wearer nears and views the billboard, a projection of the billboard advertisement may then be provided to the wearer. In embodiments, personal profile information may also be stored so that advertisements better match the wearer's needs, the wearer may provide preferences for advertisements, the wearer may block at least some advertisements, and the like. In embodiments, the eyepiece may have brightness and contrast control over the eyepiece projection area for the billboard, so as to improve the readability of the advertisement, such as in a bright outside environment.
In other embodiments, users may post information or messages at a particular location, based on that location's GPS position or another indicator of location, such as a magnetometer reading. The intended viewer is able to see the message when the viewer is within a certain distance of the location, as explained in conjunction with Figure 20A. In a first step 2001 of the method of Figure 20A, a user determines the location where the message is to be received by the persons to whom the message is sent. The message is then posted 2003, to be sent to the appropriate person or persons when a recipient approaches the intended "viewing area." The location of the wearer of the augmented reality eyepiece is continuously updated 2005 by the GPS system that forms part of the eyepiece. When the GPS system determines that the wearer is within a certain distance of the desired viewing area (for example, 10 meters), the message is then sent 2007 to the viewer. In an embodiment, the message then appears as an email or text message to the recipient, or, if the recipient is wearing an eyepiece, the message may appear in the eyepiece. Because the message is sent to the person based on the person's location, the message may, in a sense, be displayed as "graffiti" on a building or feature at or near the specified location. Specific settings may be used to determine whether all passersby through the "viewing area" can see the message, or whether only a specific person, group, or device with a unique identifier can see it. For example, a soldier who has cleared a village may virtually mark a house as cleared by associating a message or identifier with the house, such as a large X marking the location of the house. The soldier may indicate that only other U.S. soldiers can receive the location-based content. As other U.S. soldiers pass the house, they may automatically receive an indication, such as by seeing a virtual "X" on the side of the house (if they have an eyepiece or some other augmented-reality-enabled device), or by receiving a message indicating that the house has been cleared. In another example, content related to safety applications, such as warnings, target identification, communications, and the like, may be streamed to the eyepiece.
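The posting and delivery steps 2001-2007 above can be sketched as a simple geofenced message store. This is an illustrative sketch only; the data fields, the distance approximation, and the group-restriction mechanism are assumptions, not details from the disclosure.

```python
import math

# Illustrative in-memory message store; field names are assumptions.
posted_messages = []

def post_message(text, lat, lon, radius_m=10.0, allowed_group=None):
    """Steps 2001/2003: anchor a message to a GPS location with a viewing area."""
    posted_messages.append({
        "text": text, "lat": lat, "lon": lon,
        "radius_m": radius_m, "allowed_group": allowed_group,
    })

def on_position_update(lat, lon, viewer_group=None):
    """Steps 2005/2007: called on each eyepiece GPS update; returns messages
    whose viewing area now contains the viewer and whose restriction matches."""
    visible = []
    for msg in posted_messages:
        if msg["allowed_group"] is not None and viewer_group != msg["allowed_group"]:
            continue  # restricted message, e.g. only friendly soldiers see the mark
        if _dist_m(lat, lon, msg["lat"], msg["lon"]) <= msg["radius_m"]:
            visible.append(msg["text"])
    return visible

def _dist_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; adequate at tens-of-metres scale.
    r = 6371000.0
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)
```

In the village-clearing example, a "house cleared" message would be posted with `allowed_group` set so that only the intended soldiers receive it as they pass.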
Embodiments may provide a way to view information associated with products, such as products in a store. The information may include nutritional information for food products, care instructions for clothing products, technical specifications for consumer electronics products, electronic coupons, promotions, price comparisons with other similar products, price comparisons with other stores, and the like. This information may be projected in a position relative to the product, in the wearer's peripheral field of view, in relation to the store layout, and the like. The product may be identified visually, through a SKU, a brand tag, and the like; transmitted by the product packaging, such as through an RFID tag on the product; or transmitted by the store, such as based on the wearer's position in the store relative to the product, and the like.
For example, a viewer may be walking through a clothing store and, as they walk, may be provided with information on the clothes on the rack, where the information is provided through the products' RFID tags. In embodiments, the information may be delivered as a list of information, as a graphic representation, as an audio and/or video presentation, and the like. In another example, the wearer may be shopping for food, and an advertisement-providing facility may provide the wearer with information associated with products in the wearer's proximity; the information may be provided when the wearer picks up a product and views the brand, product name, SKU, and the like. In this way, the wearer may be provided with a more informative environment in which to shop effectively.
One embodiment may allow the user to receive or share information about shopping or an urban area through the use of augmented-reality-enabled devices, such as the camera lens fitted in the eyepiece of exemplary sunglasses. These embodiments use augmented reality (AR) software applications, such as those mentioned above in conjunction with searching for providers of goods and services. In one scenario, the wearer of the eyepiece may walk down a street or through a market for shopping purposes. Further, the user may activate various modes that help define user preferences for a particular scenario or environment. For example, the user may enter a navigation mode through which the wearer is guided across the streets and the market to shop for preferred accessories and products. The mode may be selected, and various directions may be provided by the wearer, through various methods, such as text commands, voice commands, and the like. In an embodiment, the wearer may give a voice command to select the navigation mode, which may result in an augmented display in front of the wearer. The augmented information may depict information about the locations of the various shops and vendors in the market, the offers being provided by the various shops and vendors, current happy hours, the current date and time, and the like. Various sorts of options may also be displayed to the wearer. The wearer may scroll through the options while walking down the street, guided by the navigation mode. Based on the options provided, the wearer may select the place best suited to his shopping, based on such things as offers and discounts. In embodiments, the eyepiece may provide the ability to search, browse, select, save, and share items for purchase (such as viewed through the eyepiece), to receive advertisements for items for purchase, and the like. For example, the wearer may search for and purchase items on the Internet without making a phone call (such as through an app store, an e-commerce application, and the like).
The wearer may give a voice command to navigate toward a place, and the wearer may then be guided to it. The wearer may also receive advertisements and offers, automatically or on request, about current deals, promotions, and events at a location of interest, such as a nearby shopping store. The advertisements, deals, and offers may appear in the wearer's proximity, and options may be displayed for purchasing desired products based on the advertisements, deals, and offers. The wearer may, for example, select a product and purchase it through a Google checkout. A message or email, similar to that depicted in Figure 7, may appear on the eyepiece with information that the transaction for the purchase of the product has been completed. The product delivery status/information may also be displayed. The wearer may further notify or alert friends and relatives about the offers and events through social networking platforms, and may also ask them to join.
In embodiments, the user may wear a head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user can view the surrounding environment and displayed content. The displayed content may comprise one or more local advertisements. The location of the eyepiece may be determined by an integrated location sensor, and the local advertisement may have relevance to the location of the eyepiece. By way of example, the user's location may be determined via GPS, RFID, manual input, and the like. Further, the user may be walking past a coffee shop and, based on the user's proximity to the shop, an advertisement showing the shop's brand (similar to the brand 1900 depicted in Figure 19, such as a fast-food or coffee brand) may appear in the user's field of view. The user may experience similar types of local advertisements as he or she moves about the surrounding environment.
In other embodiments, the eyepiece may contain a capacitive sensor capable of sensing whether the eyepiece is in contact with human skin. The sensor may be a capacitive sensor, a resistive sensor, an inductive sensor, an EMF sensor, and the like. Such a sensor, or group of sensors, may be placed on the eyepiece and/or eyepiece frame in such a manner as to allow detection of when the glasses are being worn by the user. In other embodiments, sensors may be used to determine whether the eyepiece is in a position such that it may be worn by the user, for example, when the eyepiece is in the unfolded position. Furthermore, local advertisements may be sent only when the eyepiece is in contact with human skin, in a wearable position, a combination of the two, actually worn by the user, and the like. In other embodiments, the local advertisement may be sent in response to the eyepiece being powered on, or in response to the eyepiece being powered on and worn by the user, and the like. By way of example, an advertiser may choose to send local advertisements only when the user is in proximity to a particular establishment and only when the user is actually wearing the glasses and they are powered on, allowing the advertiser to target the advertisement to the user at the appropriate time.
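The delivery gating just described can be expressed as a small predicate. This is a sketch under stated assumptions: the boolean inputs stand in for the skin-contact sensor, fold/position sensor, power state, and proximity trigger, and the default policy shown (power + proximity + worn) is only one of the combinations the text allows.

```python
def should_deliver_ad(powered_on, skin_contact, wearable_position, near_facility,
                      require_worn=True):
    """Gate local-ad delivery on eyepiece state.

    Models the policy described above: delivery requires power and proximity,
    and, when `require_worn` is set, evidence the eyepiece is worn or wearable
    (skin contact from a capacitive/resistive/inductive/EMF sensor, or an
    unfolded, wearable position).
    """
    if not (powered_on and near_facility):
        return False
    if require_worn:
        return skin_contact or wearable_position
    return True
```

An advertiser wanting the strictest targeting would leave `require_worn` enabled so ads reach only a powered-on, worn eyepiece near the establishment.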
In accordance with other embodiments, the local advertisement may be displayed to the user as a banner, a two-dimensional graphic, text, and the like. Further, the local advertisement may be associated with a physical aspect of the user's view of the surrounding environment. The local advertisement may also be displayed as an augmented reality advertisement, wherein the advertisement is associated with a physical aspect of the surrounding environment. Such an advertisement may be two-dimensional or three-dimensional. By way of example, a local advertisement may be associated with a physical billboard (as described further in Figure 18), where the user's attention may be drawn to displayed content showing a beverage being poured from the billboard 1800 onto an actual building in the surrounding environment. The local advertisement may also contain sound that is presented to the user through an earpiece, an audio device, or other means. Further, the local advertisement may be animated in embodiments. For example, the user may view the beverage flow from the billboard onto an adjacent building and, optionally, into the surrounding environment. Similarly, an advertisement may display any other type of motion, as desired in the advertisement. Additionally, the local advertisement may be displayed as a three-dimensional object that may be associated with or interact with the surrounding environment. In embodiments where the advertisement is associated with an object in the user's view of the surrounding environment, the advertisement may remain associated with, or in proximity to, that object even as the user turns his head. For example, if an advertisement (such as the coffee cup described in Figure 19) is associated with a particular building, the coffee cup advertisement may remain associated with, and in place over, the building even as the user turns his head to view another object in his environment.
In other embodiments, local advertisements may be displayed to the user based on a web search conducted by the user, where the advertisement is displayed within the content of the web search results. For example, the user may search for "happy hour" as he walks down the street, and within the content of the search results a local advertisement may be displayed, advertising a local bar's beer prices.
Further, the content of the local advertisement may be determined based on the user's personal information. The user's information may be made available to a web application, an advertising facility, and the like. Further, a web application, an advertising facility, or the user's eyepiece may filter the advertising based on the user's personal information. Generally, for example, a user may store personal information about his likes and dislikes, and such information may be used to direct advertising to the user's eyepiece. As a specific example, the user may store data about his affinity for a local sports team, and, as advertisements become available, those advertisements featuring his favorite sports team may be given preference and pushed to the user. Similarly, the user's dislikes may be used to exclude certain advertisements from view. In embodiments, the advertisements may be cached on a server, where they may be accessed by at least one of an advertising facility, a web application, and the eyepiece, and displayed to the user.
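The likes/dislikes filtering above can be sketched as a short function. This is purely illustrative; the profile structure, tag-based matching, and ranking rule are assumptions rather than anything specified in the disclosure.

```python
# Hypothetical stored profile; the fields are illustrative assumptions.
profile = {"likes": {"local sports team", "coffee"}, "dislikes": {"tobacco"}}

def filter_ads(ads, profile):
    """Drop ads tagged with any dislike, then sort ads matching more likes
    first, mirroring the preference-based targeting described above."""
    allowed = [ad for ad in ads if not (set(ad["tags"]) & profile["dislikes"])]
    return sorted(allowed,
                  key=lambda ad: -len(set(ad["tags"]) & profile["likes"]))
```

This filter could run on the server, in a web application, or on the eyepiece itself, matching the three filtering locations the text names.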
In embodiments, the user may interact with local advertisements of any type in numerous ways. The user may request additional information related to a local advertisement by making at least one of an eye movement, a body movement, and another gesture. For example, if an advertisement is displayed to the user, he may wave his hand over the advertisement in his field of view, or move his eyes over the advertisement, in order to select that particular advertisement and receive more information about it. Moreover, the user may choose to ignore the advertisement through any movement or control technology described herein (such as an eye movement, a body movement, another gesture, and the like). Further, the user may choose to ignore the advertisement by allowing it to be ignored by default, by not selecting the advertisement for further interaction within a given period of time. For example, if the user chooses not to gesture for more information within five seconds of the advertisement being displayed, the advertisement may be ignored by default and disappear from the user's view. Furthermore, the user may choose not to allow local advertisements to be displayed at all, whereby the user selects such an option on a graphical user interface or turns such a feature off via a control on the eyepiece.
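The default-dismissal behaviour can be sketched with a small timer-based class. The five-second timeout comes from the example above; the class shape, the injected clock, and the gesture hook are assumptions for illustration.

```python
import time

class DisplayedAd:
    """Minimal sketch of an on-screen ad that disappears unless selected."""

    def __init__(self, text, timeout_s=5.0, now=time.monotonic):
        self._now = now          # injectable clock, eases testing
        self.text = text
        self.timeout_s = timeout_s
        self.shown_at = now()
        self.selected = False

    def on_gesture(self):
        """An eye movement, hand wave, etc. selects the ad for more info."""
        self.selected = True

    def visible(self):
        """The ad stays visible once selected; otherwise it is ignored by
        default and vanishes after the timeout elapses."""
        return self.selected or (self._now() - self.shown_at) < self.timeout_s
```

A renderer would poll `visible()` each frame and remove the ad from the wearer's view when it returns false.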
In other embodiments, the eyepiece may include an audio device. Accordingly, the displayed content may comprise a local advertisement and audio, such that the user can also hear a message or other sound effects as they relate to the local advertisement. By way of example, and referring again to Figure 18, while the user sees the beer being poured, he will actually be able to hear an audio transmission corresponding to the actions in the advertisement. In this case, the user may hear the bottle cap being opened and then the sound of the liquid pouring out of the bottle and onto the rooftop. In other embodiments, a descriptive message may be played, or general information may be given as part of the advertisement, or both. In embodiments, any audio may be played as desired for the advertisement.
In accordance with another embodiment, social networking may be facilitated through the use of augmented-reality-enabled devices, such as the camera lens fitted in the eyepiece. This may be used to connect several users, or other persons who may not have an augmented-reality-enabled device, so that they can share thoughts and ideas with each other. For instance, the wearer of the eyepiece may be sitting on a school campus along with other students. The wearer may connect with, and send a message to, a first student who is present in a coffee shop. The wearer may ask the first student about persons interested in a particular subject, such as environmental economics, for example. As other students pass through the wearer's field of view, the camera lens fitted inside the eyepiece may track the students and match them to a networking database, such as "Google me," that may contain public profiles. Profiles of interested and relevant persons from the public database may appear and pop up in front of the wearer on the eyepiece. Some of the profiles that are less relevant may be blocked, or may appear blocked to the user. The relevant profiles may be highlighted for the wearer's quick reference. The relevant profiles selected by the wearer may belong to persons interested in the subject of environmental economics, and the wearer may connect with them. Further, they may also be connected with the first student. In this manner, a social network may be established by the wearer using the eyepiece enabled with augmented reality features. The social networks managed by the wearer, and the conversations therein, may be saved for future reference.
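The relevant/blocked profile split in this scenario can be sketched as follows. The database records and interest fields here are wholly hypothetical placeholders, used only to illustrate the matching step against a public-profile database.

```python
# Hypothetical public-profile records; names and fields are illustrative.
PUBLIC_PROFILES = [
    {"name": "Student A", "interests": {"environmental economics", "chess"}},
    {"name": "Student B", "interests": {"music"}},
]

def match_profiles(topic, profiles=PUBLIC_PROFILES):
    """Split recognized persons' public profiles into a relevant set (to be
    highlighted on the eyepiece) and an irrelevant set (to be blocked)."""
    relevant = [p["name"] for p in profiles if topic in p["interests"]]
    blocked = [p["name"] for p in profiles if topic not in p["interests"]]
    return relevant, blocked
```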
The present disclosure may be applied in a real estate scenario through the use of augmented-reality-enabled devices, such as the camera lens fitted in the eyepiece. In accordance with this embodiment, the wearer may want information about a place at which the user is present at a particular time, such as while driving, walking, jogging, and the like. The wearer may, for instance, want to understand the residential benefits and drawbacks of the place. He may also want detailed information about the facilities in the place. Therefore, the wearer may utilize a map, such as a Google online map, to identify real estate available there for lease or purchase. As noted above, the user may receive information about real estate for sale or rent using a mobile Internet application such as Layar. In one such application, information about buildings within the user's field of view is projected onto the inside of the glasses for the user's consideration. Options may be displayed to the wearer for scrolling, such as with a trackpad mounted on the frame of the eyepiece. The wearer may select and receive information about a selected option. Augmented-reality-enabled scenes of the selected option may be displayed to the wearer, and the wearer may view pictures and take a facility tour in the virtual environment. The wearer may further receive information about real estate agents and fix a meeting with one of them. An email notification or a call notification may also be received on the eyepiece to confirm the appointment. If the wearer finds the selected real estate worthwhile, a deal may be made and it may be purchased by the wearer.
In accordance with another embodiment, customized and sponsored tours and travel may be enhanced through the use of augmented-reality-enabled devices, such as the camera lens fitted in the eyepiece. For instance, the wearer (as a tourist) may arrive in a city, such as Paris, and want to receive tour and sightseeing information about the place so as to plan his visit for the following days of his stay. The wearer may put on his eyepiece, or operate any other augmented-reality-enabled device, and give a voice or text command regarding his request. The augmented-reality-enabled eyepiece may locate the wearer's position and determine the wearer's tourist preferences through geo-sensing techniques. The eyepiece may receive and display customized information on the screen, in accordance with the wearer's request. The customized tour information may include, without limitation, information about: art galleries and museums, monuments and historical places, shopping complexes, entertainment and nightlife spots, restaurants and bars, the most popular tourist destinations and centers/attractions of tourism, the most popular local/cultural/regional destinations and attractions, and the like. Based on the user's selection of one or more of these categories, the eyepiece may prompt the user with other questions, such as the length of stay, the tourism budget, and the like. The wearer may respond through voice commands and in return receive customized tour information, in the order selected by the wearer. For example, the wearer may give art galleries priority over monuments. Accordingly, the information is made available to the wearer. Further, a map may also appear in front of the wearer with different sets of tour options ranked by priority, such as:
Priority Rank 1: First tour option (Champs Elysees, the Louvre, the Rodin Museum, a famous café)
Priority Rank 2: Second option
Priority Rank 3: Third option
The wearer may, for instance, select the first option, since it is ranked as having the highest priority based on the preferences the wearer indicated. Advertisements related to sponsors may pop up right after the selection. Subsequently, a virtual tour may begin, in an augmented reality manner very close to the real environment. The wearer may, for example, take a 30-second tour of a vacation special to the Atlantis resort in the Bahamas. The virtual 3D tour may include a quick look at the rooms, the beach, public spaces, parks, facilities, and the like. The wearer may also experience the shopping facilities in the area and receive offers and discounts at those places and shops. At the end of the day, the wearer may have experienced a whole day's tour while sitting in his room or hotel. Finally, the wearer may decide on and schedule his plan accordingly.
Another embodiment may allow information concerning automobile repair and maintenance services through the use of augmented-reality-enabled devices, such as the camera lens fitted in the eyepiece. The wearer may receive advertisements related to automobile repair shops and dealers by sending a voice command for the request. The request may, for example, include a requirement for an oil change in the vehicle/car. The eyepiece may receive information from the repair shop and display it to the wearer. The eyepiece may display a 3D model of the wearer's vehicle through augmented-reality-enabled scenes/views and show the amount of oil left in the car. The eyepiece may show other relevant information about the wearer's vehicle, such as maintenance needs of other parts, like the brake pads. The wearer may see a 3D view of the worn brake pads and may be interested in getting them repaired or changed. Accordingly, the wearer may schedule an appointment with a vendor to fix the problem via the integrated wireless communication capability of the eyepiece. The confirmation may be received via an email, or an incoming call alert, on the eyepiece camera lens.
According to another embodiment, buy the equipment (such as the camera lens being provided in eyepiece) that present can enable augmented reality by use and be benefited.Wearer can announce the present request for certain situation by text or voice command.The preference that eyepiece can point out wearer to answer him, such as cost scope of present type, the age group that receives the people of this present, present etc.Variety of option can be presented to user according to the preference receiving.For example, the option that is presented to wearer can be: cookies basket, grape wine and cheese basket, chocolate assorted, golfer's present basket etc.
The wearer may scroll through the available options and select the best option through a voice or text command. For example, the wearer may select the golfer's gift basket. A 3D view of the golfer's gift basket and a golf course may then appear before the wearer. The virtual 3D view of the gift basket and golf course, enabled through augmentation, may be perceived as very close to the real-world environment. The wearer may finally respond to the address, location, and other similar queries prompted through the eyepiece. Confirmation may then be received through an email notification on the eyepiece lens or through an incoming call alert.
In embodiments, the eyewear platform may interact with content and conduct electronic commerce transactions using various control mechanisms in combination with physical and informational inputs, processing facilities executing on control panels and surfaces, and systems including feedback-loop-based systems. While such e-commerce and content scenarios are numerous, some such scenarios include, but are not limited to, retail shopping environments, educational environments, transportation environments, home environments, event environments, dining environments, and outdoor environments. While these areas are described herein, various other scenarios will be evident to one skilled in the art.
In embodiments, the eyewear platform may be used in a retail shopping environment. For example, a user may use the glasses to receive content relevant to items and/or environments of interest. The user may receive and/or search for pricing information in the retail shopping environment, alternative offers, product information (such as SKUs/bar codes), ratings, advertisements, GroupOn offers, and the like. In embodiments, the user may find or obtain location information for a particular item. The user may also obtain information about loyalty programs associated with a particular brand, item, and/or shopping environment. Further, the user may scan items into a shopping basket with glasses equipped with a camera, scanner, QR reader, and the like. In addition, the user may use the eyepiece to detect the best item within a group of items. As an example, the user may activate a particular mode of the glasses to image an item's features, such as using a program to determine or sense the density or thickness of an item to find the best one in a bunch. In embodiments, the user may use the glasses to negotiate the price of an item or offer a price of his preference. For example, virtually or after scanning the item, such as with a scanner associated with the glasses, the user may make a gesture, move his eyes, use a voice command, or use another means to provide the price he would pay for the item. The user may further use the glasses to order an item that has been scanned and then pay via a payment method displayed or provided through a user interface. Such a payment may be indicated by hand gestures, eye movements, and the like, as described herein. Similarly, the user may redeem "points" or rewards, such as through GroupOn, during her shopping trip, and receive promotions related to particular items and/or facilities. Further, the user may use the glasses for image recognition, so that an item may be identified in an interface and an order may be placed for the item. For example, a program used by the glasses may allow the user to identify a watch in a storefront with the glasses, thereby triggering a menu in the interface for placing an order for the item once the item is identified. In other embodiments, information may be entered into the eyewear platform by scanning bar codes, QR codes, product labels, and the like. As the user moves around or engages a retail interface while using the glasses in a retail environment, promotional information (such as deals, signs, advertisements, coupons, and the like) may be scanned or otherwise received or recognized by the glasses. The user may scan a loyalty card with the glasses for use in a transaction, or otherwise enter such information for use during a retail transaction. In embodiments, the glasses may assist with navigation and guidance. For example, the user may be presented with a detailed map of the store, and aisle content notations may be provided, allowing the user to better navigate to items and within the retail environment. The user may capture or download product images from the real environment, so that an image may be used to purchase the item, create notes, generate a rating for the item, or receive ratings, reviews, product information, and the like for the item. Further, an application using the image of an object together with the geographic location of the glasses may allow the user to receive the nearest location of an item, local reviews of the item, and the like. In embodiments, the user's geographic location may allow a particular object image to be generated or more appropriately identified.
As a more specific example, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes a module for determining that the eyepiece is proximate to a retail environment, and the system may include an optical assembly through which the user views the surrounding retail environment, a 3-D processing module for identifying a feature of the environment and rendering a 3-D display of advertising content for the retail location on the head-mounted eyepiece, an image-processing module for capturing and processing an image of an environment of the wearer of the head-mounted eyepiece, and an integrated image source for introducing the content to the optical assembly, where the integrated image source displays the 3-D rendering as an overlay on the environment, and where the integrated image source presents the 3-D display of advertising content relevant to the retail environment. In embodiments, the 3-D processing module may lock a display element to an identified feature of the environment, and content may be presented in the display in relation to the identified feature. In embodiments, the 3-D rendering of the advertisement may be a result of at least one of scanning a bar code, a QR code, a product label, and the like. It may also be a result of: purchasing a product, entering an image of a product into the eyepiece, entering a retail environment (or moving into a location of the retail environment), the user fixing his eyes on a product, and entering loyalty plan information into the eyepiece. In embodiments, a second 3-D display may be rendered as a result of at least one of scanning a product, purchasing a product, entering a location of a retail environment, and the like. In embodiments, the user may conduct an e-commerce transaction through the eyepiece, and the transaction may include purchasing a scanned item, selecting an item based on a comparison with other items, negotiating a price, redeeming points, redeeming a promotion, ordering an item, and the like. In embodiments, the advertisement may include the location of an item proximate to the user. The location of the item may be displayed relative to the user's location, and the user may be given directions to the item. In embodiments, the eyepiece may be used for social networking, and the eyepiece may employ facial recognition of another user in the retail environment. Further, the eyepiece may be used to identify the presence of a person in the environment and present social networking content relevant to the relationship between the wearer and an acquaintance. Further, the user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the user may compare prices of items through the advertisement. In embodiments, the advertisement may include audio content. In embodiments, identifying a feature may include at least one of: automated processing of an image including the feature, viewing the feature, communicating with the feature via a signal, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, the user designating information about the feature, and the like. Further, the user may designate a feature for holding overlaid content by interacting with a user interface of the eyepiece. In embodiments, the overlay may render content on or adjacent to the identified feature, and in further embodiments, the identified feature may be at least one of: an item to be purchased, an item for sale, a sign, an advertisement, an aisle, a location in a store, a kiosk, an information desk, a cash register, a television, a screen, a shopping cart, and the like.
In embodiments, the glasses may be used in an educational environment. As an example, the glasses may display e-learning content, such as that found in a textbook or otherwise. The glasses may allow a user to view, study, and review items for a test. In embodiments, the user may be monitored while taking an exam. The glasses may time the user as he progresses through material, and may track the user's responses so as to adjust the examination according to the progress of the user's answers and/or testing. In further embodiments, the user may view augmented reality (AR) overlays through the glasses. In embodiments, the AR overlays may include step-by-step instructional media in a laboratory or classroom. In embodiments, a virtual professor may be displayed, allowing interaction through video, audio, and conversation. The user may view blackboard/whiteboard notes through the glasses, and he may enter additional items onto the board; these additional items may be shared with other users when they view the blackboard/whiteboard in a user interface, or AR notes may be added and/or overlaid when a user views a specific blackboard/whiteboard, so that other users see them when viewing the real board. In embodiments, the glasses may provide a social networking platform to the members of a class or educational course, and provide social networking content about the members of the class.
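The adaptive-examination idea above — adjusting the exam as the user's answers come in — can be sketched as a difficulty level that steps up after correct answers and down after mistakes. The step size and bounds below are invented for illustration.

```python
# Hedged sketch of adaptive examination: difficulty rises with correct
# answers and falls with mistakes, clamped to a fixed range. The step
# size and bounds are illustrative assumptions.

def next_difficulty(current, was_correct, lo=1, hi=10):
    """Step the difficulty level up or down, clamped to [lo, hi]."""
    step = 1 if was_correct else -1
    return max(lo, min(hi, current + step))

level = 5
for correct in [True, True, False, True]:
    level = next_difficulty(level, correct)
print(level)  # 5 -> 6 -> 7 -> 6 -> 7
```

A production system would also weigh response timing, which the patent mentions the glasses can track, but the feedback loop has the same shape.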
In embodiments, the glasses may be used in conjunction with commerce in an educational environment. As an example, a user may purchase courses with the glasses or otherwise track content progress and course credit. Further, the user may monitor exam and test rankings and upcoming exam and test administration dates, the user may download course credit/class information, the user may capture assignments that are discussed in class, listed on a syllabus, or otherwise obtained and add the same content to a calendar, and the user may meet with friends or class members by communicating with others via the glasses. In embodiments, the user may view his bills and tuition reports for review and tracking. In embodiments, the user may purchase a course to do the same, or he may use a course where the course provides advertisements associated with it.
In further embodiments, a user may use the glasses in an educational environment. The user may scan exams/tests, papers, assignments, and the like with the glasses for review. The user may scan or otherwise capture data associated with textbook content, manuals and/or books, and blackboard/whiteboard content for note taking and assignment tracking. The user may scan or capture data associated with posters/signs. In this way, the user may keep track of upcoming student meetings, listed descriptions, meeting places, and the like. In embodiments, the user may capture the faces of classmates, friends, people of interest, and the like. In embodiments, the glasses may track the user's eye movements to verify interaction with content. In embodiments, the glasses may enable "Lifestride" or other functions for content capture and the like. The user may take notes with a pen that communicates with the glasses through its movements, and the glasses may store the user's notes. In other embodiments, the user may make gestures and the glasses may record notes according to such gestures, and in further embodiments, another sensor associated with the user's hand may allow notes to be recorded by the glasses as the user writes them.
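The gesture-triggered note capture described above can be sketched as filtering a stream of gesture events: only events recognized as a "write" gesture contribute text to the notes buffer. The gesture names and event format are assumptions made for this example, not part of the patent.

```python
# Sketch of gesture-triggered note capture: events recognized as a
# "write" gesture append their text to a notes list. The gesture
# vocabulary and event structure are illustrative assumptions.

def capture_notes(events):
    """Collect note text from events whose gesture is 'write'."""
    notes = []
    for event in events:
        if event.get("gesture") == "write":
            notes.append(event.get("text", ""))
    return notes

stream = [
    {"gesture": "point"},                          # ignored
    {"gesture": "write", "text": "Exam on Friday"},
    {"gesture": "write", "text": "Read chapter 4"},
]
print(capture_notes(stream))  # ['Exam on Friday', 'Read chapter 4']
```

In the pen-sensor variant the events would carry recognized handwriting instead of dictated text, but the filtering step is the same.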
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes a module for determining that the eyepiece is proximate to an educational environment. Further, the system may include: an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering education-related content relevant to the environment; an image-processing module for capturing and processing an image of an environment of the wearer of the head-mounted eyepiece, where the image-processing module may lock a display element to an identified feature of the environment; and an integrated image source for introducing the content to the optical assembly, where the integrated image source renders the education-related content as an overlay on the environment, and where the content may be presented in the display in relation to the identified feature. In embodiments, the integrated image source may present a display of content relevant to the environment, and such a relation to the identified feature may not be presented. In embodiments, the rendering of educational content may be a result of scanning a bar code, a QR code, and the like, and it may be a result of: entering an image of a textbook into the eyepiece, entering an image of a handout into the eyepiece, identifying a sign in the environment, and entering a location of an educational environment. In embodiments, the educational environment may be a classroom, a gymnasium, an auto garage, a garage, an outdoor environment, a fitness center, a laboratory, a factory, a place of business, a kitchen, a hospital, and the like. Further, the educational content may be text, textbook excerpts, instructions, video, audio, laboratory protocols, chemical structures, 3-D images, 3-D overlays, text overlays, class books, tests, prescriptions, class notes, medical histories, client files, safety instructions, and daily exercises. In embodiments, the educational content may be associated with or overlaid on an object in the environment. In embodiments, the object may be a whiteboard, a blackboard, a machine, an automobile, an aircraft, a patient, a textbook, a projector, and the like. In embodiments, the system may be used for social networking, and may further employ facial recognition of at least one of a classmate, a teacher, and the like in the environment. In embodiments, the user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the user may interact with content to take an exam, complete an assignment, view a syllabus, view course items, practice a skill, track course progress, track credits, take notes, record notes, submit a question, and the like. In embodiments, the overlay may render content on or adjacent to the identified feature. Further, the identified feature may be at least one of: a poster, a blackboard, a whiteboard, a screen, a machine, an automobile, an aircraft, a patient, a textbook, a projector, a monitor, a desk, a smart board, and the like. As an example, notes may appear in a display framed by the blackboard; a movie may appear in a display directed to the screen; a molecule display may appear on the blackboard, and the like. In embodiments, identifying a feature may include at least one of: automated processing of an image including the feature, viewing the feature, communicating with the feature via a signal, identifying the feature by processing the location of the feature, retrieving information about the feature from a database by processing the location of the feature, the user designating information about the feature, and the like. Further, the user may designate a feature for holding overlaid content by interacting with a user interface of the eyepiece.
In embodiments, the glasses may be used in a transportation environment. As an example, a user may retrieve or capture transportation-related content, such as schedules, availability, delays, and cancellations. For example, when the user arrives at the airport, he may check his flight information through the glasses and see whether his flight will be on time or delayed. In embodiments, the user may view his seat/seat selection and select snack and meal preferences. He may check in through the glasses, and in embodiments, he may change or upgrade his seat selection through the glasses. In embodiments, a user who is a pilot may be given a step-by-step walkthrough of an FAA-required pre-flight checklist. In embodiments, a train conductor, a pilot, or the like may be given guidance and navigation instructions while operating the train, aircraft, or the like. In other embodiments, a user who is a passenger may review safety information through the glasses; for example, the user may view pre-flight safety instructions in which he is shown how to operate emergency equipment, and the like. In embodiments, the user may make reservations for supporting items with the glasses, such as a rental car, a hotel, and the like. In embodiments, the user may book in-person tours, and/or he may take a virtual tour of a region of interest through the glasses. He may view the surroundings of the destination to which he will travel, so that he is familiar with the region before arriving. In other embodiments, he may also view and compare various deals. The user may view and/or receive loyalty content, such as rewards available for a particular account, what points he can redeem and for what items, and the like. The user may use the glasses to redeem points when booking a flight, renting a car, booking a hotel, and the like. In embodiments, the user may use the glasses in a travel or transportation environment for networking purposes. For example, the user may find out who the people on his particular flight or train are. The user may also view entertainment content with the glasses during transit. As an example, an in-flight movie may be streamed to the user's glasses. In embodiments, the user may view content relevant to various locations, and he may view AR landmarks and the like. As an example, as the train or aircraft passes a view, the user may view AR overlays (such as landmarks) associated with items of interest in the particular region. In embodiments, the user may receive advertisements through billboards/signs during his transit. Further, the user may receive personal information relevant to the transportation professionals involved in his transport. As an example, the user may receive information about a taxi driver's record relevant to that driver, or he may view a pilot's accident and/or violation record, which may reflect a safety rating of the pilot.
Further, a user may use the glasses in a transportation environment in connection with commerce. For example, the user may use the glasses to book seats, redeem reward points to book seats, arrange and pay for in-transit meals, and the like. The user may find flights and book/pay for them, rent cars, book hotels, taxis, buses, and the like. The user may network with people related to his travels, such as other passengers. In embodiments, the user may navigate with the glasses. For example, the user may be given bus and taxi maps presenting the best routes and methods for moving freely about a city. The user may pay for an application for such purposes and/or view advertisements associated with an application for the same purpose. The user may interact with AR content at and around landmarks during his travels, and interact with AR-based advertisements and promotions from billboards, signs, and the like.
In embodiments, a user may enter items into the eyewear platform in a transportation environment. For example, he may scan his ticket with the glasses to begin a check-in process. He may be provided with a dashboard that displays speed, fuel, and GPS position during his transit. The glasses may communicate with the vehicle's IT system via Bluetooth to display the dashboard and provide information about the vehicle and/or mode of transportation. In embodiments, the user may use the glasses to recognize the faces of other passengers and/or store images of other passengers by entering the images into the glasses. The user may enter landmark-related content into the glasses for interaction or to create a database of such content for future recall. In embodiments, the user may enter billboards/signs, whether AR-based or not, for storage and for interaction with them.
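The in-transit dashboard can be sketched as a formatting step over telemetry received from the vehicle's IT system (for example, over Bluetooth): the raw fields become the speed/fuel/GPS panel described above. The field names and units below are assumptions made for the example.

```python
# Illustrative sketch of the speed/fuel/GPS dashboard: telemetry
# fields from a vehicle (field names and units assumed) are formatted
# into a one-line HUD panel string.

def format_dashboard(telemetry):
    """Render a one-line HUD string from a telemetry dictionary."""
    return ("speed {speed_kph} km/h | fuel {fuel_pct}% | "
            "GPS {lat:.4f},{lon:.4f}").format(**telemetry)

sample = {"speed_kph": 88, "fuel_pct": 64, "lat": 42.3601, "lon": -71.0589}
print(format_dashboard(sample))
# -> speed 88 km/h | fuel 64% | GPS 42.3601,-71.0589
```

The real transport would be a Bluetooth profile rather than a local dictionary; the rendering step shown here is the part the eyepiece display would own.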
Further, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes: a module for determining that the eyepiece is proximate to a transportation environment; an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering transportation-related content relevant to the transportation environment; an image-processing module for capturing and processing an image of an environment of the wearer of the head-mounted eyepiece, where the image-processing module may lock a display element to an identified feature of the environment; and an integrated image source for introducing the content to the optics assembly, where the integrated image source renders the transportation content as an overlay on the environment, and where the content may be presented in the display in relation to the identified feature. In embodiments, the integrated image source may present a display of transportation content relevant to the transportation environment, and such a relation to the identified feature may not be presented. In embodiments, the rendering of transportation content may be a result of: scanning a bar code, a QR code, a ticket, or the like, entering an image of a ticket for transportation, and entering a train, a train station, a taxi stand, a taxi, an airport, an aircraft, a ship, a platform, a subway, a subway station, and the like. In embodiments, the transportation content may be at least one of text, video, audio, a 3-D image, a 3-D overlay, a text overlay, a guide, a schedule, a map, navigation, an advertisement, a location of a point of interest, auxiliary resources, safety instructions, flight instructions, an operator checklist, FAA information, flight information, arrival and departure airport information, an itinerary, and the like. In embodiments, the auxiliary resources may include resources for making a hotel reservation, making a car rental reservation, making a dinner reservation, noting personal preferences, changing a seat selection, finding local entertainment, arranging a local tour, and the like. In embodiments, the user may use the eyepiece to purchase a ticket for a flight, a ship passage, or a train trip; the user may purchase a pass for riding the subway, view schedules, compare travel prices, retrieve directions, retrieve transit routes, consult a current location on a map, view high-traffic routes for a mode of transportation, and the like. In embodiments, the content may be associated with a vehicle for displaying information about the vehicle, where such information includes emergency exit information, maintenance information, operation information, dashboard information, model information, and the like. The system may be used for social networking, and the system may employ facial recognition of a traveler, an operator, or the like in the environment. The user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the eyepiece may be used to identify the presence of a person in the environment and present social networking content relevant to the relationship between the wearer and an acquaintance. In embodiments, the user may interact with a displayed advertisement to obtain additional information. In embodiments, the content may augment the environment, including any of the following content: visual instructions, audio instructions, visual markers, and overlaid route planning for various reasons (including escaping from the environment in case of an emergency, and the like). In embodiments, the overlay may render content on or adjacent to the identified feature. Further, the identified feature may be at least one of: a poster, a train, an aircraft, a taxi, a ship, a subway, a screen, a kiosk, a map, a window, and a wall. In embodiments, identifying a feature may include at least one of: automated processing of an image including the feature, viewing the feature, communicating with the feature via a signal, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, the user designating information about the feature, and the like. Further, the user may designate a feature for holding overlaid content by interacting with a user interface of the eyepiece.
In embodiments, the eyewear platform may be used in a home environment. In embodiments, the glasses may be used in conjunction with content. For example, the glasses may be used for entertainment, where the user watches media at home. The glasses may also be used for shopping, such as to generate a grocery list and the like, and to take inventory of needed and stored items and view such content. The user may use the glasses for household coordination, such as by paying bills via the glasses, making a list of household tasks to complete, and the like. For example, the glasses may be used to make and keep doctor appointments, track an upcoming soccer game, and the like. The glasses may be used for program guidance. As an example, the glasses may be used to guide the user in controlling electronic devices, such as a DVD player, a VCR, a remote control, and the like. Further, the glasses may be used for security and/or safety. The user may activate an alarm system, or verify that it is on, when at home or away from home. The user may view home cameras, turn the home's lights on when leaving and turn them off, and the like. The user may be given instructions for emergencies; for example, the user may be given instructions on what to do during a fire, a hurricane, and the like. The user may activate features of the glasses described herein to see through smoke and the like. During such emergencies, the user may use the glasses to track family members and communicate with them. The glasses may provide assistance such as CPR guidance, 911 calling, and the like.
In embodiments, the glasses may be used in conjunction with commerce in the home environment. For example, the user may use the glasses to order food delivery, check on dry cleaning, schedule a dry cleaning pickup, and the like. The user may order entertainment content, such as movies, video games, and the like. In embodiments, the user may find and use instructional materials for home projects, pay bills, and the like. The user may view advertisements and/or promotions while at home and act on them. For example, if an advertisement is displayed in the glasses while the user is using a blender in the kitchen, the advertisement may prompt the user to find more information about a new blender, and the user may select the advertisement to learn more about the device.
In embodiments, the user may use the glasses in the home environment to enter information into the glasses. As an example, the user may enter paperwork for storage, memory, interaction, and the like. The user may enter shopping lists, bills, inventories, manuals, mail, and the like. The user may enter AR-enabled paper mail advertisements and advertisements from TV, radio, and the like. The user may scan a paper advertisement to view or receive additional AR information associated with the advertisement. The user may enter embedded symbols and/or identifiers, for example, to identify an appliance or other hardware. The user may enter Wi-Fi web content into the glasses. Further, the user may enter television content, such as screen and smart television content. In this way, the user may interact with such content through the eyewear platform. The user may enter remote control commands into the eyewear platform so that the user may operate various devices, such as a TV, a VCR, a DVD player, an appliance, and the like. Further, the user may enter security system content so that the user may interact with and control cameras associated with the security system, the security system itself, and the like. The user may view the various camera feeds associated with the security system, so that he may view the areas around the home environment through the eyewear platform. The glasses may be connected with such cameras via Bluetooth, via the Internet, via a Wi-Fi connection, and the like. The user may further set alarms, turn off alarms, check alarms, and interact with the alarms associated with the security system.
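The remote-control idea above can be sketched as command dispatch against a registry of home devices: an eyepiece command names a device and an action, and the hub applies it if both are known. The device names and the tiny on/off command vocabulary are invented for illustration.

```python
# Minimal sketch of eyepiece remote control: commands are dispatched
# by device name to a registry of home devices. Names and the command
# vocabulary are illustrative assumptions.

class DeviceHub:
    def __init__(self):
        self.devices = {}  # name -> state dict

    def register(self, name):
        """Add a device to the hub, initially powered off."""
        self.devices[name] = {"power": "off"}

    def command(self, name, action):
        """Apply a power command; reject unknown devices or actions."""
        if name not in self.devices or action not in ("on", "off"):
            return False
        self.devices[name]["power"] = action
        return True

hub = DeviceHub()
hub.register("living-room TV")
hub.command("living-room TV", "on")
print(hub.devices["living-room TV"]["power"])  # on
```

A real hub would translate accepted commands into infrared codes or network control messages (e.g. to a smart TV), but the eyepiece-facing dispatch layer can stay this simple.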
Further, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes: a module for determining that the eyepiece is proximate to a home environment; an optical assembly through which the user views the surrounding home environment; a processing module for identifying a feature of the environment and rendering home-related content relevant to the environment; an image-processing module for capturing and processing an image of an environment of the wearer of the head-mounted eyepiece, where the image-processing module may lock a display element to an identified feature of the environment; and an integrated image source for introducing the content to the optical assembly, where the integrated image source may render the home-related content as an overlay on the environment, and where the content may be presented in the display in relation to the identified feature. In embodiments, the integrated image source may present a display of content relevant to the environment, and such a relation to the identified feature may not be presented. In embodiments, the rendering of content may be a result of: entering the home, the user fixing his eyes on an item in the home, identifying a sign in the environment with the eyepiece, operating another device in the home, and the like. In embodiments, the content may include a user interface for operating a device such as: a VCR, a DVR, a satellite receiver, a set-top box, a video-on-demand device, an audio device, a video game console, an alarm system, a home computer, a heating and cooling system, and the like. In embodiments, the user may interact with the user interface through eye movements, hand gestures, nods, and the like. In embodiments, the content may allow the user to complete tasks such as: generating a shopping list, checking grocery inventory, paying bills, viewing bills, activating a device, operating lights, generating virtual communications for family members and/or others, ordering delivery services (such as dry cleaning, food, and the like), acting on an advertisement in the environment, and the like. In embodiments, the user may use the eyepiece to recognize the face of another person in or near the home environment. In embodiments, the content may include instructions in an emergency setting, and the instructions may be at least one of audio, video, video instructions, and the like. In embodiments, the content may augment the environment and may include any of the following content: visual instructions, audio instructions, visual markers, overlaid route planning for escaping the environment in case of an emergency, and the like. In embodiments, the content may be generated in response to embedded symbols, television audio and/or video content, advertisements, and the like. In embodiments, the content may be retrieved from a user manual stored in the eyepiece, downloaded from the Internet, and the like. The content may include 3-D advertisements, audio, video, text, and the like. In embodiments, identifying a feature may include at least one of: automated processing of an image including the feature, viewing the feature, communicating with the feature via a signal, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, the user designating information about the feature, and the like. In embodiments, the user may designate a feature for holding overlaid content by interacting with a user interface of the eyepiece. In embodiments, the overlay may render content on or adjacent to the identified feature. Further, the identified feature may be at least one of: an appliance, a note-writing station, a note pad, a calendar, a wall, an electronic device, a security system, a room, a door, a gateway, a key holder, and a fixture.
In embodiments, the user may use the glasses in an event environment. In various event environments, the user may interact with content using the eyewear platform. As an example, the user may view schedules, ticket information, and/or ticket/seat availability for events such as concerts, ball games, various entertainment performances, in-store business events, and the like. The user may view or otherwise interact with promotional information for an event. The user may view loyalty program content, such as points or reward values associated with an event. The user may be provided access to an event because of a loyalty program, in association with a loyalty program, or the like. The user may be provided the opportunity to view "bonus" material for the event because of a points status or the like. In embodiments, the user may view ancillary services and merchandise associated with the event, purchase those services and merchandise, and the like. In embodiments, the user may view AR content at the event, such as a first-down line, goal markers, access to players/performers, and the like. In embodiments, the user may view optional video feeds, such as a sideline view when the user is at another location in the stadium, backstage views/video feeds, and the like.
In embodiments, the glasses may be used in connection with commerce in the event environment. As an example, the user may purchase/reserve tickets, view seat selection/availability, and the like. The user may reserve supporting items, such as purchasing a backstage pass, upgrading his seat, and the like. In embodiments, the user may purchase event-related merchandise, such as jerseys, concert apparel, posters, and the like. The user may further redeem loyalty points, such as those associated with rewards or frequent-attendee programs. In embodiments, the user may purchase photos and/or scene images as souvenirs, records of the event, such as a digitally "signed" video of a particular portion of the event or game, or of a complete game or event, and the like. The user may view additional video or commentary from players and/or performers during such an event for an additional cost or for free.
In embodiments, the user may input items and/or data into the eyewear platform in the event environment. In embodiments, the user may input a ticket/pass to find his seat, check in to the event, and the like. The user may input promotional materials (such as posters and signs) that have AR enhancement in order to view them and/or interact with them. The user may input loyalty program information, scan a loyalty card for a particular event, and the like. Such a user may interact with such an account related to the event, provide data to the account, activate the account, and the like. In embodiments, the user may input web content into the glasses via Wi-Fi, Bluetooth, and the like.
In addition, the system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate an event environment; an optical assembly through which the user views the surrounding event environment; a processing module for rendering on the head-mounted eyepiece a display of event content for the event environment of the eyepiece; an image processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, the processing including identifying a feature relevant to the event and storing the location of the feature; and an integrated image source for introducing the content to the optical assembly, wherein the integrated image source renders the event content as an overlay on the environment viewed by the wearer of the eyepiece and associates the content with the feature; wherein the integrated image source presents the content relevant to the environment. In embodiments, the image processing module may lock a display element to an identified feature of the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the rendering of content may be the result of at least one of the following: entering the event environment, fixing the user's eyes on an item at the event, identifying a feature in the environment, scanning the user's ticket, recognizing a person's presence at the event, inputting an image from the event, and the like. In embodiments, the content may include an enhanced visual feed, including any of the following: a first-down line, a field marking line, a performer display, a display of a performer's instrument, an instant replay, an enhanced view, live video, alternate views, advertisements, 3-D content, seat upgrade availability related to the event, and the like. In embodiments, the content may include an enhanced audio feed, including any of the following: player commentary, game commentary audio, sounds of the game, enhanced performance sound, performer comments, live audio, and the like. The user may interact with the content through at least one of eye movement, hand gestures, nodding of the head, and the like. In embodiments, the eyepiece may be used to recognize the presence of a person at the event, and social networking content relevant to the relationship between the wearer and the recognized person may be presented. In addition, the user may, by making a gesture (such as nodding) with a part of his body, do at least one of sending and receiving a friend request. The system may include a user interface for purchasing at least one of event items, images and views of the event, and records of the event digitally signed at the event. In addition, the content may be generated in response to at least one of embedded markers, television content, advertisements, and the like. In embodiments, the content may include at least one of augmented video and audio of the backstage area, dressing room, locker room, bullpen, players' bench, and the like. In embodiments, identifying a feature may include at least one of the following: automated processing of an image that includes the feature, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, having the user of the eyepiece designate the feature, viewing the feature, signaling the feature, and the like. In addition, the user may designate a feature for retaining overlaid content by interacting with a user interface of the eyepiece. In embodiments, the overlay may render content on or proximate to the identified feature, and in embodiments, the identified feature may be an object of a game, including at least one of the following: the field, the ball, a goal, a scoreboard, a jumbotron, a screen, the distance traveled by the ball, the path of the ball, a seat, the stadium, and the like. In embodiments, the identified feature may be an object of an artist's performance, including at least one of the following: a musician, a musical instrument, a stage, a music stand, a performer, a set, a prop, a curtain, and the like. In embodiments, the identified feature may be an object at a concession area, including at least one of the following: a doll, a stuffed animal, concert apparel, food, a beverage, a hat, clothing, a beach towel, a toy, sports memorabilia, concert memorabilia, and the like.
In embodiments, the eyewear platform may be used in a dining environment. For example, the glasses may be used to request content in the dining environment. In embodiments, the user may use the glasses to make a reservation, view possible seating availability, view ratings, reviews, venue locations and content, and the like. The user may also view menu content and prices, comparisons between the venue and other venues, details about food and beverages (such as reviews, nutritional content, how an item is prepared), ratings of wines, wine pairings, automated pairing recommendations, and the like. The user may view social content; for example, the user may recognize or identify a person and/or interact with patrons of the same venue. In embodiments, the user may view loyalty program content relevant to the user's account and/or the particular venue, such as dining points. The user may use the glasses to translate items on the menu, search for the names and definitions of ingredients, and the like. The user may view videos or images of menu items. In embodiments, the user may view an AR version of the menu. In embodiments, the user may capture an image of the menu and view the image with infinite focus, increase the magnification, adjust the contrast, illuminate the menu, and the like. In embodiments, the user may view menu items automatically sorted by price, markings, wine and beverage pairings, and the like. The user may access a database of past meals for reminders of what he has eaten and what he liked. In embodiments, the user may view an item different from the item he is consuming. For example, if the user has ordered a chopped salad, he may view the item as a filet mignon, and the like.
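The "adjust the contrast" step for a captured menu image can be sketched as a linear contrast stretch: the darkest pixel maps to 0 and the brightest to 255, making dim menu text easier to read. This is a pure-Python illustration over a grayscale pixel list, not the patent's actual image pipeline; a real eyepiece would apply it to the camera frame buffer.

```python
# Sketch of contrast adjustment for a captured menu image: a linear
# contrast stretch mapping the darkest pixel to 0 and the brightest
# to 255. Illustrative only; operates on a grayscale pixel list.

def stretch_contrast(pixels):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                        # flat image: nothing to stretch
        return list(pixels)
    scale = 255.0 / (hi - lo)
    return [round((p - lo) * scale) for p in pixels]

# Dim, low-contrast menu photo (values clustered between 90 and 130)
frame = [90, 100, 110, 120, 130]
print(stretch_contrast(frame))          # spread over the full 0..255 range
```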
In embodiments, the glasses may be used for commerce in the dining environment. For example, the glasses may be used to find a venue, make or update a reservation, navigate a menu, select items of interest or items to purchase from the menu, and place an order at the venue. The glasses may be used to pay a bill, split a bill, calculate a tip, redeem loyalty points, and the like.
In embodiments, the glasses may be used in the dining environment to input data/items. In embodiments, the user may input content via Wi-Fi, Bluetooth, and the like. In embodiments, the user may input menus, signs, and the like that have AR enhancement in order to view them and/or interact with them. In embodiments, the user may input advertising content that has AR enhancement in order to view it and/or interact with it. The user may input items for payment, such as a credit/debit card, loyalty points for payment/redemption, and the like. Such input may be made via near-field communication and the like. In embodiments, the user may pay via facial recognition. In embodiments, the glasses may be used to recognize an employee's face, and such a payment may be requested based on such facial recognition. In other embodiments, the user's face or another individual's face may be recognized, and an account may be debited accordingly to make the payment.
In embodiments, the system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate at least one of a dining environment and a drinking environment; an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering food- and beverage-related content relevant to the environment; an image processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, where the image processing module may lock a display element to an identified feature of the environment; and an integrated image source for introducing the content to the optical assembly, wherein the integrated image source may render at least one item of food- and beverage-related content as an overlay on the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the integrated image source may present a display of content relevant to the environment without presenting such a relationship between the content and an identified feature. In embodiments, the rendering of content may be the result of at least one of the following: entering at least one of the dining environment and the drinking environment, fixing the user's eyes on a menu in the environment, opening a menu, identifying a sign in the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced menu content, including at least one of the following: ratings of the menu, comparisons of menu items, nutritional values of menu items, images of menu items, wine pairings for menu items, audio descriptions of menu items, videos of menu items, enhanced magnification, contrast, and illumination of menu items, and sorting of menu items by geographic region, ingredient, rating, whether the user has previously consumed the item, and the like. In embodiments, the content may be received as a menu when the user is seated or the like. In embodiments, the user may interact with the content through at least one of eye movement, hand gestures, nodding of the head, and the like. In embodiments, the user may place an order via the eyepiece. In embodiments, the user may pay a check, bill, or charge via the eyepiece. In embodiments, the eyepiece may be used for social networking and may provide at least one of the following: the user's comments about the environment and facial recognition of another person in the environment. In embodiments, the user may, by making a gesture with a part of his body, do at least one of sending and receiving a friend request. The content may include retrieved additional information relevant to a menu item. In embodiments, the overlay may render content on or proximate to the identified feature. In embodiments, the identified feature may be at least one of the following: a poster, a frame, a menu board, a menu, a beverage container, a food cart, a bar, a table, a window, a wall, and the like. In embodiments, identifying a feature may include at least one of the following: automated processing of an image that includes the feature, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, having the user of the eyepiece designate the feature, viewing the feature, signaling the feature, and the like. In embodiments, the user may designate a feature for retaining overlaid content by interacting with a user interface of the eyepiece.
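The enhanced-menu sorting described above can be sketched as a small ranking routine: order menu items by rating and flag the ones the user has eaten before, so the overlay can present both. The item names, ratings, and data shapes are invented for illustration.

```python
# Sketch of enhanced-menu sorting: rank items by rating (highest first)
# and mark items the user has previously consumed. Data is illustrative.

def sort_menu(items, history):
    """items: list of (name, rating); history: set of names eaten before.
    Returns (name, rating, previously_eaten) tuples, best-rated first."""
    ranked = sorted(items, key=lambda it: it[1], reverse=True)
    return [(name, rating, name in history) for name, rating in ranked]

menu = [("chopped salad", 4.1), ("filet mignon", 4.7), ("soup", 3.8)]
print(sort_menu(menu, history={"soup"}))
```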
In embodiments, the eyewear platform may be used in an outdoor environment. In embodiments, the glasses may be used to interact with and/or view content. The user may view navigation information, such as trail locations, time to a destination, progress to or along a trail, trail maps, AR overlays of trail obstacles the user might not otherwise see, and the like. The user may be given the conditions of the outdoor environment, such as temperature, weather, snow conditions, fishing conditions, water levels, tide conditions, and the like. The user may communicate with the glasses, such as coordinating with a group by position relevant to the outdoor environment, receiving weather alerts, and the like. The user may collect information, such as to identify plants, trees, animals, birds, sounds, bird calls, and the like. In embodiments, the user may view an object and, by asking the glasses "What is this?", the user may be presented content and/or information about the object. In embodiments, the user may obtain safety information, such as whether something is edible, poisonous, dangerous, and the like. For example, the user may ask the question "Is this a dangerous snake?" when viewing it through the glasses, and the glasses may then be used to provide the user with information about the snake, such as whether it is venomous, and the like. In embodiments, the user may use the glasses to identify and/or receive content relevant to landmarks associated with the outdoor environment. Such landmarks may help the user navigate in or understand the environment. In embodiments, the user may use the glasses to view how-to guides, such as how to pitch a tent, tie a particular knot, cross difficult terrain, and the like. The user may ask, "How do I pitch this tent?" and the user may receive step-by-step instructions about it. In embodiments, the user may view content about, or an analysis of, his own state, behavior, or situation. The user may request updates from the glasses, such as "Am I dehydrated?", "Am I hypothermic?", "Is my oxygen level low?", and the like. Depending on the results, the user may change his behavior to prevent a particular outcome or to promote a particular outcome. In embodiments, the user may view social content relevant to the environment and to other people's experiences on the trail, experience blogs, and the like. In embodiments, the user may be warned that a particular ski trail is for experts only, or the user may further be informed of current conditions, such as serious icy patches in various parts of the area.
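The "Am I dehydrated?"-style status queries above can be sketched as simple threshold checks over body-sensor readings, with any flagged reading surfaced in the overlay. The sensor names and threshold values are illustrative assumptions only, not medical guidance and not from the patent.

```python
# Sketch of answering status queries ("Am I dehydrated?", "Am I
# hypothermic?") from body sensors: flag readings below illustrative
# thresholds. Names and limits are made up for demonstration.

THRESHOLDS = {
    "hydration_pct": 60.0,   # below 60% -> dehydration flag
    "core_temp_c":   35.0,   # below 35 C -> hypothermia risk flag
    "spo2_pct":      92.0,   # below 92% -> low-oxygen flag
}

def status_query(readings):
    """Return the list of flagged sensor names, or an all-clear message."""
    alerts = [name for name, limit in THRESHOLDS.items()
              if readings.get(name) is not None and readings[name] < limit]
    return alerts or ["all readings normal"]

print(status_query({"hydration_pct": 55.0, "core_temp_c": 36.8, "spo2_pct": 97.0}))
```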
In embodiments, the user may use the glasses in connection with commerce in the outdoor environment. The user may download related content relevant to the environment. As examples, the user may download road and trail maps, fishing maps, data about catching fish, skiing, halfpipe riding, and the like. The user may arrange lodging, order supplies, rent equipment, arrange a guide, book a tour, enter an event, obtain a fishing license, hunting license, or other permit, and the like. In such a setting, the user may interact with social networks via the glasses; for example, the user may join a hiking club, communicate with others on the trail or in a particular environment, and the like. The user may mark and/or track goal-oriented achievements. For example, the user may track or mark the goal of climbing Mount Whitney, mark the goal of a charity "fun run", and the like. The user may use blog-based business models and the like. In embodiments, the user may use social networking via the eyewear platform to raise sponsorship for a particular outdoor event.
In embodiments, the user may input the outdoor environment, or content, data, and the like relevant to the outdoor environment, into the glasses. In embodiments, the user may use a camera in the glasses for scenery recognition, and the glasses may use GPS to provide the user with information relevant to a particular environment or to navigate within a particular environment. The user may send communications to other users in the environment, receive communications from them, or send and receive communications relevant to the environment. The user may input landmark data, use AR enhancement to view landmarks of the environment, and the like. The user may input features such as leaves and flowers, make notes relevant to those features, capture pictures of those features, and/or learn about them in the environment. The user may capture images of items, animals, and the like that make up the environment in order to learn more about them, store data relevant to them, interact with AR content relevant to them, and the like.
In embodiments, the system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate an outdoor environment; an optical assembly through which the user views the surrounding outdoor environment; a processing module for rendering on the head-mounted eyepiece a display of outdoor content for the outdoor environment of the eyepiece; an image processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, the processing including identifying a feature relevant to the environment and storing the location of the feature; and an integrated image source for introducing the content to the optical assembly, wherein the integrated image source renders the outdoor content as an overlay on the environment viewed by the wearer of the eyepiece and associates the content with the feature, and wherein the integrated image source presents the content relevant to the outdoor environment. In further embodiments, the image processing module may lock a display element to an identified feature of the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the content may be rendered as a result of at least one of the following: entering the outdoor environment, fixing the user's eyes on an item in the environment, identifying a feature in the environment, recognizing a person's presence in the environment, inputting an image of the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced environment content, including at least one of the following: overlaid road and trail information, time information to a destination, the user's progress information, landmark information, safety information about the environment, the environment's position relative to other locations, and information about organisms in the environment. In embodiments, the content may include instructions for the user, and the instructions may be at least one of the following: audio, video, an image, a 3D image, an overlay on an object, step-by-step instructions, and the like. The user may interact with the content through at least one of eye movement, hand gestures, nodding of the head, and the like. The user may do at least one of the following: arrange lodging, order supplies, rent equipment, arrange a tour, obtain a permit or license for an activity, input comments about the environment, and the like. In addition, the content may enhance at least one of the following: camera input, GPS information, a landmark in the environment, and a feature in the outdoor environment. In embodiments, the eyepiece is used to recognize the presence of a person in the environment and to present social networking content relevant to the relationship between the wearer and the recognized person. In addition, the user may, by making a gesture with a part of his body, do at least one of sending and receiving a friend request. In embodiments, the content may be rendered according to an analysis of the user's condition. In embodiments, identifying a feature may include at least one of the following: automated processing of an image that includes the feature, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, having the user of the eyepiece designate the feature, viewing the feature, signaling the feature, and the like. In addition, the user may designate a feature for retaining overlaid content by interacting with a user interface of the eyepiece. In embodiments, the overlay may render content on or proximate to the identified feature. In addition, the identified feature may be at least one of the following: a plant, a tree, a shrub, a road, a trail, a rock, a fence, a path, a field, a campsite, a cabin, a tent, a watercraft, a marine vessel, and an animal.
In embodiments, the user may use the glasses in an exercise environment. In embodiments, the user may use the glasses to view, download, or otherwise interact with content. For example, the user may obtain an analysis of his own behavior or condition, such as by asking the glasses "Am I dehydrated?", "Am I hypothermic?", "Is my oxygen level low?", and the like. In embodiments, the user may view health-club-oriented content, such as club fees and offers, upcoming training courses, and the like. The user may view training-oriented content, such as coaching and instructional content. For example, the user may view videos, AR, and the like with instructions on how to squat, stretch, use equipment, and the like. The user may view, comment on, and update blogs, such as a personal workout blog relevant to the exercise environment.
In embodiments, the user may use the glasses for commerce in the exercise environment. As examples, the user may download workout plans, paid or free, such as plans relevant to coaching and instruction, a trainer, or the like. In embodiments, the user may be tracked for success and/or progress through a plan until completion. In embodiments, applications may have advertisements associated with them that will be displayed to the user. In embodiments, the user may use the glasses for the purchase and sale of auxiliary equipment. For example, the user may purchase new athletic shoes with arch-support insoles for use in running. In embodiments, the user may use the glasses for charitable activities, such as, but not limited to, a "fun run" or "climbing Mount Everest for charity X", where the user collects donations and/or views or updates a blog entry for the activity via the eyewear platform.
In embodiments, the user may use the glasses to input information and/or data in the exercise environment. In embodiments, the user may input data via sensors for performance tracking, and may input data, pictures, and video. Merely as an example, the user may record video of another person performing a particular activity and then use the video during his own training to perfect form, technique, and the like.
In embodiments, the system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the wearer of the eyepiece is at least one of in and proximate an exercise environment; an optical assembly through which the user views the surrounding exercise environment; a processing module for rendering exercise-related content on the head-mounted eyepiece; an image processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece and identifying a feature of the environment; and an integrated image source for introducing the content to the optical assembly, wherein the integrated image source renders the exercise content as an overlay on the environment viewed by the user, wherein the overlay remains fixed on or proximate to the identified feature as the user moves the eyepiece, and wherein the integrated image source presents the content relevant to the exercise environment. In embodiments, the rendering of content may be the result of at least one of the following: entering the exercise environment, fixing the user's eyes on an item in the environment, automatically identifying a feature in the field of view of the eyepiece, using a piece of equipment in the exercise environment, identifying a sign in the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced exercise content, including at least one of the following: training-oriented content, club information content, instructions for an exercise, information about upcoming courses, and the like. In embodiments, the content may include at least one of 3-D content, audio, visual, video, and text content. The user may interact with the content through at least one of eye movement, hand gestures, nodding of the head, and the like. In embodiments, the content may include user information, including at least one of vital signs, heart rate, exercise time, time required for a lap in a swimming race, best workout time, historical user data, and the like. The content may allow the user to purchase training courses, machine time, additional time at the club, beverages, fitness services, and the like. In embodiments, the content may be an advertisement for at least one of the following: an upcoming course, a health club, discounts on items at a juice bar, an equipment sale, and the like. In addition, in embodiments, the eyepiece may be used for social networking, wherein the eyepiece provides at least one of the following: the user's comments about the environment and facial recognition of another person in the environment. In addition, the user may, by making a gesture with a part of his body, do at least one of sending and receiving a friend request. In embodiments, the user may send a friend request to, or receive a friend request from, another member, a trainer, a coach, and the like. In embodiments, the overlay may render content on or proximate to the identified feature. In addition, the identified feature may be at least one of the following: a calendar, a wall, a window, a board, a mirror, a treadmill, a scale, a bicycle, a stationary bicycle, an elliptical machine (a piece of gym equipment), a boxing bag, a track, a scoreboard, a goal, a region of a court, a region of a tennis court, and the like. In embodiments, identifying a feature may include at least one of the following: automated processing of an image that includes the feature, identifying the feature by processing the location of the feature, retrieving information about the feature from a database, having the user of the eyepiece designate the feature, viewing the feature, signaling the feature, and the like. In embodiments, the user may designate a feature for retaining overlaid content by interacting with a user interface of the eyepiece or the like.
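The lap-time portion of the exercise overlay (swim lap times, best time, historical data) can be sketched as a small tracker that records each lap and summarizes current, best, and average times for display. The class and method names are illustrative assumptions.

```python
# Sketch of lap-time tracking for the exercise overlay: record the time
# of each swim lap and surface last / best / average times, which the
# eyepiece could render alongside heart rate. Names are illustrative.

class LapTracker:
    def __init__(self):
        self.laps = []                  # seconds per completed lap

    def record(self, seconds):
        self.laps.append(seconds)

    def summary(self):
        return {
            "last": self.laps[-1],
            "best": min(self.laps),
            "average": sum(self.laps) / len(self.laps),
        }

laps = LapTracker()
for t in (34.2, 31.8, 33.0):
    laps.record(t)
print(laps.summary())
```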
Another application that may be attractive to users is mobile online gaming using the augmented reality glasses. These games may be computer video games, such as those furnished by Electronic Arts Mobile, UbiSoft, and Activision Blizzard, for example, World of Warcraft (WoW). Just as games and recreational applications are played on computers at home (rather than on computers at work), the augmented reality glasses may also use gaming applications. The screen may appear on the inside of the glasses so that a user may observe the game and participate in the game. In addition, controls for playing the game may be provided through a virtual game controller, such as a joystick, control module, or mouse described elsewhere herein. The game controller may include sensors or other output-type elements attached to the user's hand, such as for feedback from the user through acceleration, vibration, force, pressure, electrical impulse, body temperature, electric field sensing, and the like. Sensors and actuators may be attached to the user's hand by way of a wrap, ring, pad, glove, bracelet, and the like. As such, an eyepiece virtual mouse may allow the user to translate motions of the hand, wrist, and/or fingers into motion of a cursor on the eyepiece display, where "motions" may include slow movements, rapid motions, jerky motions, position, changes in position, and the like, and may allow the user to work in three dimensions without the need for a physical surface, including some or all of the six degrees of freedom.
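The virtual-mouse idea above can be sketched as a per-frame update: the hand's displacement drives the cursor, and the speed of the motion is classified (slow / rapid / jerk) so the interface can treat a quick flick differently from a fine adjustment. The thresholds and function names are made-up illustrative values, not part of the patent.

```python
# Sketch of the eyepiece virtual mouse: translate a per-frame hand
# displacement into cursor motion and classify the motion's speed.
# Thresholds (in display pixels per frame) are illustrative only.

def move_cursor(cursor, hand_delta, slow_px=5, jerk_px=40):
    dx, dy = hand_delta
    speed = (dx * dx + dy * dy) ** 0.5  # magnitude of the hand movement
    if speed <= slow_px:
        kind = "slow"                   # fine adjustment
    elif speed >= jerk_px:
        kind = "jerk"                   # sharp flick, e.g. a dismiss gesture
    else:
        kind = "rapid"
    return (cursor[0] + dx, cursor[1] + dy), kind

cursor = (100, 100)
cursor, kind = move_cursor(cursor, (3, 4))     # small, precise movement
print(cursor, kind)                            # -> (103, 104) slow
cursor, kind = move_cursor(cursor, (30, 40))   # fast flick
print(cursor, kind)                            # -> (133, 144) jerk
```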
As seen in Figure 27, a game application implementation 2700 may use the Internet and a GPS. In one embodiment, a game is downloaded from a game provider to the user's computer or to the augmented reality glasses from a customer database, perhaps using the provider's web services and the Internet as shown. At the same time, the glasses, which also have telecommunication capability, receive and send telecommunication and telemetry signals via a cellular tower and a satellite. Thus, the online gaming system has access to information about the user's location and the user's desired gaming activities.
The games may take advantage of this knowledge of the location of each player. For example, the games may build in features that use the player's location, via a GPS locator or magnetometer locator, and award points for reaching the location. The game may also send a message, for example, display a clue, or a scene or image, when the player reaches a particular location. A message, for example, may be to go to a next destination, which is then provided to the player. Scenes or images may be provided as part of a struggle or an obstacle that must be overcome, or as an opportunity to earn game points. Thus, in one embodiment, augmented reality eyepieces or glasses may use the wearer's location to quicken and enliven computer-based video games.
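The location-based reward described here reduces to a geofence check: award points when a GPS fix puts the player within a small radius of a target waypoint, using the haversine great-circle distance. The coordinates, radius, and point values are illustrative examples, not from the patent.

```python
import math

# Sketch of a GPS-based game reward: points are awarded when the
# player's fix is within radius_m of a target waypoint. Distances via
# the haversine formula; all coordinates and values are examples.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0                       # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_waypoint(player, waypoint, radius_m=25, points=100):
    """Return points earned (0 if the player is still outside the radius)."""
    d = haversine_m(player[0], player[1], waypoint[0], waypoint[1])
    return points if d <= radius_m else 0

waypoint = (40.7580, -73.9855)          # example target location
print(check_waypoint((40.7581, -73.9855), waypoint))  # ~11 m away
print(check_waypoint((40.7680, -73.9855), waypoint))  # ~1.1 km away
```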
One method of playing augmented reality games is depicted in Figure 28. In this method 2800, a user logs into a website, whereby access to a game is permitted. The game is selected. In one example, the user may join a game if multiple players are available and desired; alternatively, the user may create a custom game, perhaps with special roles the user desires. The game may be scheduled, and in some cases players may select a particular time and place for the game, direct instructions to the website where the game will be played, and the like. Later, the players meet and sign into the game, with one or more players using the augmented reality glasses. The participants then play the game, and if applicable, the game results and any statistics (player scores, game times, etc.) may be stored. Once the game has begun, the location may change for different players in the game, sending one player to one location and another player or players to a different location. The game may then present different scenarios for each player or group of players based on the locations provided by each player's or group's GPS or magnetometer. Each player may also be sent different messages or images based on his or her role, his or her location, or both. Of course, each scenario may then lead to other situations, other interactions, directions to other locations, and so forth. In one sense, such a game mixes the reality of the player's location with the game in which the player is participating.
Games can range from simple game types that may be played in the palm of a player's hand, such as small, single-player games. Alternatively, more complicated, multi-player games may also be played. In the former category are games such as SkySiege, AR Drone and Fire Fighter 360. In addition, multi-player games are also easily envisioned. Since all players must log into the game, a particular game may be played by friends who log in and designate the other person or persons. The position of the players is also available, via GPS or other methods. Sensors in the augmented reality glasses or in a game controller as described above, such as accelerometers, gyroscopes or even a magnetic compass, may also be used for orientation and game play. An example is AR Invaders, available as an iPhone application from the App Store. Other games may be obtained from other vendors and for non-iPhone-type systems, such as Layar of Amsterdam and Parrot SA of Paris, France, supplier of AR Drone, AR Flying Ace and AR Pursuit.
In embodiments, games may also be in 3D such that a user can experience 3D gaming. For example, when playing a 3D game, the user may view a virtual, augmented-reality or other environment in which the user is able to control his viewing perspective. The user may turn his head to view various aspects of the virtual environment or other environment. As such, when the user turns his head or makes other movements, he may view the game environment as if he were actually in it. For example, the perspective of the user may be such that the user is placed "into" the 3D game environment with at least some control over the viewing perspective, where the user may move his head and have the view of the game environment change in correspondence with the changed head position. Further, the user may "walk into" the game when he physically walks forward, and have the perspective change as he moves. Further, the perspective may also change as the user shifts the gaze of his eyes. Additional image information may be provided, such as at the sides of the user's field of view, that may be accessed by turning the head.
In embodiments, the 3D game environment may be projected onto the lenses of the glasses or viewed by other means. Further, the lenses may be opaque or transparent. In embodiments, the 3D game image may be associated with, and incorporate, the external environment of the user, such that the user may turn his head and the 3D image moves together with the external environment. Further, such associations between the 3D game image and the external environment may change, such that the 3D image associates with more than one object in the external environment, or with more than one location of an object, at various instances, so that it appears to the user that the 3D image is interacting with various aspects or objects of the actual environment. As an example, the user may view a 3D game monster climb up a building or onto an automobile, where such building or automobile is an actual object in the user's environment. In such a game, the user may interact with the monster as part of the 3D gaming experience. The actual environment around the user may be part of the 3D gaming experience. In embodiments where the lenses are transparent, the user may interact with the 3D gaming environment while moving about in his or her actual environment. The 3D game may incorporate elements of the user's environment into the game, the game environment may be wholly fabricated by the game, or it may be a mixture of both.
In embodiments, the 3D images may be associated with, or generated by, an augmented reality program, 3D game software and the like, or by other means. In embodiments where augmented reality is employed for the purpose of 3D gaming, a 3D image may appear, or be perceived by the user, based on the user's location or other data. Such an augmented reality application may provide for the user to interact with such 3D image or images to provide a 3D gaming environment when using the glasses. As the user changes his location, for example, play in the game may advance and various 3D elements of the game may become accessible or inaccessible to the viewer. As an example, various 3D enemies of the user's game character may appear in the game based on the actual location of the user. The user may interact with, or cause reactions from, other users playing the game and/or 3D elements associated with those other users. Such elements associated with users may include weapons, messages, currency, a 3D image of the user, and the like. Based on a user's location or other data, he or she may encounter, view, or engage, by any means, other users and the 3D elements associated with other users. In embodiments, 3D gaming may also be provided via software installed in, or downloaded to, the glasses, where the user's location may or may not be used.
In each embodiment, lens can be opaque so that virtual reality or other virtual 3D gaming experiences to be provided to user, and wherein user " places oneself in the midst of " in game, and wherein user's movement can change the visual angle of user's 3D game environment.Thereby user can come in the following manner to move everywhere in virtual environment or explore and play 3D game: various healths, head and/or eyes move, the use of game console, one or more touch-screens, or the user of permission described here navigates, handles 3D environment and mutual any control technology.
In various embodiments, the user may navigate, interact with and manipulate the 3D game environment and experience 3D gaming via body, hand, finger, eye, or other movements, through the use of one or more wired or wireless controllers, one or more touch screens, any of the control techniques described herein, and the like.
In embodiments, internal and external facilities available to the eyepiece may provide for learning the behavior of a user of the eyepiece and storing the learned behaviors in a behavioral database to enable location-aware control, activity-aware control, predictive control, and the like. For example, the eyepiece may record events and/or track the user's actions, such as commands from the user, images sensed through a camera, the user's GPS location, sensor inputs over time, actions triggered by the user, communications to and from the user, user requests, web activity, music listened to, directions requested, recommendations used or provided, and the like. This behavioral data may be stored in a behavioral database, such as tagged with a user identifier or anonymously. The eyepiece may collect this data in a learning mode, a collection mode, and the like. The eyepiece may use past data to inform or remind the user of what they did before, or alternatively, the eyepiece may use the data to predict what eyepiece functions and applications the user may need based on past collected experience. In this way, the eyepiece may act as an automated assistant to the user, for example launching applications at the times the user usually launches them, turning off augmented reality and GPS when nearing a location or entering a building, streaming in music when the user enters the gym, and the like. Alternatively, the learned behaviors and/or actions of a plurality of eyepiece users may be autonomously stored in a collective behavior database, where learned behaviors among the plurality of users are made available to individual users based on similar conditions. For example, a user may be visiting a city and waiting for a train on a platform, and the user's eyepiece may access the collective behavior database to determine what other users have done while waiting for the train, such as getting directions, searching for points of interest, listening to certain music, looking up the train schedule, contacting the city website for travel information, connecting to social networking sites for entertainment in the area, and the like. In this way, the eyepiece may provide the user with automated assistance that benefits from many different user experiences. In embodiments, the learned behaviors may be used to develop preference profiles, recommendations, advertisement targeting, social network contacts, behavior profiles for the user or groups of users, and the like.
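The behavioral database described above can be sketched as a store of (context, action) events from which the most frequent past actions in a matching context are surfaced as predictions. This is a minimal illustration under assumed names; the patent does not prescribe a data structure or prediction rule:

```python
from collections import Counter, defaultdict

class BehaviorDatabase:
    """Toy store of (context, action) events, per user or pooled across users."""
    def __init__(self):
        self._events = defaultdict(Counter)  # context -> Counter of actions taken there

    def record(self, context, action):
        """Log one observed user action under a context tag (location, activity, ...)."""
        self._events[context][action] += 1

    def predict(self, context, top_n=3):
        """Most frequent past actions for this context, e.g. apps to pre-launch."""
        return [a for a, _ in self._events[context].most_common(top_n)]

db = BehaviorDatabase()
db.record("enter_gym", "stream_music")
db.record("enter_gym", "stream_music")
db.record("enter_gym", "start_workout_app")
db.record("train_platform", "show_train_schedule")
```

Pooling records from many users into one such database gives the collective-behavior variant described in the text.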
In an embodiment, the augmented reality eyepiece or glasses may include one or more acoustic sensors 2900 for detecting sound. An example is depicted above in Fig. 29. In one sense, acoustic sensors are similar to microphones, in that both detect sounds. Acoustic sensors typically have one or more frequency bandwidths at which they are more sensitive, and the sensors can therefore be chosen for the intended application. Acoustic sensors are available from a variety of manufacturers and are usable with appropriate transducers and other required circuitry. Manufacturers include ITT Electronic Systems (Salt Lake City, Utah, USA); Meggitt Sensing Systems (San Juan Capistrano, Calif., USA); and National Instruments (Austin, Tex., USA). Suitable microphones include those comprising a single microphone as well as those comprising an array of microphones, or a microphone array.
Acoustic sensors may include those using micro-electromechanical systems (MEMS) technology. Because of the very fine structure in a MEMS sensor, the sensor is extremely sensitive and typically has a wide range of sensitivity. MEMS sensors are typically fabricated using semiconductor manufacturing techniques. An element of a typical MEMS accelerometer is a moving beam structure composed of two sets of fingers. One set is fixed to a solid ground plane on a substrate; the other set is attached to a known mass mounted on springs that can move in response to an applied acceleration. This applied acceleration changes the capacitance between the fixed and moving beam fingers. The result is a very sensitive sensor. Such sensors are made, for example, by STMicroelectronics (Austin, Texas) and Honeywell International (Morristown, N.J., USA).
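The capacitance change described above can be illustrated with the parallel-plate approximation for a differential finger pair: when the proof mass deflects, one gap narrows while the opposite gap widens, and the difference of the two capacitances is the sensed signal. The geometry numbers below are illustrative assumptions only, not values from the specification:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m):
    """Parallel-plate capacitance C = eps0 * A / d."""
    return EPS0 * area_m2 / gap_m

def differential_signal(area_m2, gap0_m, x_m):
    """Capacitance difference of a differential finger pair when the proof
    mass deflects by x_m: one gap closes to (d0 - x), the other opens to (d0 + x)."""
    return plate_capacitance(area_m2, gap0_m - x_m) - plate_capacitance(area_m2, gap0_m + x_m)
```

The differential arrangement doubles the response and cancels common-mode drift, which is one reason these structures are so sensitive.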
In addition to identification, the sound capabilities of the augmented reality devices may also be used to localize the origin of a sound. As is well known, at least two sound or acoustic sensors are needed to localize a sound. The acoustic sensors will be equipped with appropriate transducers and signal processing circuitry, such as a digital signal processor, for interpreting the signal and accomplishing the desired goal. One application for sound-localization sensors may be to determine the origin of sounds from within an emergency location, such as a burning building, an automobile accident, and the like. Emergency workers equipped with the embodiments described herein may each have one or more acoustic sensors or microphones embedded within the frame. Of course, the sensors could also be worn on the person's clothing or even attached to the person. In any event, the signals are transmitted to the controller of the augmented reality eyepiece. The eyepiece or glasses are equipped with GPS technology and may also be equipped with direction-finding capability; alternatively, with two sensors per person, the microcontroller can determine a direction from which the noise originated.
If there are two or more firefighters, or other emergency responders, their locations are known from their GPS capabilities. Either of the two, or a fire chief, or the control headquarters, then knows the positions of the two responders and the direction from each responder to the detected noise. The exact point of origin of the noise can then be determined using known techniques and algorithms. See, e.g., Acoustic Vector-Sensor Beamforming and Capon Direction Estimation, M. Hawkes and A. Nehorai, IEEE Transactions on Signal Processing, vol. 46, no. 9, September 1998, pp. 2291-2304; see also Cramér-Rao Bounds for Direction Finding by an Acoustic Vector Sensor Under Nonideal Gain-Phase Responses, Noncollocation, or Nonorthogonal Orientation, P.K. Tam and K.T. Wong, IEEE Sensors Journal, vol. 9, no. 8, August 2009, pp. 969-982. The techniques used may include timing differences (differences in time of arrival of the sensed parameter), differences in sound velocity, and differences in sound pressure. Of course, acoustic sensors typically measure levels of sound pressure (e.g., in decibels), and these other parameters may be used in appropriate types of acoustic sensors, including acoustic emission sensors and ultrasonic sensors or transducers.
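The timing-difference technique mentioned above reduces, for a single two-sensor pair in the far field, to the classic relation c·Δt = d·sin(θ). A minimal sketch of that bearing computation follows; the function name and the assumption of air at room temperature are mine, and real systems (per the cited papers) use far more robust estimators:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed)

def bearing_from_tdoa(delta_t_s, sensor_spacing_m):
    """Angle of arrival in radians from broadside for a two-sensor pair,
    far-field plane-wave assumption: c * dt = d * sin(theta)."""
    s = SPEED_OF_SOUND * delta_t_s / sensor_spacing_m
    if abs(s) > 1.0:
        raise ValueError("time difference inconsistent with sensor spacing")
    return math.asin(s)
```

With two eyepiece-mounted microphones roughly 0.2 m apart, a 0.29 ms arrival difference corresponds to a source about 30 degrees off broadside.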
The appropriate algorithms and all other necessary programming may be stored in the microcontroller of the eyepiece, or in memory accessible to the eyepiece. Using more than one responder, or several responders, a likely location may then be determined, and the responders can attempt to locate the person to be rescued. In other applications, responders may use these acoustic capabilities to determine the location of a person of interest to law enforcement. In still other applications, a number of persons on maneuvers may encounter hostile fire, including direct fire (line of sight) or indirect fire (out of line of sight, including high-angle fire). The same techniques described here may be used to estimate the location of the hostile fire. If there are several persons in the area, the estimate may be more accurate, especially if the persons are separated at least to some extent over a wider area. This may be an effective tool for directing counter-battery or counter-mortar fire against hostiles. Direct fire may also be used if the target is sufficiently close.
An example using an embodiment of the augmented reality eyepiece is depicted in Fig. 29B. In this example 2900B, a number of soldiers are on patrol, each equipped with an augmented reality eyepiece, and are alert for hostile fire. The sounds detected by their acoustic sensors or microphones may be relayed to a squad vehicle as shown, to their platoon leader, or to a remote tactical operations center (TOC) or command post (CP). Alternatively, or in addition, the signals may also be sent to a mobile device, such as the airborne platform shown. Communications among the soldiers and the additional locations may be facilitated using a local area network or other network. In addition, all transmitted signals may be protected by encryption or other protective measures. One or more of the squad vehicle, platoon leader, mobile platform, TOC or CP will have an integration capability for combining the inputs from several soldiers and determining a possible location of the hostile fire. The signals from each soldier will include the soldier's location from the GPS capability inherent in the augmented reality glasses or eyepiece. The acoustic sensors on each soldier may indicate a possible direction of the noise. Using the signals from several soldiers, the direction and possibly the location of the hostile fire may be determined. The soldiers may then neutralize the threat at that location.
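Combining the inputs from several observers, as the integration capability above describes, amounts in the simplest two-observer case to intersecting two bearing rays. The sketch below works in a local planar frame (meters east/north of a reference point) under that simplifying assumption; the patent leaves the fusion algorithm open, and a fielded system would fuse many noisy bearings statistically:

```python
import math

def intersect_bearings(p1, brg1, p2, brg2):
    """Estimate the source position from two observer positions (x, y in meters,
    local planar frame) and the bearing each observer measured to the sound,
    given in radians counterclockwise from the +x axis."""
    d1 = (math.cos(brg1), math.sin(brg1))
    d2 = (math.cos(brg2), math.sin(brg2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (2x2 system, Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

Two soldiers 100 m apart who each hear the same shot at 45 degrees inward would place it midway between them and 50 m forward.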
In addition to microphones, the augmented reality eyepiece may be equipped with earbuds which, as mentioned elsewhere herein, may be articulating earbuds and may be removably attached 1403, or may be equipped with an audio output jack 1401. The eyepiece and earbuds may be equipped to deliver noise-cancelling interference, allowing the user to better hear sounds delivered from the audio-video communications capabilities of the augmented reality eyepiece or glasses, and may feature automatic gain control. The speakers or earbuds of the augmented reality eyepiece may also connect with the full audio and visual capabilities of the device, with the ability to deliver high-quality and clear sound from the included telecommunications device. As noted elsewhere herein, this includes radio or cellular telephone (smart phone) audio capabilities, and may also include complementary technologies, such as Bluetooth™ capabilities for wireless personal area networks (WPAN), or related technologies such as IEEE 802.11.
Another aspect of the enhanced audio capabilities includes speech recognition and identification. Speech recognition is concerned with understanding what is said, while speech identification is concerned with understanding who the speaker is. Speech identification may work hand in hand with the facial recognition capabilities of these devices to more positively identify persons of interest. As described elsewhere in this document, a camera connected as part of the augmented reality eyepiece can unobtrusively focus on desired persons, such as a single person in a crowd, or multiple faces in a crowd. Using the camera and appropriate facial recognition software, an image of the person or persons may be taken. The features of the image are then broken down into any number of measurements and statistics, and the results are compared to a database of known persons. An identification may then be made. In the same manner, a voice or voice sample may be taken from the person of interest. The sample may be marked or tagged, e.g., at a particular time interval, and labeled, e.g., with a description of the person's physical characteristics or with a number. The voice sample may be compared to a database of known persons, and if the person's voice matches, an identification may be made. In embodiments, multiple individuals of interest may be selected, such as for biometric tagging. The multiple selection may be made with a cursor, a hand gesture, eye movement, and the like. As a result of the multiple selections, information about the selected individuals may be provided to the user, such as through a display, by audio, and the like.
In embodiments where the camera is used for biometric identification of multiple persons in a crowd, the control techniques described herein may be used to select faces or irises for imaging. For example, a cursor selection using a hand-worn control device may be used to select multiple faces in a view of the user's surrounding environment. In another example, gaze tracking may be used to select which faces to select for biometric identification. In another example, a hand-worn control device may sense a gesture used to select the individuals, such as pointing at each individual.
In an embodiment, the important characteristics of a particular person's speech may be understood from a sample, or from many samples, of the person's voice. The samples are typically broken into segments, frames and subframes. Typically, important characteristics include the fundamental frequency of the person's voice, energy, formants, speaking rate, and the like. These characteristics are analyzed by software which analyzes the voice according to certain formulae or algorithms. This field is constantly changing and improving. However, such classifiers may currently include algorithms such as neural network classifiers, k-classifiers, hidden Markov models, Gaussian mixture models, and pattern matching algorithms, among others.
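Of the characteristics named above, the fundamental frequency is the most concrete to illustrate. A common textbook approach (not one the patent specifies) is to pick the autocorrelation peak of a voiced frame within the plausible pitch-period range; the pure tone standing in for a voiced frame here is an artificial assumption:

```python
import math

def fundamental_frequency(samples, sample_rate, f_min=60.0, f_max=400.0):
    """Estimate the fundamental frequency of a voiced frame by locating the
    autocorrelation peak inside the plausible pitch-lag range."""
    lag_min = int(sample_rate / f_max)
    lag_max = int(sample_rate / f_min)
    best_lag, best_r = lag_min, float("-inf")
    for lag in range(lag_min, lag_max + 1):
        r = sum(samples[n] * samples[n + lag] for n in range(len(samples) - lag))
        if r > best_r:
            best_r, best_lag = r, lag
    return sample_rate / best_lag

# An artificial 200 Hz tone stands in for a voiced speech frame.
sr = 8000
frame = [math.sin(2 * math.pi * 200 * n / sr) for n in range(800)]
```

Real speech requires voicing detection and smoothing across frames, which the classifier families listed above (HMMs, GMMs) then consume as features.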
A general template 3100 for speech recognition and speaker identification is depicted in Fig. 31. A first step 3101 is to provide a speech signal. Ideally, one has a known sample from a prior encounter with which the signal may be compared. The signal is then digitized in step 3102 and partitioned in step 3103 into fragments, such as segments, frames and subframes. Features and statistics of the speech sample are then generated and extracted in step 3104. A classifier, or more than one classifier, is then applied in step 3105 to determine general classifications of the sample. Post-processing of the sample may then be applied in step 3106, e.g., comparing the sample to known samples to find possible matches and identifications. The results may then be output in step 3107. The output may be directed to the person requesting the match, and may also be recorded and sent to other persons and to one or more databases.
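The steps of template 3100 can be sketched end to end with a deliberately toy feature (per-frame energy) and nearest-match scoring. This is only an illustration of the pipeline's shape under assumed names; production speaker identification uses richer features (e.g. cepstra) and the statistical classifiers the previous paragraph lists:

```python
import math

def frame_signal(samples, frame_len):
    """Step 3103: split the digitized signal into fixed-length frames."""
    return [samples[i:i + frame_len]
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def extract_features(frames):
    """Step 3104: one toy feature per frame (short-time energy)."""
    return [sum(s * s for s in f) / len(f) for f in frames]

def identify(sample, known, frame_len=160):
    """Steps 3105-3107: score the sample against enrolled speakers and
    output the name of the closest match."""
    feats = extract_features(frame_signal(sample, frame_len))
    def distance(enrolled):
        ref = extract_features(frame_signal(enrolled, frame_len))
        n = min(len(feats), len(ref))
        return sum((feats[i] - ref[i]) ** 2 for i in range(n))
    return min(known, key=lambda name: distance(known[name]))

# Two synthetic "speakers" differing only in loudness, for illustration.
sr = 8000
tone = lambda amp: [amp * math.sin(2 * math.pi * 150 * n / sr) for n in range(1600)]
known = {"alice": tone(1.0), "bob": tone(0.3)}
```

The energy feature separates these two synthetic voices; any real deployment would swap in spectral features at step 3104 without changing the surrounding flow.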
In an embodiment, the audio capabilities of the eyepiece include hearing protection with the associated earbuds. The audio processor of the eyepiece may enable automatic noise suppression, such as when a loud noise is detected near the wearer's head. Any of the control techniques described herein may be used with automatic noise suppression.
In an embodiment, the eyepiece may include a nitinol head strap. The head strap may be a thin band of curved metal that may either pull out from the temples of the eyepiece, or rotate out, and extend behind the head to secure the eyepiece to the head. In one embodiment, the tip of the nitinol strap may have a silicone cover such that the cover may be grasped to pull the strap out from the end of the temple. In embodiments, only one temple has a nitinol band, and it is secured to the other temple to form a strap. In other embodiments, both temples have a nitinol band, and both sides are pulled out either to be joined to form a strap, or to independently grasp portions of the head to secure the eyepiece on the wearer's head. In embodiments, the eyepiece may have interchangeable means for attaching the eyepiece to an individual's head, such as via a head strap, eyeglass frames, a helmet strap, helmet snap links, and the like. For example, there may be a joint near the user's temple where the eyepiece may be attached to a head strap and where the head strap may be disconnected, such that the user may instead attach eyeglass frames to give the eyepiece an eyeglasses form, attach a helmet mount, and the like. In embodiments, the interchangeable means for attaching the eyepiece to the user's head or helmet may include an embedded antenna. For example, the nitinol head strap may have an antenna embedded within, such as for a particular frequency, for multiple frequencies, and the like. In addition, the frame, head strap, and the like may include an RF-absorbing foam to aid in the absorption of RF energy while the antenna is being used to transmit.
Referring to Fig. 21, the eyepiece may include one or more adjustable wraparound extendable temples 2134. The adjustable wraparound extendable temples 2134 may secure the eyepiece to the user's head. One or more of the extendable temples 2134 may be made of a shape memory material. In embodiments, one or both of the temples may be made of nitinol and/or any shape memory material. In other instances, the ends of at least one of the wraparound extendable temples 2134 may be covered with silicone. Further, the adjustable wraparound extendable temples 2134 may extend from the ends of the eyepiece temples 2116. They may extend telescopically and/or slide out from the ends of the eyepiece temples. They may slide out from within the eyepiece temple 2116, or they may slide along an outside surface of the eyepiece temple 2116. Further, the extendable temples 2134 may meet and secure to each other. The extendable temples may also attach to another portion of the head-mounted eyepiece to create a means for securing the eyepiece to the user's head. The wraparound extendable temples 2134 may meet and secure to each other, interlock, interconnect, magnetically couple, or secure by other means, so as to provide a secure attachment to the user's head. In embodiments, the adjustable wraparound extendable temples 2134 may also be independently adjusted to attach to, or grasp, individual portions of the user's head. The independently adjustable temples may thus allow the user increased customizability for a personalized fit in securing the eyepiece to the user's head. Further, in embodiments, at least one of the wraparound extendable temples 2134 may be detachable from the head-mounted eyepiece. In yet other embodiments, the wraparound extendable temples 2134 may be an add-on feature of the head-mounted eyepiece. In such instances, the user may choose to put extendable, non-extendable or other temples onto the head-mounted eyepiece. For example, the temples may be sold as a kit, or as part of a kit, that allows the user to customize the eyepiece to his or her specific preferences. Accordingly, the user may customize the type of material from which the adjustable wraparound extendable temples 2134 are made by selecting a different kit with specific extendable temples suited to his preferences. Accordingly, the user may customize his eyepiece for his particular needs and preferences.
In yet other embodiments, an adjustable strap 2142 may be attached to the eyepiece temples such that it extends around the back of the user's head to secure the eyepiece in place. The strap may be adjusted to a proper fit. It may be made out of any suitable material, including, but not limited to, rubber, silicone, plastic, cotton, and the like.
In an embodiment, the eyepiece may be secured to the user's head by a plurality of other structures, such as a rigid temple, a flexible temple, a gooseneck bendable temple, a cable tensioning system, and the like. For example, in a gooseneck configuration, the flexible temple may be constructed from a flexible tube, where the flexible temple may be bent into place to adjust the fit for a given user, and where the flexible temple may be re-shaped as needed. In another example, such as in a robot-finger configuration, the flexible temple may be constructed with a cable tensioning system, where the flexible temple has a plurality of joints connecting constituent portions, and where each constituent portion is bent into a curved shape by a tension applied through a cable laid through each joint and constituent portion. In this instance, the cable tensioning system may enable articulating ear horns for size adjustment and for retaining the eyepiece as headwear; the cable tensioning system may have two or more such connections, and the cable may be stainless steel, nitinol-based, electrically actuated, ratcheted, wheel-adjusted, and the like.
Embodiments of a cable tensioning system 17800 are shown in Figs. 178 and 179A-B. In embodiments, the cable tensioning system may comprise an ear horn 17802 formed of two or more joints connecting constituent portions, where each constituent portion is bent into a curved shape by tension applied through a cable 17804 running through each joint and/or constituent portion. In the erect position shown in Fig. 178, the ear horns may be straight for positioning along the user's head. The cable 17804 may be attached to, and tensioned by, an adjuster 17808, whereby positioning the adjuster to increase the tension in the cable 17804 causes the ear horns to bend or curve to conform to the shape of the user's head. By increasing such tension, the ear horns may be tightened and/or made more rigid. Depending on the setting for the user's head, the ear horns 17802 may be adjusted for a particular user's head and/or may assist in retaining the eyepiece by holding it firmly to the user's head. In embodiments, as the tension of the cable 17804 is increased, the ear horns become more rigid, or less slack, for positioning against the user's head, and as the tension in the cable 17804 is released, the ear horns become more flexible, allowing one or both of the ear horns to be extended and/or folded. In embodiments, the adjuster 17808 may be a ratchet, may be electrically actuated or wheel-adjusted, may comprise a wedge, and the like. In embodiments, the wedge may be a tapered adjustment piece that may be pushed in or pulled out, such as by a pull tab or the like, to provide adjustment, allowing the position of one or more portions of the ear horns and/or eyepiece to be raised or lowered. In embodiments, as shown in Fig. 179B, the ear horns 17804 may be configured and shaped in a robot-finger configuration. The adjustable ear horns described herein may provide the benefit of ease of use in securing the eyepiece to the user's head, while also providing the convenience of folding. In embodiments, the ear horns may provide a wrap-around design, where the ear horns on the left and right sides of the eyepiece wrap around the user's head and contact, or nearly contact, the back of the user's head. In embodiments, the ear horns may latch to one another to provide increased retention. Such latching may be achieved by magnets on each ear horn, by latching means on the ear horns, and the like. In embodiments, the ear horns may partially or entirely wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head along the sides of the head and/or behind the user's ears. In embodiments, the ear horns may attach to the earphones of the eyepiece, such as the earphones 2104 shown in Fig. 22. The ear horns may be permanently or removably attached to the earphones. In embodiments, as shown in Fig. 180, the ear horns may comprise part of the earphones of the eyepiece, or they may comprise the entire earphone (not shown). In embodiments, the adjuster 17808 may be located at the portion of the ear horn adjacent to the eyepiece, at or near the end of the ear horn passing the user's ear, and/or at any other location of the ear horn and/or eyepiece. In embodiments, as described herein, one or both of the ear horns may be adjustable. In embodiments as described herein, and as shown in Fig. 184, the ear horns (shown by themselves, without the eyepiece) may wrap around the user's head and/or conform to the contour of the user's head.
In embodiments, switchable attractive forces between layers of a laminate may be employed in the ear horns. For example, one or more of the ear horns may comprise a laminate where the attractive force between layers may derive from magnetic, electrostatic and/or vacuum means. In embodiments, magnets may be used such that magnetic poles are rotated into attracting or repelling positions, thereby allowing the layers of the laminate to attract one another so that the ear horn stiffens, or to repel one another so that the ear horn relaxes. In embodiments where the layers of the laminate are in close proximity, a voltage may be applied to produce an electrostatic attraction that may be switched electrically. As the attractive force is produced, the ear horn may stiffen; when the voltage is removed, or the electrostatic attraction is switched off, the ear horn may relax. In embodiments, a vacuum may be produced by laminating two layers together such that they are bonded in one or more portions with a resilience between them, whereby the resilience creates a cavity or space between the layers in which a vacuum is produced. As the layers are held together, they may stiffen the ear horn. The vacuum seal may be broken to allow the ear horn to relax. In embodiments, as the ear horns are stiffened, they may provide a more rigid and/or secure retention of the eyepiece on the user's head. In embodiments, the ear horns may partially or entirely wrap around the user's head or conform to the contour of the user's head, and/or they may be secured along the sides of the head and/or behind the user's ears to the back of the user's head. As the electrostatic voltage, magnetic polarity and/or vacuum is adjusted, the ear horns may be stiffened, allowing them to be secured to the user's head, or they may relax or release so as to be extended and/or folded into a closed position.
In embodiments, one or more of the ear horns may comprise an internal rod and/or wire structure, where each ear horn further comprises a magnet. The magnets of the two ear horns may connect with one another, allowing the two ear horns to wrap around the user's head. The action of the magnets connecting may allow the wire and/or internal rod structure to tighten, thereby providing a more secure fit to the user's head. In embodiments, connecting the magnets may cause the internal rods of the ear horns to become erect, or stiffer, allowing the ear horns to wrap around the user's head, and/or may cause the internal wires of the ear horns to tighten, allowing the ear horns to wrap around the user's head. In embodiments, the ear horns may partially or entirely wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head along the sides of the head and/or behind the user's ears. When the magnets are not connected, the ear horns may be extended and/or folded.
In various embodiments, one or more temple arms may use air pressure in a cavity inside the arm to stiffen the arm. The air pressure may be increased to stiffen the arm; such stiffening allows the arm to grip and/or wrap the user's head while the eyepiece is in use. The temple arms may partially or fully wrap the user's head or conform to its contour, and/or they may be secured to the user's head along its sides and/or behind the user's ears. The air pressure may be lowered to relax the arms, and when relaxed, the arms can be extended and/or folded. The air pressure may be adjusted before or after the eyepiece is placed on, or removed from, the user's head. In various embodiments, the pressure may be adjusted by a pump in the side frame operated by finger pressure or by other means. In various embodiments, the pump may be regulated via a user interface presented in the glasses or via other means.
In the embodiments described herein, the stiffness of a temple arm may vary with the cube of its thickness. As an example, two unbonded layers can be up to twice as stiff as a single layer, but if the layers are bonded into a single layer, the combined layer of double thickness will have eight times the stiffness. As a further example, three separate layers have three times the stiffness of a single layer, but three layers bonded together have twenty-seven times the stiffness of a single layer.
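The scaling above follows from beam-bending theory, in which bending stiffness is proportional to thickness cubed. A brief illustration (a sketch in Python, with stiffness in arbitrary units relative to a single layer):

```python
def relative_stiffness(layers: int, bonded: bool) -> int:
    """Relative bending stiffness vs. a single layer of unit thickness.

    Unbonded layers flex independently, so stiffness adds linearly (n).
    Bonded layers act as one beam of thickness n, so stiffness ~ n**3.
    """
    return layers ** 3 if bonded else layers

# Values quoted in the text:
print(relative_stiffness(2, bonded=False))  # 2  (two free layers)
print(relative_stiffness(2, bonded=True))   # 8  (double thickness, bonded)
print(relative_stiffness(3, bonded=False))  # 3
print(relative_stiffness(3, bonded=True))   # 27
```

This is why interlocking or bonding the arm's layers, as described below, stiffens it far more than stacking them loosely.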
In various embodiments, as shown in Figure 181, one or more temple arms may comprise inner and outer portions, where the inner portion is formed from one part of the arm and the outer portion is formed from another part of the arm. The inner and outer portions may be formed by bifurcating the arm or otherwise forming two separate portions from it, one portion being the outer portion and the other being the inner portion. In various embodiments, the inner portion may contact the user's head, and the outer portion may contact the inner portion. In various embodiments, the inner and outer portions are interlockable, as shown in the embodiment depicted in Figure 182. The inner and outer portions may comprise interlocking slots, teeth, or other means of interlocking or binding them together. The top and/or outer portion may comprise a pull tab or other projection by which the user can unlock the inner and outer portions from one another. In various embodiments, each portion can be bent to the user's head. In addition, the inner surface may press outward against the outer surface. By interlocking the inner and outer portions, the thickness of the arm is doubled; thus, by increasing the thickness of the temple arm, its stiffness can be increased. In various embodiments, doubling the thickness of the arm can raise its stiffness eightfold relative to a single layer. Peeling away the outer layer can return the arm to a flexible state, allowing the arm to be folded. In various embodiments, the temple arms may be secured to the user's head by magnets, clips, hooks, or other means of attachment.
Further, in various embodiments, as depicted in Figure 183, one or more temple arms may comprise three portions. In such embodiments, the arm may comprise inner and outer portions as described with reference to Figures 181 and 182, but this embodiment may also comprise a middle portion 18302, so that the arm is explicitly made up of three portions as in Figure 183. The arm may further comprise one or more buttons, snaps, interlocking slots, teeth, pins, or other means of locking the arm portions together. One or more pull tabs or other projections on each portion may allow the user to release the teeth or other locking means so that the inner and outer portions no longer lock together. In various embodiments, three unconnected layers may have three times the stiffness of a single layer, but when the three layers are locked or bonded together, the arm may have twenty-seven times the stiffness of a single layer. When the three portions are not connected or locked together, the arm can be flexible, so that it can be extended and/or folded. Further, when the portions are not locked together they can slide relative to one another, allowing the arm to be flexible and easier to store when not in use, whereas when the layers are locked or pinned together they cannot slide. The portions of the arm may reside in a sheath, tube, or other structure containing the arm, so that the individual portions are not visible. Although temple arms with two and three portions have been described, those skilled in the art will appreciate that in various embodiments an arm may be formed from more than three portions and/or from portions of varying thickness.
In the embodiments described herein, wrapping temple arms are foldable. When the arms are folded to the closed position (as when the user is not using the eyepiece), the arms may be straightened so that their folding is not impeded by their ability to wrap the user's head and/or ears or to conform to the contour of the user's head and/or ears. In the embodiments described herein, the arms can thus be folded and straightened, allowing the eyepiece to lie flat and be stored in a flat configuration. In various embodiments, when released at a hinge or by other means, the arms can straighten, allowing the eyepiece to be folded. As described herein, in various embodiments the arms can be made less rigid to allow them to fold.
In various embodiments, leveling pads may be used on one or more temple arms so that the arms can adjust for, or accommodate, different positions of the user's ears or eyes, such as ears at different vertical positions. In various embodiments, pads may be placed at different positions on the arm at its point of contact with the user's ear to adjust the eyepiece to fit the user's ears and/or eyes. In various embodiments, the leveling pads may be adjusted by wedges or by various other means. A leveling pad may be part of the temple arm, or it may be attached to the arm by a clip, adhesive, friction, or other means.
In the embodiments described herein, the eyepiece and temple arms may be fitted with closed-cell foam on one or more regions that contact the user. The foam can provide comfort to the user while preventing moisture and sweat from penetrating the foam. In addition, closed-cell foam provides a non-porous surface that prevents the eyepiece from carrying bacteria, microorganisms, and other organisms and prevents their growth. In the embodiments described herein, the foam may be antimicrobial and/or antibacterial, and/or may be treated with materials for such purposes.
In one embodiment, the eyepiece may include security features, such as M-Shield Security, secure content, DSM, secure runtime, IPsec, and the like. Other software features may include: user interface, applications, framework, BSP, codecs, integration, testing, system validation, and the like.
In one embodiment, the eyepiece materials may be selected to achieve ruggedization.
In one embodiment, the eyepiece may be able to access a wireless access point that includes 3G, 802.11b, and Bluetooth connectivity, so that data can hop from one device to a 3G-enabled eyepiece embodiment.
The disclosure also relates to methods and apparatus for capturing biometric data about individuals. The methods and apparatus provide wireless capture of an individual's fingerprints, iris pattern, facial structure, and other unique biometric features, and then transmit the data to a network or directly to the eyepiece. The data collected from an individual may also be compared with previously collected data and used to identify a particular individual.
In various embodiments, the eyepiece 100 may be associated with a mobile biometric device such as a biometric flashlight 7300, a biometric phone 8000, a biometric camera, a pocket biometric device 5400, an arm-strap biometric device 5600, or the like, where the mobile biometric device may operate as a stand-alone device or may communicate with the eyepiece, such as for control of the device, display of data from the device, storage of data, linking to external systems, linking to other eyepieces and/or other mobile biometric devices, and the like. A mobile biometric device may enable a soldier or other user to collect biometric data from an individual or to profile an individual against existing biometric data. The device may provide tracking, surveillance, and collection of biometric records including, for example, video, voice, gait, face, and iris biometric features. The device may geo-locate and tag the collected data, e.g., with time, date, location, collector, environment, and the like. The device may be able to capture and record fingerprints, palm prints, scars, marks, tattoos, audio, video, annotations, and the like, such as by using thin-film sensors to record, collect, tag, and verify faces, fingerprints, irises, latent fingerprints, latent palm prints, voice, pocket litter, and other identifying marks and environmental data. The device may be able to read wet or dry prints. The device may include a camera, for example with IR illumination, UV illumination, and the like, with the ability to see through dust, smoke, haze, and so forth. The camera may support extended dynamic range, adaptive defect-pixel correction, advanced sharpness enhancement, geometric distortion correction, advanced color management, hardware-based face detection, video stabilization, and the like. In various embodiments, the camera output may be transmitted to the eyepiece for presentation to the soldier. Depending on requirements, the device may house multiple other sensors, as described herein, including accelerometers, a compass, an ambient light sensor, a proximity sensor, a barometric pressure sensor, a temperature sensor, and the like. The device may also have a mosaic fingerprint sensor as described herein, producing high-resolution images of the ridges and flow lines of an individual fingerprint, or of multiple fingerprints, palm prints, and the like simultaneously. A soldier may use the mobile biometric device to collect personal information more conveniently, for example for Document and Media Exploitation (DOMEX). For example, during an interview, enrollment, interrogation, or the like, the operator may photograph and read identification data or "pocket litter" (such as passports, ID cards, personal documents, cell-phone directories, photos), collect biometric data, and enter it into a profile of a person of interest, and that profile may be imported into a searchable secure database. In various embodiments, manually entered biometric data may be filed with a particular image, enabling complete data capture. Data may be automatically geo-located, time/date tagged, and filed into a digital archive, such as with a locally or network-assigned Globally Unique Identifier (GUID). For example, a facial image may be captured at an IED (improvised explosive device) blast scene, a left iris image may be captured at a suicide-bombing scene, and a latent fingerprint may be lifted from a sniper rifle, each collected with a different mobile biometric device at a different place and time, and a person of interest may be identified from the combined inputs, for example at a random vehicle checkpoint.
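The record structure described above — each capture automatically tagged with a GUID, a time/date, and a geo-location — could be sketched as follows (Python; the field names and coordinates are illustrative assumptions, not part of the disclosure):

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BiometricRecord:
    """One archived capture: a modality plus automatic provenance tags."""
    modality: str        # e.g. "face", "iris", "latent_fingerprint"
    payload: bytes       # raw image/audio data
    latitude: float      # geo-location tag
    longitude: float
    collector: str       # who collected the sample
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Three captures from different scenes, later linkable by matching payloads:
scene_records = [
    BiometricRecord("face", b"...", 34.52, 69.18, "unit-a"),
    BiometricRecord("iris", b"...", 34.55, 69.21, "unit-b"),
    BiometricRecord("latent_fingerprint", b"...", 34.50, 69.10, "unit-c"),
]
assert len({r.guid for r in scene_records}) == 3  # each record gets a unique ID
```

A matching engine could then correlate the three independently collected records into one person-of-interest profile, as in the checkpoint example above.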
Further embodiments of the eyepiece may be used to provide biometric data collection and result reporting. The biometric data may be visual biometric data, such as facial biometric data or iris biometric data, or may be audio biometric data. Figure 39 depicts an embodiment providing biometric data capture. Assembly 3900 incorporates the eyepiece 100 discussed above with respect to Figure 1. Eyepiece 100 provides an interactive head-mounted eyepiece that includes an optical assembly. Other eyepieces providing similar functionality may also be used. The eyepiece may also incorporate global positioning system capability to allow location information to be displayed and reported.
The optical assembly allows the user to view the surrounding environment, including individuals near the wearer. An embodiment of the eyepiece allows the user to biometrically identify nearby individuals using facial images and iris images, or facial and iris images together with audio samples. The eyepiece incorporates a corrective element that corrects the user's view of the surrounding environment and also displays content provided to the user through an integrated processor and image source. The integrated image source introduces the content to be displayed to the user into the optical assembly.
The eyepiece also includes an optical sensor for capturing biometric data. In one embodiment, the integrated optical sensor may incorporate a camera mounted on the eyepiece. This camera is used to capture biometric images of individuals near the eyepiece user. The user points the optical sensor or camera toward a nearby individual by positioning the eyepiece in the appropriate direction, which may be done simply by looking at the individual. The user may select whether to capture one or more of a facial image, an iris image, or an audio sample.
The biometric data capturable by the eyepiece shown in Figure 39 includes facial images for facial recognition, iris images for iris recognition, and audio samples for voice identification. The eyepiece 3900 incorporates multiple microphones 3902 in an end-fire array disposed along both the right and left temples of the eyepiece. The microphone arrays 3902 are specifically tuned to allow capture of human speech in environments with high levels of ambient noise. The microphones may be directional, steerable, and switchable. The microphones 3902 provide selectable options for improved audio capture, including omnidirectional operation or directional-beam operation. Directional-beam operation allows the user to record an audio sample from a particular individual by steering the microphone array toward the target individual. An adaptive microphone array may be created that allows the operator to steer the directivity of the array in three dimensions, where the directional beam may be adjusted in real time to maximize the signal from, or minimize interfering noise at, a moving target. Array processing may allow summation of cardioid elements by analog or digital means, with switching between omnidirectional and directional array operation. In various embodiments, beam forming, array steering, adaptive array processing (speech-source localization), and the like may be performed by the on-board processor. In one embodiment, the microphones may be able to make directional recordings with 10 dB of gain.
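Steering an end-fire array toward a target, as described above, is conventionally done with a delay-and-sum beamformer: each channel is time-shifted so that a wave arriving from the steering direction adds coherently. A minimal sketch (Python, assuming ideal far-field plane waves and whole-sample delays; the parameters are illustrative, not from the disclosure):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def delay_and_sum(channels, mic_spacing, sample_rate, steer_angle_deg):
    """Average N microphone channels after shifting each so that a plane
    wave arriving from steer_angle_deg (0 deg = end-fire) is time-aligned."""
    n = len(channels)
    length = len(channels[0])
    out = [0.0] * length
    for i, ch in enumerate(channels):
        # propagation delay of mic i relative to mic 0, in samples
        tau = (i * mic_spacing *
               math.cos(math.radians(steer_angle_deg)) / SPEED_OF_SOUND)
        shift = round(tau * sample_rate)
        for t in range(length):
            if 0 <= t + shift < length:
                out[t] += ch[t + shift] / n  # advance the lagging channels
    return out
```

Signals from the steered direction reinforce, while off-axis noise adds incoherently and is attenuated — the basis of the "directional beam" operation described above.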
Audio biometric capture is enhanced by incorporating phased-array audio and video tracking to follow the audio and video capture. Audio tracking allows an audio sample to be captured continuously while the target individual moves through an environment containing other noise sources. In various embodiments, the user's own voice may be removed from the track to give a clearer reproduction of the target individual, for example to better distinguish what was said, to provide better position tracking, to provide better audio tracking, and the like.
To power the display optics and the biometric data collection, the eyepiece 3900 also incorporates a lithium-ion battery 3904 that can operate for more than twelve hours on a single charge. In addition, the eyepiece 100 incorporates a processor and solid-state memory 3906 for processing the captured biometric data. The processor and memory are configurable to work with any software or algorithm used as part of a biometric capture protocol or format, such as the .wav format.
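As an illustration of the .wav capture format mentioned above, a captured 16-bit mono voice sample could be written to solid-state storage as follows (Python standard library; the sample rate and file location are illustrative assumptions):

```python
import os
import struct
import tempfile
import wave

def save_sample(path, samples, sample_rate=16000):
    """Write a list of 16-bit signed PCM samples as a mono .wav file."""
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)          # mono voice capture
        wf.setsampwidth(2)          # 2 bytes per sample = 16-bit PCM
        wf.setframerate(sample_rate)
        wf.writeframes(struct.pack(f"<{len(samples)}h", *samples))

path = os.path.join(tempfile.gettempdir(), "voice_sample.wav")
save_sample(path, [0, 1000, -1000, 0])  # a tiny illustrative capture
```

The resulting file carries its own format header, so the remote biometric database can decode it without side-channel metadata.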
A further embodiment of the eyepiece assembly 3900 provides integrated communication capability for transmitting the captured biometric data to a remote device that stores the biometric data in a biometric database. The biometric database interprets the captured biometric data, analyzes it, and prepares content for display on the eyepiece.
In operation, a wearer wanting to capture biometric data from a nearby individual positions himself or herself so that the individual appears in the eyepiece's field of view. Once in position, the user initiates capture of the biometric information. The biometric information that can be captured includes iris images, facial images, and audio data.
In operation, a wearer wanting to capture audio biometric data from a nearby individual positions himself or herself so that the individual is near the eyepiece, and specifically near the microphone arrays located in the eyepiece temples. Once in position, the user initiates capture of the audio biometric information. This audio biometric information consists of a recorded sample of the target individual's voice. Audio samples may be captured together with visual biometric data, such as iris and facial images.
To capture an iris image, the wearer/user observes the desired individual and positions the eyepiece so that the optical sensor assembly or camera can collect an image of the desired individual's biometric parameters. Once the image is captured, the eyepiece processor and solid-state memory prepare the captured image for transmission to a remote computing facility for further processing.
The remote computing facility receives the transmitted biometric image and compares it with previously captured biometric data of the same type. Iris or facial images are compared with previously collected iris or facial images to determine whether the individual has been encountered and identified before.
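Iris comparison of the kind described above is commonly implemented by computing the fractional Hamming distance between binary iris codes; two captures of the same eye differ in only a few bits, while different eyes differ in roughly half. A simplified sketch (Python; the 32-bit code length and the 0.32 threshold are illustrative assumptions — the disclosure does not specify a matching algorithm):

```python
def hamming_distance(code_a: int, code_b: int, bits: int) -> float:
    """Fraction of differing bits between two fixed-length binary iris codes."""
    return bin((code_a ^ code_b) & ((1 << bits) - 1)).count("1") / bits

def matches(probe: int, gallery: dict, bits: int = 32, threshold: float = 0.32):
    """Return identities whose enrolled code is within the match threshold."""
    return [name for name, code in gallery.items()
            if hamming_distance(probe, code, bits) <= threshold]

gallery = {"subject-17": 0b1011_0110_0101_1100_1011_0110_0101_1100}
probe = gallery["subject-17"] ^ 0b101  # same eye, 2 bits of capture noise
print(matches(probe, gallery))  # ['subject-17']
```

A real system would use far longer codes (thousands of bits) and mask out occluded regions, but the comparison step is the same.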
Once the comparison is made, the remote computing facility transmits a report of the comparison to the wearer/user's eyepiece for display. The report may indicate that the captured biometric image matches a previously captured image; in that case, the user receives a report including the individual's identity and other identifying information or statistics. Not all captured biometric data allows an unambiguous determination of identity. In such cases, the remote computing facility provides a report of its findings and may request that the user collect additional biometric data, possibly of a different type, to aid the identification and comparison process. Visual biometric data may be supplemented with audio biometric data as a further aid to identification.
Facial images are captured in a manner similar to iris images. Because of the size of the collected image, the field of view is necessarily larger. This also allows the user to stand farther from the subject whose facial biometric data is being captured.
In operation, the user may have initially captured the individual's facial image. However, the facial image may be incomplete or inconclusive, because the individual may be wearing clothing or other apparel, such as a hat, that obscures facial features. In this case, the remote computing facility may request that a different type of biometric capture be used and that additional images or data be transmitted. In the case described above, the user may be directed to obtain an iris image to supplement the captured facial image. In other instances, the additional requested data may be an audio sample of the individual's voice.
Figure 40 illustrates capturing an iris image for iris recognition. The figure illustrates the focus parameters used to analyze the image and includes the individual's geographic location at the time of biometric capture. Figure 40 also depicts a sample report displayed on the eyepiece.
Figure 41 illustrates capture of multiple types of biometric data, in this example facial and iris images. The capture may be performed simultaneously, or at the request of the remote computing facility when biometric data of a first type yields an inconclusive result.
Figure 42 shows the electrical configuration of the multiple microphone arrays contained in the temples of the eyepiece of Figure 39. The end-fire microphone arrays allow greater discrimination of the signal and better directionality at greater distances. Signal processing is improved by incorporating a delay into the transmission line of the rear microphone. The use of dual omnidirectional microphones enables switching from an omnidirectional microphone to a directional microphone, allowing better direction finding for audio capture of a desired individual. Figure 43 illustrates the directionality improvements obtainable with different microphone arrangements.
As shown in the top portion of Figure 43, a single omnidirectional microphone may be used. The microphone may be placed at a given distance from the sound source, and the sound pressure or directivity index (DI) at the microphone will be at a given dB level. Instead of a single microphone, multiple microphones or a microphone array may be used. For example, two microphones may be placed at twice the distance from the source, a distance factor of 2, with a sound pressure increase of 6 dB. Alternatively, four microphones may be used at a distance factor of 2.7, with a sound pressure increase of 8.8 dB. Arrays may also be used: for example, an eight-microphone array may be used at a distance factor of 4 with a DI increase of 12 dB, and a twelve-microphone array may be used at a distance factor of 5 with a DI increase of 13.2 dB. The graph of Figure 43 depicts the points from which a given sound pressure level produces the same signal level at the microphone. As shown in Figure 43, a first-order supercardioid microphone may be used at the same distance, in this example with a 6.2 dB increase, as may a second-order microphone. Multiple microphones may be arranged in a composite microphone array. Instead of capturing an audio sample with a single standard high-quality microphone, the eyepiece temple pieces house multiple microphones of different characteristics. This may be used, for example, when a biometric fingerprint of someone's voice is being generated for future capture and comparison. One example of multiple-microphone use employs microphones taken from cell phones to reproduce the exact electrical and acoustic characteristics of an individual's voice. The sample is stored in a database for future comparison. If the individual's voice is later captured, the earlier sample is available for comparison, and because the acoustic characteristics of the two samples will match, a match will be reported to the eyepiece user.
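The distance factors above reflect the inverse-distance law for sound pressure, which falls by 6 dB per doubling of distance, so an array needs roughly 20·log10(distance factor) dB of directivity gain to hold the same signal level at a greater range. A quick check of the quoted figures (Python; note the text's 8.8 dB and 13.2 dB values differ slightly from this ideal law, presumably reflecting measured rather than ideal DI):

```python
import math

def required_gain_db(distance_factor: float) -> float:
    """DI gain (dB) needed to offset standing distance_factor times farther
    from the source, per the 20*log10 inverse-distance law."""
    return 20 * math.log10(distance_factor)

for mics, factor in [(2, 2.0), (4, 2.7), (8, 4.0), (12, 5.0)]:
    print(f"{mics:2d} mics, distance factor {factor}: "
          f"{required_gain_db(factor):.1f} dB")
# 2 mics, factor 2.0 -> 6.0 dB; 4 mics, factor 2.7 -> 8.6 dB;
# 8 mics, factor 4.0 -> 12.0 dB; 12 mics, factor 5.0 -> 14.0 dB
```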
Figure 44 illustrates the use of adaptive arrays to improve audio data capture. By modifying pre-existing audio-processing algorithms, an adaptive array can be created that allows the user to steer the directivity of the array in three dimensions. Adaptive array processing allows the source of speech to be located, thereby associating the captured audio data with a specific individual. Array processing allows simple summation of the cardioid elements of the signal to be performed either digitally or by analog techniques. In normal use, the user can switch the microphones between omnidirectional and directional-array modes. The processor allows beam forming, array steering, and adaptive array processing to be performed on the eyepiece. In various embodiments, an audio phased array may be used for audio tracking of a particular individual. For example, the user may lock onto an individual's audio signature in the surrounding environment (such as one acquired in real time or drawn from a database of voice signatures) and track that individual's position without maintaining eye contact or moving their head. The individual's position may be projected to the user on the eyepiece display. In various embodiments, tracking of an individual may also be provided through an embedded camera in the eyepiece, where the user is likewise not required to maintain eye contact with the individual or to move their head. That is, with either audio or visual tracking, the eyepiece may be able to track an individual in the local environment without the user making any physical motion that indicates the tracking, even while the user moves their own line of sight.
In one embodiment, the integrated camera may continuously record a video file and the integrated microphone may continuously record an audio file. The eyepiece's integrated processor may enable event tagging within long sections of continuous audio or video recording. For example, a full day of passive recording may be tagged whenever an event, conversation, encounter, or other item of interest occurs. Tagging may be accomplished by an explicit button press, a noise or physical tap, a gesture, or any other control technique described herein. A marker may be placed in the audio or video file or stored in a metadata header. In various embodiments, the marker may include the GPS coordinates of the event, conversation, encounter, or other item of interest. In other embodiments, the marker may be time-synchronized with a GPS log of the day. Other logic-based triggers may also tag the audio or video file, such as proximity to other users, devices, locations, and the like. An event tag may be an active event tag manually activated by the user, a passive event tag that occurs automatically (such as by pre-programming, through an event profile manager, or the like), a location-sensitive tag triggered by the user's position, and the like. The event that triggers an event tag may be triggered by a sound, a sight, a visual marker, reception over a network connection, a light trigger, a sound trigger, a proximity trigger, a time trigger, a geospatial trigger, and the like. An event trigger may generate feedback to the user (e.g., an audio tone, a visual indicator, a message), store information (e.g., an entry in a file, document, list, audio file, video file, or the like), generate and transmit information, and so forth.
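Placing such markers in a metadata header, keyed to the recording timeline and to GPS coordinates, could be sketched as follows (Python; the tag schema, trigger names, and coordinates are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class EventTag:
    """One marker in a continuous recording's metadata header."""
    media_offset_s: float  # position within the day-long recording
    trigger: str           # "button", "gesture", "proximity", "geospatial", ...
    latitude: float        # GPS coordinates at the moment of tagging
    longitude: float

def tags_near(tags, offset_s, window_s=30.0):
    """Find tags within window_s seconds of a playback position, so the
    interesting moments of a long passive recording can be jumped to."""
    return [t for t in tags if abs(t.media_offset_s - offset_s) <= window_s]

day_log = [
    EventTag(3600.0, "button", 38.89, -77.03),     # active, user-pressed
    EventTag(7205.5, "proximity", 38.90, -77.04),  # passive, auto-triggered
]
print([t.trigger for t in tags_near(day_log, 7200.0)])  # ['proximity']
```

Keeping the tags in a header (rather than rewriting the media stream) lets a full day of recording stay untouched while review software seeks directly to the tagged offsets.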
In one embodiment, the eyepiece may be used as SigInt (signals intelligence) glasses. Using one or more of the integrated WiFi, 3G, or Bluetooth radios, the eyepiece may be used to unobtrusively and passively collect signals intelligence from devices and individuals near the user. Signals intelligence may be collected automatically, or collection may be triggered automatically when a particular device ID is within range, when a particular audio sample is detected, when a particular geographic location is reached, and so forth.
Various embodiments of the tactical glasses may include stand-alone identification or collection of biometric information; geo-location of a person of interest (POI) from a safe distance using visual biometric information (face, iris, walking gait); and positive identification of a POI using robust sparse-recognition algorithms for face and iris. The glasses may include a hands-free display serving as a biometric computer interface, so that fingerprint and visual biometric information are merged onto one integrated display (with enhanced target emphasis) and matches and alerts can be reviewed without alerting the POI. The glasses may include location awareness, such as displaying current and average speed plus route and ETA (estimated time of arrival) to a destination, and preloading or recording danger points and exfiltration routes. The glasses may include real-time networking with blue- and red-force tracking, so that the positions of friendly forces are always known, the separation range between blue and red forces can be visualized, and the enemy can be geo-located with their positions shared in real time. The processor associated with the glasses may include OCR conversion and speech-conversion capabilities.
The tactical glasses may be used in combat to provide a graphical user interface projected on the lens, giving the user directions and augmented-reality data about such things as: team-member position data, map information for the area, SWIR/CMOS night vision, soldier vehicle situational awareness (S/A), a geo-locating laser range finder for geo-locating a POI typically with an accuracy of better than 2 meters, or targets at more than 500 meters, blue-force S/A range rings, DOMEX registration, AR battlefield repair overlays, and real-time UAV video. In one embodiment, the laser range finder may be a 1.55-micron eye-safe laser range finder.
The eyepiece may use GPS and inertial navigation (e.g., using an inertial measurement unit) as described herein, such as those described herein, to provide position and orientation accuracy. However, the eyepiece may use additional sensors and associated algorithms to enhance position and orientation accuracy, such as a three-axis digital compass, an inclinometer, accelerometers, gyroscopes, and the like. For example, military operations may demand greater position accuracy than is obtainable from GPS alone, and other navigation sensors may therefore be combined with GPS to augment its position accuracy.
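One common way to combine such sensors is a complementary filter, in which a smooth but drifting gyroscope heading is continually corrected toward a noisy but absolute compass reading. A one-axis sketch (Python; the blending constant and sensor values are illustrative assumptions, not from the disclosure):

```python
def complementary_heading(gyro_rates, compass_readings, dt=0.01, alpha=0.98):
    """Fuse gyroscope rate (deg/s, smooth but drifting) with compass
    heading (deg, absolute but noisy) into one heading estimate."""
    heading = compass_readings[0]  # initialize from the absolute sensor
    for rate, compass in zip(gyro_rates, compass_readings):
        integrated = heading + rate * dt                      # short term: gyro
        heading = alpha * integrated + (1 - alpha) * compass  # long term: compass
    return heading

# Stationary device: compass steady at 90 deg, gyro reporting a small drift:
est = complementary_heading([0.1] * 500, [90.0] * 500)
print(round(est, 1))  # ~90.0 -- the gyro drift is held in check by the compass
```

The same structure extends to fusing GPS position with accelerometer-derived motion: the inertial path supplies high-rate detail while the absolute fix bounds the long-term error.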
The tactical glasses may feature enhanced resolution, such as 1280 × 1024 pixels, and automatic focusing.
In the task of engaging and capturing enemy forces, winning low-intensity, low-density, asymmetric warfare depends on effective information management. The tactical glasses system delivers the ES2 ("every soldier is a sensor") capability by combining unobtrusive data recording with an intuitive tactical display that gives a comprehensive picture for situational awareness.
In various embodiments, the tactical glasses may include one or more waveguides integrated into the frame. In certain embodiments, total-internal-reflection lenses in a monocular or binocular flip-up/flip-down configuration are affixed to a pair of ballistic glasses. The tactical glasses may include omnidirectional earbuds for enhanced hearing and protection, and a noise-canceling boom microphone for relaying differentiated voice commands.
In certain embodiments, the waveguide may have contrast control. Contrast may be controlled by any control technique described herein, such as gesture control, automatic sensor control, or manual control using a controller mounted on the temple.
The tactical glasses may include a non-slip, adjustable elastic head strap. The tactical glasses may include clip-in corrective lenses.
In certain embodiments, a total-internal-reflection lens is affixed to a helmet-mounted device, shown in Figure 74, and may include a day/night VIS/NIR/SWIR CMOS color camera. The device uses a "see-through", flip-up electro-optic projected image display, allowing an unobstructed "field of view" of threats and of the soldier's own weapon. The helmet-mounted device shown in Figure 74A may include an IR/SWIR illuminator 7402, a UV/SWIR illuminator 7404, a visible-to-SWIR wide-angle lens 7408, a visible-to-SWIR objective lens (not shown), a transparent viewing pane 7410, an iris-recognition objective lens 7412, a laser emitter 7414, a laser receiver 7418, or any other sensor, processor, or technology described herein with respect to the eyepiece, such as an integrated IMU, an eye-safe laser range finder, an integrated GPS receiver, a compass and inclinometer for positional accuracy, see-through control that changes the viewing angle of the image to match eye position, electronic image stabilization and real-time enhancement, a threat library stored on-board or remotely and accessed over the tactical network, and the like. A body-worn wireless computer may interoperate with the device of Figure 74. The helmet-mounted device includes visible-to-SWIR projector optics, such as RGB micro-projector optics. Multispectral IR and UV imaging helps identify counterfeit or altered documents. The helmet-mounted device may be controlled with an encrypted wireless UWB wristband or a weapon fore-grip controller.
In one embodiment, the transparent viewing pane 7410 can be rotated 180° to project the image onto a surface for sharing with others.
Figure 74 B shows the decomposition side view that is arranged on the equipment on the helmet.This equipment can comprise the ambidextrous base on left side or the right side for being arranged on the helmet.In certain embodiments, two equipment can be installed on the left and right sides of the helmet, to allow binocular vision.This equipment or two equipment can interlock enter MICH or the PRO-TECH helmet base of standard.
Today, soldiers cannot effectively exploit data devices on the battlefield. The tactical glasses system combines a low-profile form factor, lightweight materials, and a fast processor to enable rapid, accurate decision-making in the field. The system's modular design allows the equipment to be deployed efficiently to an individual, squad, or company while retaining the ability to interoperate with any battlefield computer. The tactical glasses system incorporates real-time data communication. Using the on-board computer interface, an operator can view, upload, or compare data in real time. Valuable situational and environmental data can thus be rapidly propagated to all networked individuals, command posts (CP), and tactical operations centers (TOC).
Figures 75A and 75B depict an exemplary embodiment of biometric and situational-awareness glasses in front and side views, respectively. This embodiment can include multiple field-of-view sensors 7502 that collect biometric information for a situational-awareness and augmented-view user interface; a fast-lock GPS receiver and IMU (including a 3-axis digital compass, gyroscope, accelerometer, and inclinometer for positional and directional accuracy); a 1.55-micron eye-safe laser rangefinder 7504 that assists biometric capture and targeting; an integrated digital video recorder storing to two flash SD cards; real-time electronic image stabilization and real-time image enhancement; a threat library stored on an on-board micro SD card or loaded remotely over a tactical network; flip-up photochromic lenses 7508; a flexible noise-canceling boom microphone 7510; and removable stereo earbuds with augmented hearing and hearing protection. For example, the multiple field-of-view sensors 7502 can provide a 100° × 40° FOV, which can be a panoramic SXGA view. For example, the sensors can be a VGA sensor, an SXGA sensor, and a VGA sensor whose outputs are stitched into a panoramic SXGA view of the 100° × 40° FOV on the glasses' display. The display can be translucent with perspective control that changes the viewing angle of the image to match eye position. This embodiment can also include SWIR detection so the wearer can see 1064nm and 1550nm laser designators invisible to the enemy, and can feature ultra-low-power 256-bit AES-encrypted connections among the glasses, tactical radio, and computer; instant 2× magnification; automatic face tracking; face and iris recording and recognition with a 1-meter automatic identification range; and GPS geo-location. This embodiment can include a power supply, such as 4 AA alkaline batteries, lithium batteries, or a 24-hour rechargeable battery pack, and its computer and memory expansion slots have waterproof and dustproof seals. In one embodiment, the glasses comprise a curved holographic waveguide.
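The panoramic stitching described above can be illustrated with a simple field-of-view budget: the stitched panorama spans the sum of the per-sensor horizontal FOVs minus the angles shared at each seam. The sketch below is a minimal illustration under assumed values; the individual sensor FOVs, seam overlap, and pixels-per-degree figure are not specified in the text and are chosen here only so that a 100° panorama lands near an SXGA-class width.

```python
# Sketch: budgeting three sensor fields of view into one panoramic frame,
# as in the 100-degree x 40-degree panoramic view described above.
# All per-sensor FOVs, overlaps, and pixel densities are assumptions.

def panorama_width_px(sensor_fovs_deg, overlap_deg, px_per_deg):
    """Total panorama width in pixels after removing the overlapped seams."""
    total_deg = sum(sensor_fovs_deg) - overlap_deg * (len(sensor_fovs_deg) - 1)
    return round(total_deg * px_per_deg)

# Three sensors whose horizontal FOVs tile a 100-degree panorama with
# 10 degrees of overlap at each of the two seams (assumed geometry).
fovs = [40, 40, 40]   # degrees per sensor (assumed)
overlap = 10          # degrees shared at each seam (assumed)
ppd = 12.8            # pixels per degree, so 100 degrees maps to 1280 px

print(panorama_width_px(fovs, overlap, ppd))  # 1280
```

The same budget generalizes to any sensor count: each added sensor contributes its FOV minus one seam's overlap.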
In each embodiment, the eyepiece may be able to sense lasers in use on the battlefield, such as targeting lasers. For example, sensors in the eyepiece may detect lasers in typical military laser transmission bands (e.g., 1064nm, 1550nm). In this way, the eyepiece may detect whether the wearer's own position is being targeted, whether another position is being targeted, the position of a spotter using a laser as a targeting aid, and so on. Further, because the eyepiece may sense lasers either directly or by reflection, a soldier can not only detect enemy laser sources directed or reflected toward his position, but can also use a laser source to locate optical surfaces (such as eyes) in the battlefield scene. For example, a soldier can scan the battlefield with a laser and watch through the eyepiece for laser reflections returning from the possible position of an enemy observing through optics. In each embodiment, the eyepiece may continuously scan the surrounding environment for lasers and provide feedback and/or take action based on the detection results, such as an audible alarm to the soldier, a position indicated by a visual indicator on the eyepiece display, and the like.
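The band-detection-and-alert behavior above can be sketched as a simple classifier: compare a sensed wavelength against known military laser bands within a tolerance, then format an alert for the display or audio channel. The band labels, tolerance, and alert text below are illustrative assumptions, not details from the text; only the 1064nm and 1550nm band centers come from the passage above.

```python
# Sketch: classifying a sensed laser return against the military laser
# bands mentioned above (1064 nm, 1550 nm) and producing an alert string.
# Labels, tolerance, and message format are assumed for illustration.

MILITARY_BANDS_NM = {1064: "Nd:YAG designator", 1550: "eye-safe rangefinder"}
TOLERANCE_NM = 15  # assumed wavelength resolution of the sensor

def classify_laser(wavelength_nm):
    """Return a band label if the wavelength falls in a known band, else None."""
    for center, label in MILITARY_BANDS_NM.items():
        if abs(wavelength_nm - center) <= TOLERANCE_NM:
            return label
    return None

def alert_for(wavelength_nm, bearing_deg):
    """Format a display/audio alert for a detection, or None if benign."""
    label = classify_laser(wavelength_nm)
    if label is None:
        return None
    return f"LASER ({label}) detected, bearing {bearing_deg:03d}"

print(alert_for(1062, 45))  # LASER (Nd:YAG designator) detected, bearing 045
print(alert_for(700, 45))   # None
```

A real implementation would run this per scan sample and debounce repeated detections before alerting the wearer.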
In certain embodiments, a pocket camera can record video and capture pictures, employed as a mobile, lightweight, rugged biometric device sized to fit in a pocket, allowing an operator to record environmental data for analysis. One embodiment can be 2.25" × 3.5" × 0.375", can perform face capture at 10 feet and iris capture at 3 feet, and can record voice, pocket litter, walking gait, and other identifying marks and environmental data in EFTS- and EBTS-compliant formats compatible with any iris/face algorithm. The device is designed to pre-qualify and capture images compliant with EFTS/EBTS/NIST/ISO/ITL 1-2007, for matching and filing by any biometric matching software or user interface. The device can include an HD video chip, a 1GHz processor with a 533MHz DSP, a GPS chip, active illumination, and pre-qualification algorithms. In certain embodiments, the pocket bio cam is not tied to a biometric watch list, so it can be used by auxiliary police forces and/or acted upon at all echelons. Data can be automatically geo-located and date/time stamped. In certain embodiments, the device can run the Linux SE operating system, meet the MIL-STD-810 (Military Standard 810) environmental standard, and be waterproof to a depth of 3 feet (about 1 meter).
In one embodiment, a device for fingerprint collection can be known as a bio-print device. The bio-print device comprises a transparent platen with two beveled edges. The platen is illuminated by a bank of LEDs and imaged by one or more cameras. Multiple cameras are used, closely arranged and pointed at the beveled edge of the platen. A finger or palm is placed on and pressed against the upper surface of the platen, and the cameras capture the ridge pattern. The image is recorded using frustrated total internal reflection (FTIR). In FTIR, light escapes the platen through the air gaps formed by the ridges and valleys of the finger or palm pressed against the platen.
Other embodiments are also possible. In one embodiment, the multiple cameras are placed in an inverted-"V" sawtooth arrangement. In another embodiment, a rectangle is formed and light is passed directly through one side, with a camera array capturing the resulting image. The light enters through one side of the rectangle, and the cameras are located directly under the rectangle, so the cameras can capture the ridges and valleys illuminated by the light passing through the rectangle.
After the images are captured, software stitches the images from the multiple cameras together. A custom FPGA can perform the digital image processing.
Once captured and processed, the images can be streamed to a remote display, such as a smart phone, computer, handheld device, or eyepiece, or to another device.
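The multi-camera stitching step above can be sketched as trimming the columns each tile shares with its left neighbor before concatenation. This is a minimal illustration under assumed tile sizes and a fixed, known overlap; a real sensor pipeline (for example, the FPGA path mentioned above) would also correct lens distortion and blend the seams.

```python
# Sketch: stitching image tiles from adjacent cameras by dropping the
# overlapped columns, roughly as in the multi-camera stitching described
# above. Tile dimensions and overlap are illustrative assumptions.
import numpy as np

def stitch_row(tiles, overlap_px):
    """Concatenate same-height tiles left to right, dropping the columns
    each tile shares with its left-hand neighbor."""
    parts = [tiles[0]] + [t[:, overlap_px:] for t in tiles[1:]]
    return np.hstack(parts)

h, w, ov = 4, 10, 2
tiles = [np.full((h, w), i) for i in range(3)]  # three 4x10 dummy tiles
mosaic = stitch_row(tiles, ov)
print(mosaic.shape)  # (4, 26): 10 + 8 + 8 columns
```

With per-camera registration data, `overlap_px` would be measured per seam rather than assumed constant.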
The description above provides an overview of the operation of the methods and apparatus of the present disclosure. Additional description and discussion of these and other embodiments is provided below.
Figure 45 illustrates the structure and layout of an optics-based fingerprint and palm-print system according to an embodiment. The optical array consists of approximately 60 wafer-scale cameras 4502. The optics-based system uses sequenced perimeter illumination 4503, 4504 for high-resolution imaging of the whorls and ridge lines that form a fingerprint or palm print. This configuration provides a low-profile, lightweight, extremely rugged arrangement. Durability is enhanced by a scratch-resistant, transparent platen.
The mosaic print sensor uses a frustrated total internal reflection (FTIR) optical faceplate to present the image to an array of wafer-scale cameras mounted on a PCB-like substrate 4505. The sensor can be scaled to any flat width and length, with a depth of approximately 1/2". The size can vary from a plate small enough to capture just one rolled fingerprint to a plate large enough to capture prints of both hands simultaneously.
The mosaic print sensor allows the operator to capture prints and compare the collected data against an on-board database. Data can also be uploaded and downloaded wirelessly. The unit can operate as a stand-alone unit or can be integrated with any biometric system.
In operation, the mosaic print sensor provides high reliability in harsh environments with excessive sunlight. To provide this capability, multiple wafer-scale optical sensors are digitally stitched together using pixel subtraction. The resulting image is engineered to exceed 500 dots per inch (dpi). Power is provided by batteries or drawn parasitically from another source using the USB protocol. The format complies with EFTS, EBTS, NIST, ISO, and ITL 1-2007.
Figure 46 illustrates the traditional optical approach used by other sensors. This approach is likewise based on FTIR (frustrated total internal reflection). In the figure, the finger contacts a prism and scatters the light. The camera captures the scattered light. The ridges of the print on the finger appear as dark lines, while the valleys of the fingerprint appear as bright lines.
Figure 47 illustrates the approach used by the mosaic sensor 4700. The mosaic sensor also uses FTIR. However, the plate is illuminated from the side, and the internal reflections are contained within the sensor plate. As shown at the top of the figure, the ridges of the fingerprint being imaged contact the prism and scatter the light, allowing the camera to capture the scattered light. The ridges on the finger appear as bright lines, and the valleys as dark lines.
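The contrast polarity difference between the two FTIR geometries (Figure 46 versus Figure 47) can be captured in a tiny model: contact with the platen either removes light from the camera path (traditional prism) or scatters light into it (side-lit mosaic sensor). The intensity values below are illustrative assumptions used only to show the inversion.

```python
# Sketch: the two FTIR imaging polarities described for Figures 46 and 47.
# Traditional prism approach: ridges image dark, valleys bright.
# Side-illuminated mosaic sensor: ridges image bright, valleys dark.
# Intensity values are illustrative, not measured.

def ftir_pixel(in_contact, mosaic=True):
    """Normalized intensity for one pixel of the print image.

    in_contact: True where a ridge touches the platen, False over a valley.
    mosaic:     True for the side-illuminated mosaic sensor,
                False for the traditional prism method.
    """
    bright, dark = 1.0, 0.1
    if mosaic:
        return bright if in_contact else dark
    return dark if in_contact else bright

ridge, valley = True, False
print(ftir_pixel(ridge, mosaic=True))   # 1.0 (mosaic: ridge is bright)
print(ftir_pixel(ridge, mosaic=False))  # 0.1 (prism: ridge is dark)
```

Downstream matching software normally normalizes this polarity, so either geometry yields the same ridge map after inversion.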
Figure 48 depicts the layout of the mosaic sensor 4800. An LED array is arranged around the perimeter of the plate. Under the plate are the cameras for capturing the fingerprint image. The image is captured on this bottom plate (called the capture plane). The capture plane is parallel to the sensor plane on which the finger is placed. The thickness of the plate and the number of cameras and LEDs can vary with the size of the plate's active capture region. The thickness of the plate can be reduced by adding mirrors that fold the cameras' optical paths, thereby reducing the required thickness. Each camera should cover one inch of space, with some pixel overlap between cameras. This allows the mosaic sensor to achieve 500 ppi. The cameras can have a 60-degree field of view; however, the images may exhibit significant distortion.
Figure 49 shows the camera fields of view of the multiple cameras used in the mosaic sensor and their interaction 4900. Each camera covers a small capture region. This region depends on the camera's field of view and the distance between the camera and the top surface of the plate. α is half the camera's horizontal field of view, and β is half the camera's vertical field of view.
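The capture-region geometry implied by Figure 49 follows directly from the half-angles: a camera a distance d below the plate surface covers a region 2·d·tan(α) wide and 2·d·tan(β) tall. The numeric values in this sketch (camera distance, half-angles) are illustrative assumptions chosen to match the 60-degree field of view mentioned for Figure 48.

```python
# Sketch: the per-camera capture region from Figure 49, where alpha and
# beta are half the horizontal and vertical fields of view. Distances
# here are illustrative assumptions, in inches.
import math

def capture_region(d, alpha_deg, beta_deg):
    """Width and height of the region imaged on a plate a distance d away."""
    w = 2 * d * math.tan(math.radians(alpha_deg))
    h = 2 * d * math.tan(math.radians(beta_deg))
    return w, h

# A 60-degree-FOV camera (alpha = beta = 30 degrees) half an inch
# below the plate surface:
w, h = capture_region(0.5, 30, 30)
print(round(w, 3))  # 0.577 (inches of coverage)
```

This is why folding the optical path with mirrors, or accepting per-camera overlap, is needed to cover a one-inch tile per camera in a thin plate.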
The mosaic sensor can be incorporated into the bio-phone and tactical computer illustrated in Figure 50. The bio-phone and tactical computer uses a complete mobile computing architecture, incorporating a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, wireless LAN (per 802.11a/b/g/n), Bluetooth 3.0, and a GPS receiver. The bio-phone and tactical computer delivers capability comparable to a standard laptop in a phone-sized package.
Figure 50 illustrates the components of the bio-phone and tactical computer. The bio-phone and tactical computer assembly 5000 provides a display screen 5001, speaker 5002, and keyboard 5003 housed in case 5004. These elements are visible on the front of the bio-phone and tactical computer assembly 5000. Located on the back of the assembly are a camera 5005 for iris imaging, a camera 5006 for facial imaging and video recording, and a bio-print fingerprint sensor 5009.
To provide secure communication and data transmission, the device incorporates selectable 256-bit AES encryption and COTS sensors, and runs biometric pre-qualification software for POI acquisition. The software pre-qualifies captures for matching and filing by any approved biometric matching software, for sending and receiving secure "perishable" voice, video, and data communications. In addition, the bio-phone supports the Windows Mobile, Linux, and Android operating systems.
The bio-phone is a 3G-4G-enabled handheld device for reaching web portals and the Biometrics Enabled Watch List (BEWL) databases. These databases allow field comparison of captured biometric images and data. The device is designed to fit a standard LBV or pocket. In each embodiment, the bio-phone and tactical computer can use a mobile computing architecture featuring a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, wireless LAN (802.11a/b/g/n), Bluetooth 3.0, operation on secure and civilian networks, a GPS receiver, a WVGA sunlight-readable capacitive touch display, stereo 3D video output, a backlit tactile QWERTY keyboard, on-board storage, support for multiple operating systems, and the like, providing laptop-class capability in a lightweight design.
The bio-phone can search, collect, enroll, and verify multiple types of biometric data, including face, iris, two-finger fingerprints, and biographical data. The device also records video, voice, gait, identifying marks, and pocket litter. Pocket litter comprises the various small items normally carried in a pocket, wallet, or pouch, and can include items such as spare change, identification cards, passports, and credit cards. Figure 52 shows a typical collection of such information. Depicted in Figure 52 is an example of a collection 5200 of pocket litter. The types of items that may be included are personal documents and photos 5201, books 5202, notebooks and paper 5203, and documents such as a passport 5204.
The bio-phone and tactical computer can include cameras capable of biometric data collection and video conferencing, such as high-definition still and video cameras. In each embodiment, the eyepiece camera can be used together with the bio-phone and tactical computer with video-conferencing capability as described herein. For example, a camera integrated into the eyepiece can capture images and convey them to the bio-phone and tactical computer, and vice versa. Data is exchanged between the eyepiece and the bio-phone, network connections can be established or shared, and so on. In addition, the bio-phone and tactical computer can be fully ruggedized in a military-grade housing, tolerant of military temperature ranges, waterproof (such as to a depth of 5 meters), and the like.
Figure 51 illustrates an embodiment 5100 of capturing latent fingerprints and palm prints with the bio-phone. Using active illumination from ultraviolet diodes, latent fingerprints and palm prints are captured at 1000 dpi with an overlaid scale. Both fingerprints and palm prints 5100 can be captured with the bio-phone.
Using the GPS capability, data collected by the bio-phone is automatically geo-located and date/time stamped. Data can be uploaded or downloaded and compared against on-board or networked databases. These data transfers are facilitated by the device's 3G-4G, wireless LAN, and Bluetooth capabilities. Data entry can be accomplished with the QWERTY keyboard or by other available means, such as a stylus or touch screen. The biometric data is archived after a particular image collection is taken. Manual input allows partial data capture. Figure 53 illustrates the interaction 5300 between digital dossier images held at a database and a biometric watch list. The biometric watch list is used to compare data captured in the field against previously captured data.
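The automatic geo-location and date/time stamping described above can be sketched as wrapping each capture in a tagged record before it is archived or uploaded. The field names and record layout below are illustrative assumptions, not the EFTS/EBTS record structure; they show only the tagging step itself.

```python
# Sketch: geo-locating and timestamping a captured biometric record,
# as described above. Field names are assumptions for illustration
# and do not reflect any standard (EFTS/EBTS) record layout.
from datetime import datetime, timezone

def tag_record(payload, lat, lon):
    """Wrap captured data with a UTC timestamp and GPS position."""
    return {
        "payload": payload,
        "lat": lat,
        "lon": lon,
        "utc": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    }

rec = tag_record(b"fingerprint-image-bytes", 33.2385, 44.3661)
print(sorted(rec))  # ['lat', 'lon', 'payload', 'utc']
```

In practice the tagged record would then be serialized into a standards-compliant container before transmission to an on-board or networked database.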
Formatting the biometric data in EFTS, EBTS, NIST, ISO, and ITL 1-2007 formats provides compatibility with a range of diverse databases.
The specifications of the bio-phone and tactical computer are provided below:
Operating temperature: -22 °C to +70 °C
Connectivity I/O: 3G, 4G, WLAN a/b/g/n, Bluetooth 3.0, GPS, FM
Connectivity output: USB 2.0, HDMI, Ethernet
Physical dimensions: 6.875" (H) × 4.875" (W) × 1.2" (D)
Weight: 1.75 lbs.
Processor: dual-core 1GHz processor, 600MHz DSP, and a 3D graphics accelerator rated at 30M polygons per second
Display: 3.8" WVGA (800 × 480) sunlight-readable, transflective, capacitive touch screen; scalable display output for driving three 1080p high-definition screens simultaneously.
Operating systems: Windows Mobile, Linux SE, Android
Storage: 128GB solid-state drive
Additional storage: two SD card slots for an additional 128GB of storage.
Memory: 4GB RAM
Cameras: 3 high-definition still and video cameras: face, iris, and conferencing (user's face)
3D support: can output stereo 3D video.
Camera sensor support: sensor dynamic-range extension, adaptive defect-pixel correction, advanced sharpness enhancement, geometric distortion correction, advanced management, hardware (HW)-based face detection, video stabilization
Biometrics: on-board optical two-fingerprint sensor; face, DOMEX, and iris cameras
Sensors: accelerometer, compass, ambient-light sensor, proximity sensor, barometric sensor, and temperature sensor can be added as required.
Battery: less than 8 hours, 1400mAh, rechargeable lithium-ion, hot-swappable battery pack.
Power: various power options for continuous operation.
Software features: face/gesture detection, noise filtering, pixel correction.
A powerful display processor with overlay, rotation, and resizing capability.
Audio: on-board microphone, speakers, and audio/video input.
Keyboard: full tactile QWERTY keyboard with adjustable backlighting.
Additional devices can also incorporate the mosaic sensor and can work together with the bio-phone and tactical computer in a kit, providing a complete field solution for collecting biometric data.
One such device, the pocket bio-kit, is illustrated in Figure 54. The components of the pocket bio-kit 5400 include a GPS antenna 5401, a bio-print sensor 5402, and a keyboard 5404, housed in case 5403. The specifications of the bio-kit are provided below:
Size: 6" × 3" × 1.5"
Weight: 2 lbs. total
Processor and memory: 1GHz OMAP processor
650MHz core
3D accelerator processing up to 18 million polygons per second
64KB L2 cache
166MHz, 32-bit FSB
1GB embedded PoP memory, expandable to up to 4GB NAND
64GB solid-state hard drive
Display: 75mm × 50mm, 640 × 480 (VGA) sunlight-readable LCD; anti-glare, anti-reflective, anti-scratch screen treatment
Interface: USB 2.0
10/100/1000 Ethernet
Power: battery operated; approximately 8 hours of continuous enrollment at about 5 minutes per enrollment.
Embedded capabilities: mosaic sensor optical fingerprint reader
Digital iris camera with active IR illumination
Digital face and DOMEX camera (visible) with flash
Fast-lock GPS
Each feature of the bio-phone and tactical computer can also be provided in the bio-kit, a biometric data collection system folded into a rugged, compact case. Data is collected using standard biometric image and data formats and can be cross-referenced against Department of Defense authoritative biometric databases for near-real-time data communication.
The pocket bio-kit shown in Figure 55 can use active illumination from ultraviolet diodes to capture latent fingerprints and palm prints at 1000 dpi with an overlaid scale. The bio-kit has a 32GB memory card and can interoperate with combat radios or computers to upload and download data under real-time battlefield conditions. Power is provided by lithium-ion batteries. The components of the bio-kit assembly 5500 include a GPS antenna 5501, a bio-print sensor 5502, and a case 5503 with base 5505.
Biometric data collection enables monitoring and tracking of individual movements through geo-location. With the bio-kit, fingerprints and palm prints, iris images, facial images, latent fingerprints, and video can be collected and enrolled in a database. Algorithms for fingerprints and palm prints, iris images, and facial images facilitate these types of data collection. To help capture iris images and latent fingerprint images, the bio-kit has IR and UV diodes that actively illuminate the iris or latent fingerprint. In addition, the pocket bio-kit is fully EFTS/EBTS compliant, including ITL 1-2007 and WSQ. The bio-kit meets MIL-STD-810 for operation in extreme environments and uses the Linux operating system.
To capture images, the bio-kit uses a high-dynamic-range camera with wavefront coding for maximum depth of field, ensuring that the detail in latent fingerprints and iris images is captured. Once captured, real-time image enhancement software and image stabilization improve readability and provide superior visual discrimination.
The bio-kit can also record video and store full-motion (30 fps) color video in an on-board "camera on a chip."
The eyepiece 100 can interface with a mobile folding biometric enrollment kit (i.e., bio-kit) 5500, a biometric data collection system folded into a compact rugged case that opens into a compact workstation for biometric data such as fingerprint, iris, and face recognition and latent fingerprints, as described herein. As with other mobile biometric devices, the mobile folding biometric enrollment kit 5500 can be used as a stand-alone device or in association with the eyepiece 100, as described herein. In one embodiment, the mobile folding biometric enrollment kit can fold to a small size, such as 6" × 3" × 1.5", with a weight of, for example, 2 pounds. It can include a processor, digital signal processor, 3D accelerator, front-side bus (FSB), solid-state memory (e.g., package-on-package (PoP)), hard disk drive, display (e.g., a 75mm × 50mm, 640 × 480 (VGA) sunlight-readable LCD with anti-glare, anti-reflective, anti-scratch screen), USB, Ethernet, embedded battery, mosaic optical fingerprint reader, digital iris camera (e.g., with active IR illumination), digital face and DOMEX camera with flash, fast-lock GPS, and the like. Data can be collected using standard biometric image and data formats and cross-referenced against Department of Defense authoritative biometric databases for near-real-time data communication. The device may be able to collect a person of interest's biometric data and geographic position for monitoring and tracking, with wireless data upload/download by using combat radios or computers having standard networking interfaces, and the like.
In addition to the bio-kit, the mosaic sensor can be incorporated into the wrist-mounted fingerprint, palm print, geo-location, and POI enrollment device shown in Figure 56. The eyepiece 100 can interface with the biometric device 5600, which straps onto a soldier's wrist or arm and is a folding biometric data collection system that can open for biometric data collection such as fingerprints, iris recognition, computing, and the like as described herein. The device can have an integrated computer, keyboard, sunlight-readable display, biometric sensing platen, and the like, so the operator can rapidly store data or compare it locally or remotely for collection and identification purposes. For example, the arm-strap biometric-sensing platen can scan palm prints, fingerprints, and so on. The device can provide geo-location tagging of persons of interest, with collected data stamped with time, date, location, and the like. As with other mobile biometric devices, the biometric device 5600 can be used as a stand-alone device or in association with the eyepiece 100, as described herein. In one embodiment, the biometric device can be small and light so that it can be worn comfortably on a soldier's arm, such as with a 5" × 2.5" active fingerprint and palm print sensor and a weight of 16 ounces. Algorithms for fingerprint and palm capture can be provided. The device can include a processor, digital signal processor, transceiver, QWERTY keyboard, weather-resistant pressure-actuated print sensor, sunlight-readable transflective QVGA color backlit LCD display, internal power supply, and the like.
In one embodiment, the wrist-mounted assembly 5600 includes the following elements in case 5601: a strap 5602 with settings and on/off buttons 5603, a sensor protective cover 5604, a pressure-actuated sensor 4405, and a keyboard and LCD screen 5606.
The fingerprint, palm print, geo-location, and POI enrollment device includes an integrated computer, QWERTY keyboard, and display. The display is designed for easy operation in strong sunlight and alerts the operator with the LCD screen or LED indicators when a fingerprint or palm print has been successfully captured. The display improves readability with a transflective QVGA color backlit LCD screen. The device is lightweight and compact, weighing 16 ounces, with the mosaic sensor measuring 5" × 2.5". This compact size and weight allows the device to slip into an LBV pocket or strap onto the user's forearm, as shown in Figure 56. As with other devices incorporating the mosaic sensor, all POIs are tagged with geo-location information when captured.
The size of the sensor screen allows capture of ten fingers, palms, four-finger slaps, and fingertips. The sensor incorporates a large pressure-actuated print sensor for rapid enrollment at 500 dpi under any weather condition, as specified by MIL-STD-810. Software algorithms support both fingerprint and palm-print capture modes, and device management is performed with the Linux operating system. Capture is rapid thanks to a 720MHz processor with a 533MHz DSP. This processing power delivers correctly formatted images to any existing approved system software. In addition, the device is fully EFTS/EBTS compliant, including ITL 1-2007 and WSQ.
As with other mosaic sensor devices, a removable UWB wireless 256-bit AES transceiver makes wireless communication possible. This also provides secure upload and download between the device and external biometric databases.
Power is provided by lithium-polymer or AA alkaline batteries.
The wrist-mounted device described above can also be used together with other devices, including the augmented reality eyepiece with data and video display shown in Figure 57. Assembly 5700 comprises the following components: eyepiece 5702 and bio-print sensor device 5700. The augmented reality eyepiece provides redundant, binocular, stereo sensors and display, and provides viewing capability under lighting conditions ranging from the glaring midday sun to extremely low light levels at night. Operation of the eyepiece is simple, using a rotary switch located on the eyepiece's temple, and the user can access data from a forearm-mounted computer, sensor, or laptop device. The eyepiece also provides omnidirectional earbuds for hearing protection and improved hearing. A noise-canceling boom microphone can also be integrated into the eyepiece to provide clearer communication of voice commands.
The eyepiece can communicate wirelessly with the bio-phone sensor and the forearm-mounted device using 256-bit AES-encrypted UWB. This also allows the device to communicate with a laptop or combat radio, and to connect over networks to a CP, TOC, and biometric databases. The eyepiece is compatible with ABIS, EBTS, EFTS, and JPEG 2000.
Similar to the other mosaic sensor devices described above, the eyepiece provides highly accurate geo-location of POIs using networked GPS and an RF filter array.
In operation, the low-profile, forearm-mounted computer and tactical display integrates face, iris, fingerprint, palm-print, and fingertip collection and identification. The device also records video, voice, gait, and other distinguishing characteristics. Face and iris tracking is automatic, allowing the device to help identify non-cooperative POIs. With the transparent display provided by the eyepiece, the operator can also view sensor imagery, moving maps, overlaid navigation applications, targeting information, positions or other information on individuals or other targets/POIs from UAVs and the like, sensor data, and the biometric data being captured.
Figure 58 illustrates a further embodiment of the fingerprint, palm print, geo-location, and POI enrollment device. The device weighs 16 ounces (about 450 grams) and uses a 5" × 2.5" active capacitive fingerprint and palm-print sensor. The sensor can enroll ten fingers, palms, four-finger slaps, and fingertip prints at 500 dpi. A 0.6–1GHz processor with a 430MHz DSP provides rapid enrollment and data capture. The hardware is compatible with ABIS, EBTS, EFTS, and JPEG 2000, and features networked GPS for highly accurate location of persons of interest. In addition, the device communicates with laptops or combat radios over 256-bit AES-encrypted UWB. Database information can also be stored on the device, allowing field comparisons without uploading information. This on-board data can also be shared wirelessly with other devices, such as a laptop or combat radio.
A further embodiment of the wrist-mounted bio-print sensor assembly 5800 includes the following elements: bio-print sensor 5801, wrist strap 5802, keyboard 5803, and combat radio connector interface 5804.
Data can be stored on the forearm device, since the device can use military-connector (Mil-con) data storage caps to increase memory capacity. Data entry is performed on the QWERTY keyboard and can be done while wearing gloves.
The display is a transflective QVGA color backlit LCD display designed to be readable in sunlight. Beyond operation in strong sunlight, the device can operate across a wide range of environments, since it meets the MIL-STD-810 service requirements for extreme environments.
The mosaic sensor described above can also be incorporated into a mobile, folding biometric enrollment kit, as shown in Figure 59. The mobile folding biometric enrollment kit 5900 folds on itself and is sized to fit in a tactical vest pocket, measuring 8 × 12 × 4 inches when folded.
Figure 60 illustrates an embodiment 6000 of how the eyepiece and a forearm-mounted device may interface to provide a complete system for biometric data collection.
Figure 61 provides a system diagram 6100 for the mobile folding biometric enrollment kit.
In operation, the mobile folding biometric enrollment kit allows a user to search, collect, identify, verify, and enroll a subject's facial, iris, palm print, fingertip, and biographic data, and can also record voice samples, pocket litter, and other visible identifying marks. Once collected, the data is automatically geo-located and stamped with the date and time. The collected data can be searched and compared against onboard and networked databases. For communication with databases not on the device, wireless data upload/download is provided using a combat radio or laptop with standard networking interfaces. Formatting complies with EFTS, EBTS, NIST, ISO, and ITL 1-2007. Pre-qualified images can be sent directly to matching software, since the device can work with any matching and enrollment software.
In combination, the devices and systems described above provide a comprehensive solution for mobile biometric data collection, identification, and situational awareness. The devices can collect fingerprint, palm print, fingertip, face, iris, voice, and video data, allowing uncooperative persons of interest (POIs) to be identified. Video is captured with high-speed video, enabling capture in unstable situations, such as from moving video. The captured information can be easily shared, and additional data entered via the keyboard. Furthermore, all data is tagged with date, time, and geo-location. This facilitates rapid dissemination of the information necessary for situational awareness in a potentially volatile environment. With more personnel equipped with these devices, additional data collection is possible, proving the concept of "every soldier is a sensor." Sharing is facilitated by integrating the biometric devices with combat radios and battlefield computers.
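The tagging scheme described above, in which every capture is stamped with date, time, and geo-location, can be sketched as a simple record structure. This is an illustration only; the field names and modality strings are assumptions, not the device's actual data format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BiometricRecord:
    """One captured sample (fingerprint, iris, voice, ...) plus its tags."""
    modality: str        # e.g. "fingerprint", "iris", "voice" (illustrative)
    payload: bytes       # raw sensor data
    lat: float           # degrees, as reported by the device GPS
    lon: float
    captured_at: str = field(default="")

    def __post_init__(self):
        # Stamp with the UTC date/time of capture if none was supplied.
        if not self.captured_at:
            self.captured_at = datetime.now(timezone.utc).isoformat()

rec = BiometricRecord("fingerprint", b"\x00\x01", lat=34.05, lon=-118.25)
```

A record built this way carries everything needed for the date/time/location tagging described in the text, regardless of which database it is later compared against.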
In embodiments, the eyepiece may utilize flexible thin-film sensors, such as integrated in the eyepiece itself, in an external device interfaced with the eyepiece, and the like. A thin-film sensor may comprise a thin, multilayer electromechanical configuration that produces an electrical signal when subjected to a sudden contact force or a continuously varying force. Typical applications of electromechanical thin-film sensors employ both on-off electrical switch sensing and force- and time-resolved sensing. Thin-film sensors may comprise switches, force gauges, and the like, where the sensor may rely on effects such as sudden electrical contact under force (switch closure), gradual change of electrical impedance under the action of force, gradual discharge of electric charge, generation of a progressive electromotive force across a conductor moving in a magnetic field, and so on. For example, flexible thin-film sensors may be used in force-compression sensors with microscopic force-sensitive pixels for a two-dimensional force sensor array. This may be useful for the touch screens of computers, smart phones, notebooks, and MP3-like devices, especially those with military applications; for screens controlling anything under computer control, including unmanned aerial vehicles (UAVs), drones, mobile robots, and exoskeleton-based devices; and so forth. Thin-film sensors may be useful in security applications, such as for detecting intrusion or the opening or closing of equipment, windows, or devices, in remote or local sensors, and the like. Thin-film sensors may be useful for tripwire detection, such as the electronics used together with a radio in a silent, remote tripwire detector. Thin-film sensors may be used for open-close detection, such as force sensors for detecting stress-strain in vehicle compartments, ship hulls, aircraft decks, and the like. Thin-film sensors may be used as biometric sensors, such as when taking fingerprints, palm prints, ridge prints, and so on. Thin-film sensors may be useful for leak detection, such as detecting leaking tanks, storage facilities, and the like. Thin-film sensors may be useful in medical sensors, such as when detecting liquid or blood outside the body. These sensor applications are intended as illustrations of the many applications in which thin-film sensors may be employed in connection with the control and monitoring of external devices by the eyepiece, and are not intended to be limiting in any way.
Figure 62 illustrates an embodiment 6200 of a thin-film finger and palm print collection device. The device can record four-finger slap prints, rolled fingerprints (rolls), and palm prints to the NIST standard. Fingerprint images of outstanding quality can be captured with wet or dry hands. Compared with other large sensors, the device is reduced in weight and power consumption. In addition, the sensor is self-contained and hot-swappable. The configuration of the sensor can be altered to suit various needs, and the sensor can be manufactured in a variety of shapes and sizes.
Figure 63 depicts an embodiment 6300 of a finger, palm, and enrollment data collection device. The device records fingertips, rolls, slaps, and palm prints. A built-in QWERTY keyboard allows entry of written enrollment data. As with the devices described above, all data is tagged with the date, time, and geo-location of collection. An onboard database provides onboard matching of potential persons of interest against the database. Matching against other databases may also be performed over the battlefield network. The device may integrate with the eyepiece-supported optical collection of biometric identification information described above for face and iris recognition.
Specifications for the finger, palm, and enrollment device are provided below:
Weight and size: 16 ounces; forearm strap or insertion into an LBV pocket
5" × 2.5" fingerprint/palm print sensor
5.75" × 2.75" QWERTY keyboard
3.5" × 2.25" LCD display
One-handed operation
Environment: operates in all weather conditions, −20 °C to +70 °C
Waterproof: 1 meter for 4 hours with no degradation in performance
Biometric collection: fingerprint and palm print collection and identification
Keyboard and LCD display for enrollment of persons of interest
Retains more than 30,000 complete template portfolios (biometric information of 2 irises, 10 fingerprints, facial image, 35 fields) for onboard matching against persons of interest
All collected biometric data is tagged with time, date, and location
Pressure-capacitive fingerprint/palm print sensor
30 fps high-contrast bitmap images
1000 dpi
Wireless: fully interoperable with battlefield radios, handhelds, or laptops, with 256-bit AES encryption
Battery: two 2000 mAh lithium-polymer batteries
Greater than 12 hours of operation; 15-second quick battery change
Processing and memory: 256 MB flash and 128 MB SDRAM, supporting 3 SD cards of up to 32 GB each
600 MHz–1 GHz ARM Cortex-A8 processor
1 GB RAM
Use of the devices incorporating the sensors for collecting biometric data is depicted in Figures 64-66. Figure 64 shows an embodiment 6400 of a two-stage palm print capture. Figure 65 illustrates collection 6500 using fingertip taps. Figure 66 illustrates an embodiment 6600 of collecting slap and roll prints.
The discussion above relates to methods of collecting biometric data using a platen or touch screen to obtain fingerprints or palm prints, as shown in Figures 62-66. The disclosure also includes methods and systems for touchless or contactless printing using polarized light. In one embodiment, a fingerprint may be acquired by illuminating a person with a polarized light source and retrieving fingerprint images from the reflected polarized light in two planes. In another embodiment, a fingerprint may also be acquired using a light source and multispectral processing to retrieve the fingerprint images, such as by using two imagers at two different locations with different inputs. The different inputs may result from using different filters or different sensors/imagers. Applications of this technique may include biometric checks of unknown persons or subjects in situations where the safety of the person performing the check may be at issue.
In this method, an unknown person or subject may approach a checkpoint, for example, in order to be allowed to proceed to his or her destination. As depicted in the system 6700 shown in Figure 67, the person P and an appropriate body part, such as a hand, palm, or other part, are illuminated by a polarized light source 6701. As is known to those skilled in the optical arts, the polarized light source may simply be a lamp or other light source with a polarizing filter, emitting light polarized in one plane. The light travels to the person, who is positioned in an area designated for contactless printing, so that the polarized light is incident on the person P's fingers or other body part. The incident polarized light then reflects from the fingers or other body part and travels outward in all directions. Two imagers or cameras 6704 receive the reflected light after it has passed through optical elements such as a lens 6702 and a polarizing filter 6703. The cameras or imagers may be mounted on the augmented reality glasses, as discussed above with respect to Fig. 8F.
The light then passes from the palm or finger(s) of the person of interest through different polarizing filters 6704a, 6704b, and then to the imagers or cameras 6705. The light passing through the polarizing filters may differ in orientation by 90° (horizontal and vertical) or by another angle, such as 30°, 45°, 60°, or 120°. The cameras may be digital cameras with suitable digital imaging sensors to convert the incident light into appropriate signals. The signals are then processed by suitable processing circuitry 6706, such as digital signal processors. The signals may then be combined in a conventional manner, such as by a digital microprocessor 6707 with memory. The digital processor with appropriate memory is programmed to produce data suitable for an image of the palm, fingerprint, or other desired image. The digital data from the imagers is then combined in this process, for example using the techniques of U.S. Pat. No. 6,249,616 or others. As described above in this disclosure, the combined "image" may then be checked against a database to determine the identity of the person. The augmented reality glasses may include such a database in memory, or the signal data may be referred elsewhere 6708 for comparison and checking.
A process for obtaining a contactless fingerprint, palm print, or other biometric print is disclosed in the flowchart of Figure 68. In one embodiment, a polarized light source is provided 6801. In a second step 6802, the person of interest and the selected body part are positioned for illumination by the light. In another embodiment, incident white light may be used rather than a polarized light source. When the image is ready to be taken, light is reflected 6803 from the person to two cameras or imagers. A polarizing filter is placed in front of each of the two cameras, so that the light received by the cameras is polarized 6804 in two different planes, such as horizontal and vertical planes. Each camera then detects 6805 the polarized light. The cameras or other sensors then convert the incident light into signals or data 6806 suitable for preparing images. Finally, the images are combined 6807 to form a very distinct, reliable print. The result is a very high-quality image that can be compared against digital databases to identify the person and to detect persons of interest.
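As an illustration only, the final combination step above might fuse the two differently polarized captures per pixel. The simple fusion below (mean intensity blended with a contrast-stretched difference channel, which carries the polarization-dependent ridge detail) is an assumed stand-in for the referenced combination techniques, not the patented method itself:

```python
import numpy as np

def combine_polarized(img_h: np.ndarray, img_v: np.ndarray) -> np.ndarray:
    """Fuse horizontally and vertically polarized captures into one print image.

    Both inputs are float grayscale arrays of identical shape with values in
    [0, 1]; they are assumed to be registered (same viewpoint and scale).
    """
    if img_h.shape != img_v.shape:
        raise ValueError("polarized captures must be registered to the same shape")
    mean = (img_h + img_v) / 2.0
    diff = img_h - img_v
    span = np.ptp(diff)                      # stretch difference to full range
    diff = (diff - diff.min()) / span if span > 0 else np.zeros_like(diff)
    return np.clip(0.5 * mean + 0.5 * diff, 0.0, 1.0)

h = np.array([[0.2, 0.8], [0.4, 0.6]])       # toy "horizontal" capture
v = np.array([[0.6, 0.4], [0.4, 0.2]])       # toy "vertical" capture
fused = combine_polarized(h, v)
```

In a real system the registration of the two camera views would dominate the effort; the fusion itself is the cheap part.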
It should be understood that while digital cameras are used in this contactless system, other imagers may be used, such as active pixel imagers, CMOS imagers, imagers that image in multiple wavelengths, CCD cameras, photodetector arrays, TFT imagers, and so forth. It should also be understood that while polarized light is used here to create two different images, other variations in the reflected light may also be used. For example, rather than polarized light, white light may be used with different filters applied to the imagers, such as a Bayer filter, a CYGM filter, or an RGBE filter. In other embodiments, the polarized light source may be dispensed with, using natural or white light instead.
Touchless or contactless printing has been under development for some time, as evidenced by earlier systems. For example, U.S. Pat. Appl. 2002/0106115 used polarized light in a contactless system, but required a metallic coating on the fingers of the person being printed. Later systems, such as those described in U.S. Pat. No. 7,651,594 and U.S. Pat. Appl. Publ. No. 2008/0219522, required contact with a platen or other surface. The contactless system described herein requires no contact at the time of imaging, nor prior contact such as applying a coating or reflective coating to the body part of interest. Of course, the positions of the imagers or cameras relative to each other should be known for easier processing.
In use, the contactless fingerprint system may be employed at a checkpoint, such as a compound entry, a building entry, a roadside checkpoint, or another convenient location. Such a location may be one at which it is desired to admit some persons and to refuse entry to, or even detain, other persons of interest. In practice, if polarized light is used, the system may utilize an external light source, such as a lamp. The cameras or other imagers used for contactless imaging may be mounted on opposite sides of one pair of augmented reality glasses (for one person). For example, a two-camera version is shown in Fig. 8F, in which two cameras 870 are mounted on the frame 864. In this embodiment, the software for at least processing the images may be contained in the memory of the augmented reality glasses. Alternatively, the digital data from the cameras/imagers may be routed to a nearby data center for appropriate processing. Such processing may include combining the digital data to form an image of the print. The processing may also include checking a database of known persons to determine whether the subject is a person of interest.
Another method of contactless printing uses quantum dot lasers to scan the finger and hand contactlessly in order to detect extremely low concentrations of explosive and narcotic compounds (parts per billion or even parts per trillion). For example, quantum dot or other types of lasers, or laser arrays, may be mounted on the back of the bio-phone or on the frame of the glasses for very close but contactless detection, preventing contamination between subjects. Thus, in addition to the glasses or other accessory devices collecting biometric data relating to iris, fingerprint, face, and voice, explosive or narcotic contamination identification may also be collected.
Alternatively, two persons may each use a single camera, as seen with camera 858 in Fig. 8F. In this configuration, the two persons would stand relatively near each other so that their respective images can be suitably combined by the appropriate software. For example, the two cameras 6705 in Figure 67 may be mounted on two different pairs of augmented reality glasses, such as for two soldiers manning a checkpoint. Alternatively, the cameras may be mounted on walls or fixed positions at the checkpoint itself. The two images may then be combined by a remote processor 6707 with memory, such as a computer system at the building checkpoint.
As discussed above, persons with augmented reality glasses may remain in continuous contact with each other through at least one of many wireless technologies, especially while on duty at a checkpoint. Accordingly, data from the single-camera or two-camera versions may be sent to a data center or other command post for appropriate processing, followed by database checks for matches of palm prints, fingerprints, iris patterns, and so on. The data center may conveniently be located near the checkpoint. With the availability of modern computing and storage, the cost of providing multiple data centers and updating software wirelessly will not be a major cost consideration for these systems.
The touchless or contactless biometric data gathering discussed above may be controlled in several ways, such as by the control techniques discussed elsewhere in this disclosure. For example, in one embodiment, a user may initiate a data-gathering session by pressing a touchpad on the glasses or by giving a voice command. In another embodiment, a user may initiate a session by a hand movement or gesture, or by using any of the control techniques described herein. Any of these techniques may bring up a menu, from which the user may select an option, such as "begin data gathering session," "terminate data gathering session," or "continue session." If a data-gathering session is selected, the computer-controlled menu may then offer choices for the number of cameras, which camera, and so forth, much as a user selects a printer. There may also be modes, such as a polarized-light mode, a color-filter mode, and so on. After each selection, the system may complete a task or offer another choice, as appropriate. User intervention may also be required, such as turning on the polarized light source or other light source, or applying filters or polarizers.
After the fingerprints, palm prints, iris images, or other desired data have been acquired, the menu may then offer selections as to which database to use for comparison, which device to use for storage, and so forth. The touchless or contactless biometric data gathering system may be controlled by any of the methods described herein.
While this system and these sensors have obvious uses in identifying potential persons of interest, there are positive battlefield uses as well. The fingerprint sensor may be used to call up a soldier's medical history, quickly and easily providing immediate information on allergies, blood type, and other time-sensitive, treatment-determining data, thereby allowing proper treatment to be provided under battlefield conditions. This is especially useful for patients who are unconscious at initial treatment and may have lost their identification tags.
A further embodiment of the device for capturing biometric data from individuals may incorporate a server for storing and processing the collected biometric data. The captured biometric data may include a hand image with multiple fingerprints, a palm print, a facial camera image, an iris image, an audio sample of the individual's voice, and video of the individual's gait or movement. The collected data must be accessible to be useful.
Processing of the biometric data may be performed locally or remotely at a separate server. Local processing may offer the option of capturing raw images and audio and making the information available over a WiFi or USB link when requested by a host. Alternatively, another local processing method processes the images and then transmits the processed data over the Internet. This local processing includes the steps of finding the fingerprint, rating the fingerprint, finding the face and then cropping it, finding the iris and then rating it, and similar steps for the audio and video data. While processing the data locally requires more complex code, it does offer the advantage of reduced Internet data transmission.
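The find-crop-rate pipeline described above can be sketched as follows. This is a minimal illustration under stated assumptions: the bounding box stands in for a real face/fingerprint detector, and the standard-deviation "quality score" is a placeholder for a proper NIST-style quality metric:

```python
import numpy as np

def crop(frame: np.ndarray, box: tuple) -> np.ndarray:
    """Crop a detected region (top, left, height, width) from a grayscale frame."""
    t, l, h, w = box
    return frame[t:t + h, l:l + w]

def quality_score(region: np.ndarray) -> float:
    """Toy quality rating: standard deviation as a proxy for ridge/feature contrast."""
    return float(region.std())

def process_locally(frame: np.ndarray, box: tuple, threshold: float = 0.05):
    """Crop the detected region; return it only if it rates well enough to transmit."""
    region = crop(frame, box)
    score = quality_score(region)
    return (region, score) if score >= threshold else (None, score)

frame = np.zeros((8, 8))
frame[2:6, 2:6] = np.linspace(0, 1, 16).reshape(4, 4)   # synthetic "detection"
region, score = process_locally(frame, (2, 2, 4, 4))
```

The point of the sketch is the bandwidth trade-off the text mentions: only the cropped, quality-gated region leaves the device, not the full frame.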
The scanner associated with the biometric data collection device may use a scanner standard, in code compliant with the common USB video device protocol. Other embodiments may use different scanner standards, depending on need.
When transmitting data over a WiFi network, the bio-print devices further described herein may act, or appear to the network, as web servers. Each of the various types of images may be obtained by selecting or clicking a link or button on a web page from a browser client. This web server capability may be part of the bio-print device, specifically included in the microcomputer capability.
The web server may be part of the bio-print micro-host computer, allowing the bio-print device to create a web page that exposes the captured data and also offers some control. Additional embodiments of the browser application may offer controls to capture high-resolution hand prints, facial images, and iris images, to set camera resolution, to set the capture time of audio samples, and also to allow streaming connections via a webcam, Skype, or similar mechanisms. Such a connection may be attached to the audio and face cameras.
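The device-side web server behavior can be sketched with Python's standard library: serve the latest captured images by name and a plain listing otherwise. The endpoint names and the in-memory capture store are invented for illustration; a real device would serve actual JPEG captures:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CAPTURES = {"iris.jpg": b"\xff\xd8fake-jpeg\xff\xd9"}   # stand-in image store

class BioPrintHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        name = self.path.lstrip("/")
        if name in CAPTURES:                     # e.g. GET /iris.jpg
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            self.wfile.write(CAPTURES[name])
        else:                                    # index: list available captures
            body = ", ".join(sorted(CAPTURES)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):                # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), BioPrintHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
data = urllib.request.urlopen(f"http://127.0.0.1:{port}/iris.jpg").read()
server.shutdown()
```

Because the interface is plain HTTP, the same page is reachable whether the link is WiFi or the USB/RNDIS IP interface described below, which is the appeal of exposing the device as a web server.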
A further embodiment offers a browser application that gives access to the captured images and audio via the file transfer protocol (FTP) or other protocols. A further embodiment of the browser application may automatically refresh, repeatedly grabbing preview images at a selectable rate.
An additional embodiment offers local processing of the captured biometric data using the microcomputer, with additional controls to display the captured images for rating, to allow the user to rate the captured faces, to retrieve each print found, and to retrieve the cropped iris images and allow the user to rate each iris print.
Yet another embodiment provides a USB port compatible with the Open Multimedia Applications Platform (OMAP3) system. OMAP3 is a proprietary system-on-chip for portable multimedia applications. The OMAP3 device port is equipped with the Remote Network Driver Interface Specification (RNDIS), a proprietary protocol usable over USB. These systems provide the capability for the bio-print device to appear as an IP interface when plugged into the USB host port of a Windows computer. This IP interface would be the same as over WiFi (a TCP/IP web server). This allows data to be moved off the micro-host computer and provides display of the captured prints.
An application on the microcomputer may implement the above by receiving data from the FPGA over the USB bus. Once received, JPEG content is created. This content may be written to a socket to a server running on a laptop, or written to a file. Alternatively, the server may receive the socket stream, pop the images out, and open them in a window, creating a new window for each biometric capture. If the microcomputer runs the Network File System (NFS, a protocol used with Sun-based systems) or SAMBA (a free-software reimplementation providing file and print services for Windows clients), the captured files may be shared and accessed by any client running NFS or Server Message Block (SMB). In this embodiment, a JPEG viewer would display the files. The display client may include a laptop, the augmented reality glasses, or a phone running the Android platform.
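The socket path described (FPGA to microcomputer, JPEG out over a socket to a laptop-side server) might be sketched with a length-prefixed frame protocol. The 4-byte big-endian length header is an assumption for illustration, not the device's actual wire format; the loopback demo stands in for the device-to-laptop link:

```python
import socket
import struct

def send_jpeg(sock: socket.socket, jpeg: bytes) -> None:
    """Write one capture as a 4-byte big-endian length header plus the JPEG bytes."""
    sock.sendall(struct.pack(">I", len(jpeg)) + jpeg)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_jpeg(sock: socket.socket) -> bytes:
    """Read one length-prefixed JPEG frame; the server opens one window per frame."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

# Loopback demo: the "device" side sends one frame, the "laptop" side receives it.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
device = socket.create_connection(listener.getsockname())
conn, _ = listener.accept()
send_jpeg(device, b"\xff\xd8capture\xff\xd9")
frame = recv_jpeg(conn)
device.close(); conn.close(); listener.close()
```

Length-prefixing keeps frame boundaries unambiguous on a byte stream, so each received frame maps cleanly to one capture window as the text describes.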
An additional embodiment provides a server-side application offering the same services described above.
An alternative embodiment of the server-side application displays the results on the augmented reality glasses.
A further embodiment provides the microcomputer on a removable platform, similar to a mass storage device or streaming camera. The removable platform also incorporates an active USB serial port.
In embodiments, the eyepiece may include audio and/or visual sensors for capturing sound and/or views from 360 degrees around the wearer of the eyepiece. These may be sensors mounted on the eyepiece itself, or coupled to sensors mounted on a vehicle in which the wearer is located. For example, sound sensors and/or cameras may be mounted on the outside of a vehicle, with the sensors communicatively coupled to the eyepiece to provide ambient sound and/or a "view" of the surrounding environment. In addition, the audio system of the eyepiece may provide sound protection, cancellation, enhancement, and the like, helping to improve the wearer's audio quality when the wearer is surrounded by loud or external noise. In one example, the wearer may be coupled to cameras mounted on the vehicle they are driving. The cameras may then communicate with the eyepiece and provide a 360-degree view around the vehicle, such as in a graphical image delivered to the wearer through the eyepiece display.
In one example, and with reference to Figure 69, control aspects of the eyepiece may include a remote control device in the form of a watch controller 6902, such as including a receiver and/or transmitter for interfacing with the eyepiece for messaging and/or control of the eyepiece, such as when the user is not wearing the eyepiece. The watch controller may include a camera, fingerprint scanner, discrete control buttons, a 2D control pad, an LCD screen, a capacitive touch screen for multi-touch control, a vibration motor/piezoelectric actuator for tactile feedback, buttons with tactile feel, Bluetooth, an accelerometer, and the like, such as provided in a control function area 6904 of the watch controller 6902 or in other functional portions 6910. For example, the watch controller may have a standard watch display 6908, but additionally have the capability to control the eyepiece, such as through control functions 6914 in the control function area 6904. The watch controller may display, and/or otherwise notify the user of (e.g., by vibration or audible sound), messages from the eyepiece, such as emails, advertisements, calendar alerts, and the like, and show the content of messages when the user is not currently wearing the eyepiece. Vibration motors, piezoelectric actuators, and the like may provide tactile feedback to a touch screen control interface. The watch receiver may be able to provide virtual buttons and click control in the user interface of the function area 6904, buzz-bump the user's wrist when a message is received, and so on. The communications connection between the eyepiece and the watch receiver may be provided through Bluetooth, WiFi, a cellular network, or any other communications interface known in the art. The watch controller may utilize an embedded camera for video conferencing (as described herein), iris scanning (such as for recording an iris image to be stored in a database, for authentication against a stored pre-existing iris image, etc.), taking photos, video, and the like. The watch controller may have a fingerprint scanner, such as described herein. The watch controller, or any other tactile interface described herein, may measure the user's pulse, such as with a pulse sensor 6912 (which may be located in the watch band, under the watch body, etc.). In embodiments, the eyepiece and other control/tactile interface components may have pulse detection, such that pulse rates from the different control interface components may be monitored in a synchronized manner, such as for health, activity monitoring, authorization, and the like. For example, both the watch controller and the eyepiece may have pulse monitoring, where the eyepiece may sense whether the two are synchronized, whether both match a previously recorded profile (such as for authentication), and so on. Similarly, other biometric information may be used for authentication between multiple control interfaces and the eyepiece, such as fingerprints, iris scans, pulse, health profiles, and the like, where the eyepiece knows whether the same person is wearing both the interface component (e.g., the watch controller) and the eyepiece. Biometric/health information for an individual may be determined by viewing the skin under IR LED illumination, viewing the subsurface pulse, and so forth. In embodiments, multi-device authentication may be used (e.g., as a token for a Bluetooth handshake), such as using sensors on both devices (e.g., fingerprints on the two devices as a hash for the Bluetooth token), and the like.
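One way to realize the synchronized-pulse check sketched above is to compare the beat-to-beat interval sequences reported by the two devices. The windowing and tolerance below are illustrative assumptions, not a specified authentication protocol:

```python
def pulses_match(intervals_a, intervals_b, tolerance_ms: float = 50.0) -> bool:
    """Compare beat-to-beat pulse intervals (ms) from two wearables.

    If the same person wears both devices, the interval sequences should agree
    to within sensor noise; a large mean absolute difference suggests the
    eyepiece and the watch controller are on different people.
    """
    if len(intervals_a) != len(intervals_b) or not intervals_a:
        return False
    diffs = [abs(a - b) for a, b in zip(intervals_a, intervals_b)]
    return sum(diffs) / len(diffs) <= tolerance_ms

same_wearer = pulses_match([820, 815, 830, 825], [818, 816, 828, 827])
different_wearer = pulses_match([820, 815, 830, 825], [600, 610, 605, 615])
```

A production scheme would combine such a liveness/co-presence signal with a stronger factor (fingerprint or iris match) before releasing the Bluetooth token, as the text suggests.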
In one embodiment, the watch controller may have a touch screen, which may be useful for controlling the glasses even when they are not mounted on the user's face (such as when they are in a backpack). The transparent lens of the watch has an OLED display, with a switchable mirror applied to the bottom of the lens. In other embodiments, the watch controller lens may include an electronic mirror or e-ink display. In any case, the lens may cover a standard analog watch movement and include a transparent lens with a switchable mirror or electronic mirror, or an e-ink display, that can be activated to display content. The watch may be used for gesture control, utilizing integrated sensors that detect gestures. The watch may serve as an AR marker, so that when the camera of the glasses identifies the watch, an application may be activated. One such application may use the watch as a physical surface on which a virtual image is overlaid, effectively turning the watch into a touch screen interface.
With reference to Figs. 70A-70D, the eyepiece may be stored in an eyepiece carrying case, such as one including recharging capability, an integrated display, and the like. Figure 70A depicts an embodiment of the case shown closed, with an integrated recharging AC plug and digital display, and Figure 70B shows the same embodiment with the case open. Figure 70C shows another embodiment with the case closed, and Figure 70D shows the same embodiment in the open state, with the digital display showing through the lid. In embodiments, the case may have the capability to recharge the eyepiece while the eyepiece is in the case, such as through an AC connection or a battery (e.g., a rechargeable lithium-ion battery built into the carrying case for charging the eyepiece when away from AC power). Power may be delivered to the eyepiece through a wired or wireless connection, such as through a wireless inductive pad configuration between the case and the eyepiece. In embodiments, the case may include a digital display that communicates with the eyepiece, such as via Bluetooth wireless and the like. The display may provide information about the status of the eyepiece, such as messages received, battery level indication, notifications, and so on.
With reference to Figure 71, the eyepiece 7120 may be used in conjunction with unattended ground sensor units 7102, such as ground sensor units formed as stakes 7104 that can be inserted into the ground 7118 by a person, launched by an RC Goblin, dropped from an aircraft, and so forth. The ground sensor unit 7102 may include a camera 7108, a controller 7110, sensors 7112, and the like. The sensors 7112 may include a magnetic sensor, a sound sensor, a vibration sensor, a thermal sensor, a passive IR sensor, a motion detector, GPS, a real-time clock, and the like, providing monitoring at the location of the ground sensor 7102. The camera 7108 may have a field of view 7114 in azimuth and elevation, such as a full or partial 360-degree camera array in azimuth and ±90 degrees in elevation. The ground sensor unit 7102 may capture event sensor and image data and transmit it over a wireless network connection to the eyepiece 7120. Further, the eyepiece may then transmit the data to external communications facilities 7122, such as a cellular network, a satellite network, a WiFi network, another eyepiece, and so on. In embodiments, the ground sensor units 7102 may relay data from one unit to another, such as from 7102A to 7102B to 7102C. Further, the data may then be relayed to an eyepiece 7120B and on to the communications facility 7122, such as from eyepiece 7120A in a backhaul data network. Data gathered from a ground sensor unit 7102 or an array of ground sensor units may be shared among multiple eyepieces, such as from eyepiece to eyepiece, from communications facility to eyepiece, and the like, so that users of the eyepieces may utilize the shared data in raw form or in post-processed form (e.g., as a graphical display of the data through the eyepiece). In embodiments, the ground sensor units may be inexpensive, disposable, toy-grade, and the like. In embodiments, the ground sensor unit 7102 may provide backup for computer files from the eyepiece 7120.
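The relay chain described above (7102A to 7102B to 7102C, then on to an eyepiece) can be sketched as each unit stamping itself onto a hop list before forwarding. The unit IDs and reading format are invented for illustration, not part of the disclosed system:

```python
def relay(reading: dict, chain: list) -> dict:
    """Forward a sensor reading along a chain of ground sensor units.

    Each hop appends its unit ID, so the receiving eyepiece can see the path
    the data took through the backhaul network.
    """
    for unit_id in chain:
        reading = {**reading, "hops": reading.get("hops", []) + [unit_id]}
    return reading

reading = {"unit": "7102A", "sensor": "passive_ir", "value": 1}
delivered = relay(reading, ["7102A", "7102B", "7102C"])
```

Recording the hop path also makes it cheap for the eyepiece to detect a dead unit: a reading that stops arriving via one branch of the chain localizes the failure.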
With reference to Figure 72, the eyepiece may provide control through facilities internal and external to the eyepiece, such as initiated from the surrounding environment 7202, from input devices 7204, from sensing devices 7208, from user action capture devices 7210, from internal processing facilities 7212, from internal multimedia processing facilities, from internal applications 7214, from cameras 7218, from sensors 7220, from earphones 7222, from the projector 7224, through transceivers 7228, through tactile interfaces 7230, from external computing facilities 7232, from external applications 7234, from events and/or data feeds 7238, from external devices 7240, from third parties 7242, and the like. Command and control modes 7260 of the eyepiece may be initiated by sensed inputs through input devices 7244, user actions 7248, external device interaction 7250, the reception of events and/or data feeds 7252, internal application execution 7254, external application execution 7258, and the like. In embodiments, there may be a series of steps involved in executing control, including at least a combination of two of the following: an event and/or data feed; a sensed input and/or sensing device; user action capture input and/or output; user movements and/or actions for controlling and/or initiating commands; command and/or control modes and interfaces in which the input may be reflected; on-platform applications that may use commands to respond to the input; communications and/or connections from the platform interface to external systems and/or devices; external devices; external applications; feedback 7262 to the user (such as related to external devices or external applications); and the like.
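The multi-step control chain enumerated above (event/data feed, sensed input, user action, command mode, application, feedback) can be pictured as an ordered pipeline of stages that each transform the output of the previous stage. The sketch below is purely illustrative; the stage names and the lambda bodies are assumptions chosen to mirror the sequence in the text, not an implementation from the disclosure:

```python
# Minimal sketch of a command-and-control chain: an event is passed
# through ordered stage functions, and a trace is kept so a feedback
# step can report what happened along the way.

def run_control_chain(event, stages):
    """Pass an event through (name, stage_fn) pairs, collecting a trace."""
    trace = [("event", event)]
    value = event
    for name, stage in stages:
        value = stage(value)
        trace.append((name, value))
    return value, trace

# Illustrative stages: sense -> command -> application -> feedback.
stages = [
    ("sense",    lambda e: {"input": e, "sensor": "motion"}),
    ("command",  lambda s: "activate_display" if s["sensor"] == "motion" else "ignore"),
    ("apply",    lambda c: {"application": "surveillance", "command": c}),
    ("feedback", lambda a: f"user notified: {a['command']}"),
]
```

Keeping the trace alongside the final value reflects the text's point that feedback to the user may concern any of the intermediate facilities (sensors, applications, external devices), not only the final outcome.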
In embodiments, the events and/or data feeds may include email, military-related communications, calendar alerts, security events, safety events, financial events, personal events, requests for input, instructions, entering an activity state, entering a military engagement activity state, entering an environment of a certain type, entering a hostile environment, entering a certain location, and the like, and combinations thereof.
In certain embodiments, the sensed inputs and/or sensing devices may include charge-coupled devices, black silicon sensors, IR sensors, acoustic sensors, induction sensors, motion sensors, optical sensors, opacity sensors, proximity sensors, inductance sensors, eddy-current sensors, passive infrared proximity sensors, radar, capacitive sensors, capacitive displacement sensors, Hall-effect sensors, magnetic sensors, GPS sensors, thermal imaging sensors, thermocouples, thermistors, photoelectric sensors, ultrasonic sensors, infrared laser sensors, inertial motion sensors, MEMS internal motion sensors, ultrasonic 3D motion sensors, accelerometers, inclinometers, force sensors, piezoelectric sensors, rotary encoders, linear encoders, chemical sensors, ozone sensors, smoke sensors, thermal sensors, magnetometers, carbon dioxide detectors, carbon monoxide detectors, oxygen sensors, glucose sensors, smoke detectors, metal detectors, rain sensors, altimeters, GPS, detection of whether the user is outdoors, detection of the environment, detection of activity, object detectors (e.g., for billboards), marker detectors (e.g., geo-location markers for advertising), laser rangefinders, sonar, capacitance, optical response, heart-rate sensors, RF/micropower impulse radio (MIR) sensors, and the like, and combinations thereof.
In embodiments, the user action capture inputs and/or devices may include head tracking systems, cameras, voice recognition systems, body movement sensors (e.g., kinetic sensors), eye-gaze detection systems, tongue touch pads, sip-and-puff systems, joysticks, cursors, mice, touch screens, touch sensors, finger tracking devices, 3D/2D mice, inertial movement tracking, microphones, wearable sensor sets, robotic motion detection systems, optical motion trackers, laser motion trackers, keyboards, virtual keyboards, virtual keyboards on a physical platform, context determination systems, activity determination systems (e.g., in a vehicle, on a plane, walking, exercising, etc.), finger-following cameras, displays in a virtualized hand, sign language systems, trackballs, hand-mounted cameras, sensors located at the temples, sensors located in the glasses, Bluetooth communications, wireless communications, satellite communications, and the like, and combinations thereof.
In embodiments, the user movements or actions for controlling or initiating commands may include head movements, head shaking, nodding, circling the head, forehead twitches, ear movements, eye movements, opening the eyes, closing the eyes, blinking, rolling the eyes, hand movements, clenching a fist, opening a fist, rolling a fist, extending a fist, retracting a fist, voice commands, sipping and puffing through a tube, tongue movements, finger movements, movement of one or more fingers, extending a finger, flexing a finger, retracting a finger, extending the thumb, making a symbol with the fingers, making a symbol with finger and thumb, pointing with the thumb, dragging and dropping with a finger, touching and dragging, touching and dragging with two fingers, wrist movements, circling the wrist, flipping the wrist, arm movements, extending an arm, retracting an arm, a left-turn arm signal, a right-turn arm signal, arms akimbo, extending both arms, leg movements, kicking, extending a leg, bending a leg, jumping jacks, body movements, walking, running, turning left, turning right, turning around, rotating, rotating with both arms raised, rotating with one arm extended, rotating with various hand and arm positions, finger pinch-and-spread movements, finger movements (e.g., virtual typing), swiping, tapping, hip movements, shoulder movements, foot movements, brushing movements, sign language (e.g., ASL), and the like, and combinations thereof.
In embodiments, the command and/or control modes and interfaces in which the input may be reflected may include graphical user interfaces (GUIs), auditory command interfaces, clickable icons, navigable lists, virtual reality interfaces, augmented reality interfaces, heads-up displays, semi-transparent displays, 3D navigation interfaces, command lines, virtual touch screens, robot control interfaces, typing (e.g., using a persistent virtual keyboard locked in place), predictive and/or learning-based user interfaces (e.g., a "training mode" that learns what the wearer does, and when and where they do it), simple command modes (e.g., a gesture that launches a given application), Bluetooth controllers, cursor hold, locking a virtual display, positioning a cursor by moving the head, and the like, and combinations thereof.
In embodiments, the on-eyepiece applications that may use commands and/or respond to input may include military applications, weapons control applications, military targeting applications, war game simulation, hand-to-hand combat simulators, repair manual applications, tactical operations applications, mobile phone applications (e.g., iPhone applications), information processing, fingerprint capture, facial recognition, information display, information transfer, iris capture, entertainment, easily accessed information for pilots, anchoring objects in 3D in the real world, targeting of civilians, targeting of police, education, hands-free tutorial instruction (e.g., in repair, in assembly, in first aid, and the like), navigation assistance for the blind, communications, music, search, advertising, video, computer games, e-books, advertising, shopping, e-commerce, video conferencing, and the like, and combinations thereof.
In embodiments, the communications and/or connections from the eyepiece interface to external systems and devices may include microcontrollers, microprocessors, digital signal processors, steering wheel control interfaces, joystick controllers, motion and sensor resolvers, stepper controllers, audio system controllers, programs that integrate audio and visual signals, application programming interfaces (APIs), graphical user interfaces (GUIs), navigation system controllers, network routers, network controllers, mediation systems, payment systems, gaming devices, pressure sensors, and the like.
In embodiments, the external devices to be controlled may include weapons, weapons control systems, communications systems, bomb detection systems, bomb disposal systems, remotely controlled vehicles, computers (and thus the many devices controllable by computer), cameras, projectors, cell phones, tracking devices, displays (e.g., computer, video, or TV screens), video games, war game simulators, mobile games, pointing or tracking devices, radios or audio systems, rangefinders, audio systems, iPods, smartphones, TVs, entertainment systems, computer-controlled weapons systems, drones, robots, automotive dashboard interfaces, lighting devices (e.g., mood lighting), athletic equipment, gaming platforms (e.g., preloaded to identify the user and what they like to play), vehicles, storage-enabled devices, payment systems, ATMs, POS systems, and the like.
In embodiments, the applications associated with external devices may be military applications, weapons control applications, military targeting applications, war game simulation, hand-to-hand combat simulators, repair manual applications, tactical operations applications, communications, information processing, fingerprint capture, facial recognition, iris capture, entertainment, easily accessed information for pilots, anchoring objects in 3D in the real world, targeting of civilians, targeting of police, education, hands-free tutorial instruction (e.g., in repair, in assembly, in first aid), navigation assistance for the blind, music, search, advertising, video, computer games, e-books, automotive dashboard applications, advertising, military enemy targeting, shopping, e-commerce, and the like, and combinations thereof.
In embodiments, the feedback to the wearer related to external devices and applications may include visual displays, heads-up displays, bullseye or target tracking displays, tone outputs or audible alerts, performance or rating indicators, scores, task completion indications, action indications, content playback, information display, reports, data mining, recommendations, targeted advertisements, and the like.
In an example, the control aspects of the eyepiece may include a combination of the following: a nod of a soldier's head while on the move initiating a silent command (such as during an engagement); a graphical user interface as the mode and/or interface in which the control input is reflected; a military application on the eyepiece that uses commands and/or responds to the control input; an audio system controller for communications and/or connections from the eyepiece interface to external systems or devices; and the like. For instance, a soldier may be controlling a secure communications device through the eyepiece during an engagement and wish to change some aspect of the communication, such as the channel, frequency, encryption level, and the like, without making a sound and with minimal motion, to minimize the chance of being heard or seen. In this example, nods of the soldier's head may be programmed to indicate the change, such as a quick forward nod to indicate the start of a transmission, a quick backward nod to indicate the end of a transmission, and so on. In addition, the eyepiece may be projecting a graphical user interface for the secure communications device to the soldier, such as showing which channels are active, which alternate channels are available, who else in their unit is currently transmitting, and the like. The soldier's nods may then be interpreted as change commands by the processing facility of the eyepiece, the commands passed on to the audio system controller, and the change shown on the graphical user interface for the communications device. Further, certain nods or body motions may be interpreted as specific commands to be transmitted, so that the eyepiece sends a pre-established communication without the soldier having to be audible. That is, the soldier may be able to send pre-arranged communications (for example, agreed with their unit before the engagement) to their unit through body motions. In this way, a soldier wearing and using the eyepiece may be able to connect with and interface to external secure devices in a completely covert manner, maintaining silent communications with their unit during an engagement, even when out of the unit's line of sight. In embodiments, other user movements or actions for controlling or initiating commands, command and/or control modes and interfaces in which the input may be reflected, on-platform applications that may use commands and/or respond to the input, communications or connections from the platform interface to external systems and devices, and the like, as described herein, may also apply.
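The nod-to-command mapping in this example (forward nod starts a transmission, backward nod ends it, other motions map to pre-arranged messages) could be sketched as a simple lookup feeding an audio system controller. The gesture names, command strings, and `AudioSystemController` class below are illustrative assumptions, not part of the disclosure:

```python
# Illustrative mapping of recognized head gestures to silent radio
# commands, with a stand-in controller that records what was issued.

NOD_COMMANDS = {
    "forward_quick":  "start_transmission",
    "backward_quick": "end_transmission",
    "double_left":    "preset_msg_fall_back",   # pre-arranged with the unit
}

class AudioSystemController:
    """Stand-in for the eyepiece-to-radio controller; logs issued commands."""
    def __init__(self):
        self.log = []

    def issue(self, command):
        self.log.append(command)
        return command

def handle_nod(gesture, controller):
    command = NOD_COMMANDS.get(gesture)
    if command is None:
        return None          # unrecognized motion: do nothing, stay silent
    return controller.issue(command)
```

Ignoring unrecognized motions (rather than guessing) matches the covert-operation goal: an accidental head movement should never trigger a transmission.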
In an example, the control aspects of the eyepiece may include a combination of the following: motion and position sensors as the sensed inputs through which the soldier's commands are informed; an augmented reality interface as the input and control interface; motion sensors, a rangefinder, and a weapons system as the external devices to be controlled and from which information is gathered; feedback to the soldier related to the external devices; and the like. For instance, a soldier wearing the eyepiece may be monitoring military movements in an environment with motion sensors, and when a motion sensor is triggered, a target may be projected in the augmented reality interface to help the wearer identify it, such as a person, a vehicle, and the like, for further monitoring and/or targeting. In addition, the rangefinder may be able to determine the distance to the object and feed that information back to the soldier (such as manually, where the soldier performs the targeting and firing action; or automatically, where the weapons system receives the information for targeting and the soldier gives the fire command). In embodiments, the augmented reality interface may provide information about the target to the soldier, such as the position of the object on a 2D- or 3D-projected map, the identity of the target from previously gathered information (e.g., stored in an object database, including facial recognition and object recognition), the coordinates of the target, night-vision imaging of the target, and the like. In embodiments, the triggering of the motion detector may be interpreted as an alert event by the processing facility of the eyepiece, a command may be passed to the rangefinder to determine the position of the object, and an audio alert that a moving object has been sensed in the monitored area may be passed to the speaker of the eyepiece earphone for the soldier. Adding a visual indicator to the soldier's audio alert may serve as input to the soldier that the moving object deserves attention, such as when the object is identified as suspicious, for example when an accessed database finds a known combatant, a known vehicle type, and the like. For example, a soldier may be at a sentry post monitoring the post's perimeter at night. In this case, the environment may be dark, and the soldier may drop into a low state of attention, since it may be late at night and all environmental conditions quiet. The eyepiece can then act as a sentry-augmentation device, "watching" from the soldier's personal vantage point (as opposed to some external surveillance device at the post). When the eyepiece senses movement, the soldier is immediately alerted and directed to the position, distance, identity, and the like of the motion. In this way, the soldier may be able to react to avoid personal danger, target and fire on the moving object, alert the post to the potential danger, and the like. Further, if a fight breaks out, the soldier may have an improved reaction time due to the alert from the eyepiece, make better decisions through the information about the target, and accordingly minimize the danger of being injured or of the post being infiltrated. In embodiments, other sensed inputs and/or sensing devices, command and/or control modes and interfaces in which the input may be reflected, useful external devices to be controlled, feedback related to external devices and/or external applications, and the like, as described herein, may also apply.
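The alert flow just described (motion trigger, rangefinder query, database-informed escalation from audio-only to audio-plus-visual) can be sketched as one small decision function. The field names, the `KNOWN_THREATS` stand-in for the object database, and the classification labels are assumptions for illustration only:

```python
# Hedged sketch of the sentry-alert flow: a motion-detector trigger
# becomes an alert event; the rangefinder is asked for distance; a
# database-style lookup decides whether a visual indicator accompanies
# the audio alert.

KNOWN_THREATS = {"combatant", "known_vehicle"}   # stand-in for the database

def range_to(object_id, ranges):
    """Rangefinder stand-in: look up a measured distance in meters."""
    return ranges.get(object_id)

def on_motion_trigger(object_id, classification, ranges):
    return {
        "audio": f"movement sensed: {object_id}",       # always alert
        "distance_m": range_to(object_id, ranges),
        "visual": classification in KNOWN_THREATS,      # escalate if suspicious
    }
```

The asymmetry (audio always, visual only on a database match) mirrors the text: every sensed movement wakes the low-attention sentry, but only recognized threats add the attention-directing indicator.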
In embodiments, the eyepiece may allow remote control of vehicles such as trucks, robots, drones, helicopters, ships, and the like. For example, a soldier wearing the eyepiece may be able to issue commands for controlling a vehicle through an intercom interface. Vehicle control may be provided through voice commands, body movements (e.g., with the soldier equipped with motion sensors in interactive communication with the eyepiece, so as to control the vehicle through the eyepiece), keyboard interfaces, and the like. In one example, a soldier wearing the eyepiece may provide remote control of a bomb disposal robot or vehicle, where commands are generated by the soldier through the command interfaces of the eyepiece, all as described herein. In another example, the soldier may command an aircraft, such as a remotely piloted target drone, a remote-controlled tactical counter-rotating helicopter, and the like. Again, the soldier may provide control of the remotely controlled aircraft through the control interfaces described herein.
In an example, the control aspects of the eyepiece may include a combination of the following: a wearable sensor set capturing the soldier's motions as input; a robot control interface as the command and control interface in which the input may be reflected; a drone or other robotic device as the external device to be controlled; and the like. For instance, a soldier wearing the eyepiece may be equipped with a sensor set for controlling a military drone, such as using motion sensor inputs to control the drone's motion, using hand recognition to manipulate the drone's control features (e.g., through a graphical user interface displayed through the eyepiece), using voice command input to control the drone, and the like. In embodiments, control of the drone through the eyepiece may include flight control, control of onboard interrogation sensors (e.g., visible cameras, IR cameras, radar), threat avoidance, and the like. The soldier may be able to use the body-mounted sensors to depict the actual battlefield through virtual 2D/3D projected imagery and direct the drone to its intended target, where flight, camera, and surveillance controls are commanded through the soldier's body motions. In this way, the soldier may be able to maintain a personalized, fully visual immersion in the drone's flight and environment, and control it more intuitively. The eyepiece may have a robot control interface for managing and mediating the various control inputs from the soldier's worn sensor set, and for providing an interface for controlling the drone. The drone may then be remotely controlled through the soldier's physical actions, such as through a wireless connection to a military control center responsible for the command and management of the drone. In another similar example, the soldier may control a bomb disposal robot, which may be controlled through the sensor set worn by the soldier and the associated eyepiece robot control interface. For instance, the soldier may be provided a graphical user interface with a 2D or 3D view of the environment around the bomb disposal robot, with the sensor set providing a translation of the soldier's movements (such as of the arms, hands, etc.) into the robot's movements. In this way, the soldier may be provided a remote-control interface to the robot for finer, more sensitive control during a delicate bomb disposal procedure. In embodiments, other user action capture inputs and/or devices, command and/or control modes and interfaces in which the input may be reflected, useful external devices to be controlled, and the like, as described herein, may also apply.
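The "managing and mediating" role of the robot control interface in this example amounts to translating a stream of sensed body gestures into an ordered list of drone or robot commands, discarding anything unrecognized. The gesture vocabulary and command tuples below are invented for the sketch; the disclosure does not specify them:

```python
# Illustrative translation of worn-sensor gestures into robot/drone
# commands, as a robot control interface might mediate them.

GESTURE_TO_COMMAND = {
    "arm_raise":  ("climb", 10),      # meters, assumed units
    "arm_lower":  ("descend", 10),
    "lean_left":  ("yaw_left", 15),   # degrees, assumed units
    "lean_right": ("yaw_right", 15),
}

def mediate(gestures):
    """Turn a stream of sensed gestures into an ordered command list,
    dropping anything not in the recognized vocabulary."""
    return [GESTURE_TO_COMMAND[g] for g in gestures if g in GESTURE_TO_COMMAND]
```

Dropping unrecognized motions is a deliberate design choice for this setting: incidental body movement (walking, adjusting equipment) should produce no command at all rather than an approximate one.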
In an example, the control aspects of the eyepiece may include a combination of the following: an event indication to the soldier when the soldier enters a certain location as the event input; a predictive-learning user interface as the command and control mode and/or interface in which the input may be reflected; a weapons control system as the external device to be controlled; and the like. For instance, the eyepiece may be programmed to learn the soldier's behaviors, such as what the soldier typically does when entering a particular environment with a particular weapons control system, for example whether the wearer turns the system on, arms the system, calls up a visual display for the system, and the like. From this learned behavior, the eyepiece may be able to make predictions about what the soldier wants in terms of eyepiece-controlled functions. For example, a soldier may be thrust into a combat situation and need to use a weapons control system immediately. In this case, the eyepiece may sense the position and/or identity of the weapons system as the soldier approaches it, recall how the soldier typically configures the system when near it and when it is being configured/enabled (such as through the eyepiece's previous learning-mode use of the weapons system), and command the weapons control system to power up configured as it was the last time. In embodiments, the eyepiece may sense the position and/or identity of the weapons system through a number of methods and systems, such as a location-aware vision system, an RFID system, a GPS system, and the like. In embodiments, commands may be issued to the weapons control system by: providing the soldier a visual graphical user interface for fire control of the weapons system; providing the soldier an audio-voice command system interface that makes selections and performs speech recognition to issue commands; predetermined automatic activation of certain functions; and the like. In embodiments, there may be a profile associated with the learned commands, where the soldier can modify the learned profile and/or set preferences in the learned profile to help optimize the automated actions, and the like. For example, a soldier may have separate weapons control profiles for when the weapon is being readied for action (e.g., while on watch) and for when the weapon is actively engaged with the enemy. The soldier may need to modify a profile to adapt to changing conditions associated with the use of the weapons system, such as a change in fire command protocols, a change in ammunition type, increased capabilities of the weapons system, and the like. In embodiments, other events and/or data feeds, command and/or control modes and interfaces in which the input may be reflected, useful external devices to be controlled, and the like, as described herein, may also apply.
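The learn-and-predict behavior in this example (record how the soldier configures a system in a given context, then replay the most recent configuration when that context recurs) can be sketched minimally. The class name, the context keys, and the "replay the last observation" policy are all assumptions; a real predictive interface could equally use frequency or recency weighting:

```python
# Minimal sketch of a learned-profile predictor: observations of
# (context, configuration) pairs are recorded in a training mode, and
# the most recent configuration for a context is replayed on re-entry.

from collections import defaultdict

class LearnedProfile:
    def __init__(self):
        self.history = defaultdict(list)   # context -> list of configs

    def observe(self, context, config):
        """Training mode: record what the wearer did in this context."""
        self.history[context].append(config)

    def predict(self, context):
        """Return the last-observed configuration, or None if unseen."""
        configs = self.history.get(context)
        return configs[-1] if configs else None
```

Returning `None` for an unseen context corresponds to the profile/preference safeguard in the text: where nothing has been learned, no automated action is taken, and the soldier configures the system manually.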
In an example, the control aspects of the eyepiece may include a combination of the following: a soldier's personal-duty events (such as what is deployed in a theater of operations, and the management of their time) as the event and/or data feed; a voice recognition system as the user action capture input device; an auditory command interface as the input and control interface; video-based communications as the application used on the eyepiece to respond to the soldier's input; and the like. For instance, a soldier wearing the eyepiece may receive a projected visual indication of a scheduled event concerning a group video communication between supporting commanders. The soldier may then use voice commands, through the auditory command interface on the eyepiece, to call up the contact information for the call and to initiate the group video communication. In this way, the eyepiece may act as a personal assistant for the soldier, recalling scheduled events and providing the soldier a hands-free command interface for executing them. In addition, the eyepiece may provide the visual interface for the group video communication, where images of the other commanders are projected to the soldier through the eyepiece, and where an external camera provides the soldier's video image through a communications connection with the eyepiece (such as using an external device with a camera, using a mirror with the internally integrated camera, and the like, as described herein). In this way, the eyepiece may provide a fully integrated personal assistant and phone/video-based communications platform, encompassing the functions of electronic devices that are traditionally separate, such as a radio, a mobile phone, a video phone, a personal computer, a calendar, hands-free command and control interfaces, and the like. In embodiments, other events and/or data feeds, user action capture inputs and/or devices, command and/or control modes and interfaces in which the input may be reflected, on-platform applications that may use commands and/or respond to the input, and the like, as described herein, may also apply.
In an example, the control aspects of the eyepiece may include a combination of the following: a security event for the soldier as the event and/or data feed; a camera and a touch screen as the user action capture input devices; information processing, fingerprint capture, and facial recognition applications on the eyepiece to respond to the input; a graphical user interface for communications and/or connections between the eyepiece and external systems and devices; and external information processing, fingerprint capture, and facial recognition applications and databases for accessing external security facilities and connectivity. For instance, a soldier may receive a "security event" while on duty at a military checkpoint where a number of individuals are to be security-checked and/or identified. In this case, there may be a need to record the biometric information of these individuals, because they do not appear in a security database, because of suspicious behavior, because they fit the profile of a combatant, and the like. The soldier may then use biometric input devices, such as a camera for facial photographs and a touch screen for recording fingerprints, where the biometric inputs are managed by the internal information processing, fingerprint capture, and facial recognition applications on the eyepiece. In addition, the eyepiece may provide a graphical user interface as the communications connection to external information processing, fingerprint capture, and facial recognition applications, where the graphical user interface provides data capture interfaces, external database access, persons-of-interest databases, and the like. The eyepiece may provide end-to-end security management, including monitoring of suspects, input devices for acquiring biometric data, display of inputs and database information, connectivity to external security and database applications, and the like. For instance, a soldier may be screening people passing through a military checkpoint and may have been ordered to collect facial images, such as with iris biometric information, from anyone who fits a profile but is not currently in the security database. As individuals approach the soldier, such as in a line of people passing through the checkpoint, the soldier's eyepiece acquires a high-resolution image of each individual for facial and/or iris recognition, such as checked against an accessible database linked through a network connection. If someone does not fit the profile (e.g., a young child), or there is an indication in the database that they are not considered a threat, the individual may be allowed through the checkpoint. If someone has been indicated as a threat, or fits the profile but is not in the database, the individual may not be allowed through the checkpoint and may be pulled aside. If they need to be entered into the security database, the soldier may be able to process the individual directly through the eyepiece, or by using the eyepiece to control external devices, such as by collecting the individual's personal information, taking close-up images of the individual's face and/or iris, recording fingerprints, and the like, as described herein. In embodiments, other events and/or data feeds, user action capture inputs and/or devices, on-platform applications that may use commands and/or respond to the input, communications or connections from the platform interface to external systems and devices, applications for external devices, and the like, as described herein, may also apply.
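The checkpoint logic in this example reduces to a three-way decision per individual: pass, detain, or enroll (capture biometrics and pull aside for entry into the database). The sketch below is an assumed formalization; the record fields (`threat`, `fits_profile`) and decision labels are invented for illustration:

```python
# Hedged sketch of the checkpoint decision flow: check the individual
# against the security database first, then fall back to the profile rule
# for anyone not yet in the database.

def checkpoint_decision(person, database):
    record = database.get(person["id"])
    if record is not None:
        # In the database: the threat flag decides pass vs. detain.
        return "detain" if record.get("threat") else "pass"
    if person.get("fits_profile"):
        return "enroll"        # capture face/iris/fingerprints, pull aside
    return "pass"              # e.g., a young child not fitting the profile
```

Ordering the database check before the profile rule matches the text: a known non-threat passes even if they superficially fit the profile, and a known threat is detained regardless.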
In an example, the control aspects of the eyepiece may include a combination of the following: a finger movement as the user action by which the soldier initiates an eyepiece command; a clickable icon as the command and control mode and/or interface in which the user action may be reflected; applications on the eyepiece (e.g., weapons control, troop movements, information data feeds); a military application tracking API as the communications and/or connection from the eyepiece to external systems; an external personnel tracking application; feedback to military personnel; and the like. For instance, a system for monitoring soldiers' selections of applications on the eyepiece may be implemented through an API, so that this monitoring provides the military a service for tracking application usage, for giving soldiers feedback about other applications available to them based on monitored behavior, and the like. In the course of a day, a soldier may select a certain application to use and/or download, such as through a graphical user interface presenting clickable icons, and the soldier may be able to select an icon through a finger-movement-based control facility (such as a camera or inertial system through which the soldier's finger movements are used as control input, in this case to select the clickable icon). The selection may then be monitored through the military application tracking API, which transmits the selection, or a stored set of selections (such as selections stored over a period of time), to the external personnel tracking application. The soldier's application selections, in this case "virtual clicks," may then be analyzed to optimize utilization, such as by increasing bandwidth, changing the available applications, improving existing applications, and the like. Further, the external personnel tracking application may use this analysis to determine the wearer's preferences in application use, and send feedback to the wearer in the form of recommendations of potentially interesting applications, preference profiles, lists of applications being used by other similar military users, and the like. In embodiments, in helping guide the military's use of the eyepiece and its applications, the eyepiece may provide services to improve the soldier's experience with the eyepiece, such as by providing the soldier usage recommendations they can benefit from, and the like. For example, a soldier who is new to the eyepiece may not be utilizing its capabilities fully, such as in the use of augmented reality interfaces, organizational tools, task support, and the like. The eyepiece may have the capability to monitor the soldier's utilization, compare that utilization against utilization metrics (such as stored in an external eyepiece utilization facility), and provide feedback to the soldier to improve the use of the eyepiece and the efficiencies associated with it, and the like. In embodiments, other user movements or actions for controlling or initiating commands, command and/or control modes and interfaces in which the input may be reflected, on-platform applications that may use commands and/or respond to the input, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or external applications, and the like, as described herein, may also apply.
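The tracking-and-recommendation idea above can be sketched as a tiny API that records "virtual clicks" and recommends the applications most popular among other users that this user has not yet tried. The API shape (`record`/`recommend`) and the popularity heuristic are assumptions for the sketch, not the patent's design:

```python
# Illustrative application-tracking API: selections are recorded as
# (user, app) pairs; recommendations are the most-selected apps among
# OTHER users that the given user has not selected themselves.

from collections import Counter

class TrackingAPI:
    def __init__(self):
        self.clicks = []                      # list of (user, app) pairs

    def record(self, user, app):
        """Log one 'virtual click' reported through the API."""
        self.clicks.append((user, app))

    def recommend(self, user, top_n=2):
        mine = {a for u, a in self.clicks if u == user}
        popularity = Counter(a for u, a in self.clicks if u != user)
        return [a for a, _ in popularity.most_common() if a not in mine][:top_n]
```

The same click log could feed the other uses named in the text, such as deciding which applications deserve more bandwidth or further development.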
In an example, the control aspects of the eyepiece may comprise a combination of the following: sensors such as IR, heat, force, and carbon monoxide sensors as inputs; a microphone as an additional input device; voice commands as soldier-initiated command actions; a HUD as the input-and-control interface in which commands may be reflected; a tutorial-guidance application that provides instruction while reducing the soldier's need to use their hands, such as for field repair, maintenance, assembly, and the like; a visual display that provides feedback to the soldier based on the soldier's actions and the sensor inputs; and so forth. For example, a soldier's vehicle may be damaged in combat, leaving the soldier stranded without immediate mobility. The soldier may be able to call up the tutorial-guidance application, which, as it runs, provides hands-free access to instructions and computer-based expertise for diagnosing the vehicle's problem. In addition, the application may provide the soldier with tutorials for unfamiliar procedures, such as restoring basic, temporary vehicle function. The eyepiece may also monitor the various sensor inputs relevant to the diagnosis, such as IR, heat, force, ozone, and carbon monoxide sensors, so that the sensor inputs may be accessed through the tutorial application and/or directly by the soldier. The application may also provide a microphone that accepts voice commands; a HUD for displaying instructional information, such as 2D or 3D depictions of the vehicle; and the like. In embodiments, the eyepiece may be able to provide the soldier with a hands-free virtual assistant that helps them diagnose and repair the vehicle in order to re-establish transportation, thereby allowing the soldier to re-engage the enemy or move to a safe location. In embodiments, other sensing inputs and/or sensing devices as described herein may also be applied, as may user-action capture inputs and/or devices, user movements or actions for controlling or initiating commands, command-and-control modes and interfaces in which inputs may be reflected, platforms and/or applications that can respond to command inputs, feedback related to external devices and/or external applications, and so forth.
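As a rough illustration of how diagnostic sensor inputs might be routed to a hands-free maintenance-guidance application, the sketch below checks simulated readings against alert thresholds and builds HUD messages. The threshold values, reading names, and message text are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: routing diagnostic sensor readings (IR, heat, force,
# carbon monoxide) to a hands-free maintenance-guidance application.
# All thresholds and field names below are illustrative assumptions.

CO_PPM_ALERT = 50        # assumed carbon-monoxide alert threshold (ppm)
HEAT_C_ALERT = 95        # assumed engine-heat alert threshold (deg C)

def evaluate_sensors(readings):
    """Return HUD alert lines for any reading that crosses its threshold."""
    alerts = []
    if readings.get("carbon_monoxide_ppm", 0) >= CO_PPM_ALERT:
        alerts.append("CO high: ventilate before continuing repair")
    if readings.get("heat_c", 0) >= HEAT_C_ALERT:
        alerts.append("Engine hot: wait before opening housing")
    return alerts

# A stranded soldier's eyepiece might poll the vehicle sensors like this:
hud_lines = evaluate_sensors({"carbon_monoxide_ppm": 62, "heat_c": 40})
```

In such a design the tutorial application would consume the same readings, so the soldier can see raw values or alerts without taking their hands off the repair.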
In an example, the control aspects of the eyepiece may comprise the following combination: the eyepiece enters an "active state", such as a "military engagement" activity mode, for example through received mission instructions, through the soldier commanding the eyepiece to enter a military fire mode, or through the eyepiece sensing that it is near some military activity, perhaps even a scheduled or targeted engagement area, possibly as further developed in part from monitoring and learning the wearer's typical engagement assignments. Continuing this example, entering an active state (such as a military engagement active state, e.g., while driving a vehicle and encountering the enemy or entering hostile territory) may be combined with the following: an object-detection facility as the sensing input or sensing device; a head-mounted camera and/or eye-gaze detection system as the user-action capture input; eye movement as the user movement or action for controlling or initiating commands; a 3D navigation interface as the command-and-control mode and/or interface in which inputs may be reflected; an engagement management application onboard the eyepiece as the application for coordinating command inputs and the user interface; a navigation system controller for communicating or connecting with external systems or devices; a vehicle navigation system as the external device to be controlled and/or interfaced with; a military planning and execution facility as the external application for processing user actions concerning military instructions; a bull's-eye or target tracking system as feedback to the wearer, such as about opportunities to target enemies in view while driving; and the like. For example, a soldier may enter a hostile environment while driving their vehicle, and the eyepiece, detecting the presence of an enemy engagement area (e.g., by GPS, by direct observation of targets through the integrated camera, etc.), may enter the "military engagement active state" (as enabled and/or approved by the soldier). The eyepiece may then use the object-detection facility, positioned to target enemy opportunities, to detect enemy vehicles, hostile dwellings, and the like, such as through the head-mounted camera. Further, the eye-gaze detection system on the eyepiece may monitor where the soldier is looking, and may highlight information about targets at the wearer's gaze position, such as enemy personnel, enemy vehicles, enemy weapons, and friendly forces, with friend and foe identified and distinguished. The soldier's eye movements may also be tracked, such as to change the target of attention, or for command input (e.g., a quick nod to indicate a select command, a downward eye movement to command additional information, etc.). The eyepiece may invoke the projection of the 3D navigation interface to help provide the soldier with information about their surroundings, with the military engagement application coordinating the state of the engagement activity, such as taking input from the soldier, providing output to the 3D navigation interface, and handling external devices and interface applications. The eyepiece may, for example, use the navigation system controller to interface with the vehicle navigation system so that the vehicle navigation system can be included in the military engagement experience. Alternatively, the eyepiece may use its own navigation system, whether replacing the vehicle's system or augmenting it, such as when the soldier leaves the vehicle and wishes to be guided on foot. As part of the military engagement active state, the eyepiece may interface with external military planning and execution facilities, such as to be provided with current status, troop movements, weather conditions, friendly positions and strengths, and the like. In embodiments, by entering an active state, the soldier may be provided with feedback associated with that active state, such as, for the military engagement active state, feedback in the form of information associated with identified targets. In embodiments, other events and/or data feeds as described herein may also be applied, as may sensing inputs and/or sensing devices, user-action capture inputs and/or devices, user movements or actions for controlling and/or initiating commands, command-and-control modes and interfaces in which inputs may be reflected, platforms and applications that can respond to command inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or external applications, and so forth.
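One way to picture the GPS-triggered entry into an engagement active state is as a geofence check: the current fix is tested against a list of designated regions, subject to the soldier's approval. The haversine distance check, the region list format, and the mode names are illustrative assumptions, not details from the disclosure.

```python
import math

# Hypothetical sketch: entering a "military engagement" active state when a
# GPS fix falls inside a designated engagement region. Region format:
# (mode name, center latitude, center longitude, radius in meters).

EARTH_RADIUS_M = 6_371_000

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two fixes via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def active_state(fix, regions, approved=True):
    """Return the activity mode triggered by the current GPS fix, if any."""
    if not approved:           # the soldier must enable/approve the mode
        return None
    for name, lat, lon, radius_m in regions:
        if distance_m(fix[0], fix[1], lat, lon) <= radius_m:
            return name
    return None

regions = [("military engagement", 34.05, -118.25, 5_000)]
mode = active_state((34.06, -118.24), regions)
```

The `approved` flag mirrors the disclosure's note that the state is entered "as enabled and/or approved by the soldier"; a real system would also fuse camera observations, not GPS alone.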
In an example, the control aspects of the eyepiece may comprise the following combination: a secure communication received as a trigger event to the soldier; inertial movement tracking as the user-action capture input device; drag-and-drop and swipe movements made with the fingers as the user movements or actions for controlling or initiating commands; a navigable list as the command-and-control interface in which inputs may be reflected; information conveyance as the type of application on the eyepiece that can respond to command inputs; an intermediary system as the interface for communication or connection from the eyepiece to external systems and devices; an iris capture and recognition system as the external application for external systems and devices; and the like. A soldier wearing the eyepiece may receive a secure communication, which may enter the eyepiece as an "event" to the soldier, such as for triggering a certain operating mode of the eyepiece, a visual and/or audio alert, the launch of an application or action on the eyepiece, and the like. The soldier may be able to react to the event through multiple control mechanisms, such as the wearer performing a "drag and drop" or swipe with their fingers and hands through the gesture interface (e.g., through the eyepiece's onboard camera and gesture application, with the wearer dragging an email or information in the communication into a folder, an application, another communication, etc.). The wearer may call up a navigable list as part of acting on the communication. The user may convey information from the secure communication to external systems and devices through some eyepiece application, such as an intermediary system for tracking communications and their associated actions. In embodiments, the eyepiece and/or a secure access system may require identity verification, such as through biometric authentication, e.g., fingerprint capture, iris capture and recognition, and the like. For example, a soldier may receive a secure communication that is a security alert, where the secure communication is accompanied by a secure link to further information, and where the soldier is required to provide biometric authentication before being granted access. Once authenticated, the soldier may be able to use gestures to manipulate content available through the eyepiece when responding, such as manipulating lists, links, data, images, and the like obtainable directly from the communication and/or through included links. Providing the soldier with the ability to respond to and manipulate content associated with secure communications may better allow the soldier to interact with messages and content in a way that does not compromise whatever insecure environment they may currently be in. In embodiments, other events and/or data feeds as described herein may also be applied, as may user-action capture inputs and/or devices, user movements or actions for controlling or initiating commands, command-and-control modes and interfaces in which inputs may be reflected, platforms and applications that can respond to command inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, and so forth.
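The biometric gate described for secure communications can be sketched as a simple check: the linked content is released only when the presented biometric template matches the enrolled one. The enrollment store, template strings, and result format are stand-ins; real fingerprint or iris matching would be done by dedicated capture hardware and matching algorithms.

```python
# Hypothetical sketch: gating a secure communication's linked content behind
# biometric authentication. Enrollment records and templates are stand-ins
# for real fingerprint/iris capture and matching.

ENROLLED = {"soldier-17": "iris-template-A"}   # assumed enrollment record

def open_secure_link(user, presented_template, payload):
    """Release the linked content only after a biometric match."""
    if ENROLLED.get(user) != presented_template:
        return {"granted": False, "content": None}
    return {"granted": True, "content": payload}

# The soldier authenticates before the "further information" link opens:
result = open_secure_link("soldier-17", "iris-template-A", "further information")
```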
In an example, the control aspects of the eyepiece may comprise a combination in which military instructions provided to the soldier through the eyepiece are also provided to an external display device, with an inertial user interface as the user-action capture input device. For example, a soldier wearing the eyepiece may wish to provide briefing instructions available from their eyepiece to a group of other soldiers in the field. The soldier may be aided by using a physical 3D or 2D mouse (e.g., with inertial motion sensors, MEMS inertial sensors, ultrasonic 3D motion sensors, IR, ultrasonic, or capacitive touch sensors, accelerometers, etc.), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like, to provide an interface for manipulating the content of the briefing. The briefing may be viewed and manipulated through the eyepiece, but also exported in real time, such as to an external router connected to an external display device (e.g., a computer monitor, projector, video screen, TV screen, etc.). The eyepiece can thus provide the soldier with a way to let others watch what they see through the eyepiece and control through the eyepiece's control facilities, allowing the soldier to export multimedia content associated with an eyepiece-enabled briefing to other, non-eyepiece wearers. In an example, a mission briefing may be provided to a battlefield commander, who may be able to deliver to their troops the multimedia and augmented-reality resources available through the eyepiece, as described herein, thereby gaining the benefits those visual resources provide. In embodiments, other sensing inputs and/or sensing devices as described herein may also be applied, as may user-action capture inputs and/or devices, command-and-control modes and interfaces in which inputs may be reflected, communications or connections from the interface to external systems and devices, useful external devices to be controlled, feedback related to external devices and/or external applications, and so forth.
In an example, the control aspects of the eyepiece may comprise the following combination: a nod as the user movement that initiates a command; a graphical user interface as the command-and-control mode and/or interface in which control inputs may be reflected; an entertainment application on the eyepiece that can respond to command and control inputs; an audio system controller for communicating and/or connecting with external systems or devices through the eyepiece interface; and the like. For example, the wearer of the eyepiece may be controlling an audio player through the eyepiece and may wish to change to the next track. In this example, the wearer's nod may be programmed to indicate a track change. In addition, the eyepiece may project a graphical user interface for the audio player to the wearer, such as showing which track is playing. The wearer's nod may thus be interpreted by the eyepiece's processing facility as a change-track command, the command may then be sent to the audio system controller to change the track, and the change of track may then be shown to the wearer on the audio player's graphical user interface.
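The nod-to-next-track chain above (motion sensor, processing facility, external audio controller, GUI feedback) can be sketched end to end. The pitch threshold, command name, and controller class are illustrative assumptions standing in for the eyepiece's programmed gesture mapping.

```python
# Hypothetical sketch: a head nod sensed by a motion sensor is interpreted
# as a "next track" command and forwarded to an external audio controller.

NOD_PITCH_DEG = 15.0   # assumed downward pitch that counts as a nod

def interpret_head_motion(pitch_deg):
    """Map a head-pitch excursion to an audio command, or None."""
    return "NEXT_TRACK" if pitch_deg >= NOD_PITCH_DEG else None

class AudioSystemController:
    """Stand-in for the external audio player the eyepiece commands."""
    def __init__(self, tracks):
        self.tracks, self.index = tracks, 0

    def handle(self, command):
        if command == "NEXT_TRACK":
            self.index = (self.index + 1) % len(self.tracks)
        return self.tracks[self.index]   # shown on the projected GUI

player = AudioSystemController(["track-1", "track-2", "track-3"])
now_playing = player.handle(interpret_head_motion(18.0))
```

Returning the current track from `handle` mirrors the closed loop in the example: the same GUI that shows what is playing confirms the change caused by the nod.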
In one embodiment, the control aspects of the eyepiece may comprise the following combination: a motion sensor as a sensing input that can inform the wearer of commands; an augmented reality interface as the input-and-control interface; a rangefinder as an external device to be controlled and from which to gather information; and the like. For example, the wearer of the eyepiece may be monitoring movement in a certain environment with a motion sensor, and when the motion sensor is triggered, an augmented reality interface may be projected to the wearer to help identify the object. In addition, other sensors may contribute to the identification, such as a rangefinder used to determine the distance to the object. The augmented reality interface may provide the wearer with information about the object, such as the object's position on a projected 2D or 3D map, the object's identity drawn from previously collected information (e.g., stored in an object database, including facial recognition and object recognition), the object's coordinates, night-vision imaging of the object, and the like. The triggering of the motion detector may be interpreted by the eyepiece's processing facility as an alert event; a command may then be transmitted to the rangefinder to determine the position of the object, and to the speaker of the eyepiece earphone to provide the wearer with an audio alert that a moving object has been sensed. This audio alert, added to a visual indicator to the wearer, may serve as input to the wearer that attention should be paid to the moving object, such as when the object is identified as an object of interest to the wearer.
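The trigger-then-query pipeline in this embodiment (motion trigger, rangefinder query, database lookup, combined audio/visual alert) can be sketched as a single handler. The sensor readouts are simulated and the database contents, message text, and field names are illustrative assumptions.

```python
# Hypothetical sketch: a motion-sensor trigger treated as an alert event
# that queries a rangefinder and an object database, then builds the
# audio and visual alerts the eyepiece would deliver.

def on_motion_trigger(rangefinder_m, object_db, object_id):
    """Build the alert for a sensed moving object at a measured distance."""
    identity = object_db.get(object_id, "unknown object")
    return {
        "audio": f"Movement detected: {identity} at {rangefinder_m:.0f} m",
        "visual": {"identity": identity, "distance_m": rangefinder_m},
    }

object_db = {"obj-9": "vehicle"}          # previously collected information
alert = on_motion_trigger(42.0, object_db, "obj-9")
```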
In an example, the control aspects of the eyepiece may comprise the following combination: a wearable sensor set as the user-action capture input; a robot control interface as the command-and-control interface in which inputs may be reflected; a drone or other robotic device as the external device to be controlled; and the like. For example, the wearer of the eyepiece may be equipped with a sensor set for controlling a drone, such as motion sensor input controlling the drone's motion, hand recognition controlling the drone's control features (e.g., through a graphical user interface displayed through the eyepiece), voice command input for controlling the drone, and the like. The eyepiece may have a robot control interface for managing and mediating the various control inputs from the sensor set, and for providing the interface used to control the drone. The drone may then be controlled remotely by the wearer's actions, such as through a drone command-and-control center, a more direct wireless connection to the drone, and the like. Similarly, another kind of robot (e.g., a bomb-disposal robot) may be controlled through the sensor set and the eyepiece's robot control interface. For example, the wearer may be provided with a graphical user interface showing a 2D or 3D view of the robot's surroundings, with the sensor set translating the wearer's movements (such as of the arm, hand, etc.) into motions of the robot. In this way, the wearer may be able to provide a remote control interface to the robot.
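A key job of the robot control interface above is mediating among simultaneous sensor-set inputs (motion, gesture, voice) before anything reaches the drone. The sketch below resolves contention with a fixed priority order; the priorities, input names, and drone commands are all illustrative assumptions.

```python
# Hypothetical sketch: the eyepiece's robot control interface mediating
# motion, gesture, and voice inputs into a single drone command stream.

PRIORITY = {"voice": 0, "gesture": 1, "motion": 2}   # lower number wins

def mediate(inputs):
    """Pick the highest-priority pending input and translate it to a command."""
    if not inputs:
        return None
    source, action = min(inputs, key=lambda i: PRIORITY[i[0]])
    translation = {"motion": {"tilt_left": "BANK_LEFT"},
                   "gesture": {"fist": "HOVER"},
                   "voice": {"return": "RETURN_TO_BASE"}}
    return translation[source].get(action)

# Simultaneous motion and voice inputs: the voice command takes precedence.
command = mediate([("motion", "tilt_left"), ("voice", "return")])
```

Giving explicit voice commands precedence over ambient body motion is one plausible policy for keeping unintended movements from steering the drone; the disclosure leaves the mediation scheme open.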
In an example, the control aspects of the eyepiece may comprise the following combination: arrival at a certain location occurring as an event to the eyepiece; a prediction-and-learning-based user interface as the command-and-control mode and/or interface in which inputs may be reflected; an entertainment system as the external device to be controlled; and the like. For example, the eyepiece may be programmed to learn the wearer's behavior, such as what the wearer usually does upon entering a room with an entertainment system, e.g., whether the wearer turns on the television, the audio system, a gaming system, and so on. From this learned behavior, the eyepiece may be able to predict what the wearer wants in terms of eyepiece control functions. For example, upon the wearer entering the living room, the eyepiece may sense the location, know that the wearer usually turns on music through the entertainment system when entering the room, and command the entertainment system to play the music last played. In embodiments, the eyepiece may sense location by various methods and systems, such as a vision system that recognizes the location, an RFID system, a GPS system, and the like. Commands to the entertainment system may be issued by: providing the wearer with a selection graphical user interface, providing the wearer with selections through an audio-voice command system interface with speech recognition, automatic activation of the command, and the like. There may be a profile associated with the learned commands, where the wearer can modify the learned profile and/or set preferences within it to help optimize the automatic actions, and so forth.
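One minimal way to realize the prediction-and-learning interface described here is a per-location frequency model: record what the wearer does on entering each room and propose the most frequent action next time. The history structure, location names, and action strings are illustrative assumptions.

```python
from collections import Counter

# Hypothetical sketch: a prediction-learning profile that records what the
# wearer does on entering each room and predicts the most frequent action.

class LearnedProfile:
    def __init__(self):
        self.history = {}                      # location -> Counter of actions

    def observe(self, location, action):
        self.history.setdefault(location, Counter())[action] += 1

    def predict(self, location):
        counts = self.history.get(location)
        return counts.most_common(1)[0][0] if counts else None

profile = LearnedProfile()
for _ in range(3):
    profile.observe("living room", "play last music")
profile.observe("living room", "turn on TV")
suggestion = profile.predict("living room")    # command issued on room entry
```

The editable profile the disclosure mentions would sit on top of this: the wearer could veto or re-weight learned actions before they fire automatically.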
In an example, the control aspects of the eyepiece may comprise the following combination: a calendar event as the event and/or data feed; a voice recognition system as the user-action capture input device; an audible command interface as the command-and-control interface in which inputs may be reflected; video conferencing as the application on the eyepiece used to respond to the wearer's input; and the like. For example, the wearer may receive a projected visual reminder of a calendar event concerning a certain conference call. The user may then use voice commands, through the audible command interface on the eyepiece, to call up the dial-in information for the call and to initiate the video conference. In this way, the eyepiece may serve as a personal assistant, recalling calendar events and providing the wearer with a hands-free command interface for executing them. In addition, the eyepiece may provide the visual interface for the video conference, where images of the other participants are projected to the wearer by the eyepiece, and an external camera provides the wearer's video image over a communication connection to the eyepiece. The eyepiece may provide a fully integrated personal assistant and phone/videoconferencing platform, subsuming the functions of electronic devices that have traditionally been separate, such as a mobile phone, a PDA, a scheduler, hands-free command and control interfaces, and the like.
In an example, the control aspects of the eyepiece may comprise the following combination: a security event as the event and/or data feed; a camera and a touch screen as the user-action capture input devices; information processing, fingerprint capture, and facial recognition applications on the eyepiece to respond to inputs; a graphical user interface for communication and/or connection between the eyepiece and external systems and devices; and external information processing, fingerprint capture, and facial recognition applications and databases for accessing external security equipment and connectivity. For example, a security officer may handle a "security event", which may be a checkpoint at which many people are to be screened and/or identified, a need to check and/or identify a certain individual, and the like, including identifying a need to record an individual's biometric information (e.g., because they do not appear in a security database, because of suspicious behavior, etc.). The security officer may then use biometric input devices, such as a camera for taking facial photographs and a touch screen for recording fingerprints, with the biometric inputs managed by the onboard information processing, fingerprint capture, and facial recognition applications on the eyepiece. In addition, the eyepiece may provide a graphical user interface as the communication connection to the external information processing, fingerprint capture, and facial recognition applications, where the graphical user interface provides a data capture interface, external database access, a persons-of-interest database, and the like. The eyepiece may provide end-to-end security management, including monitoring persons of interest, input devices for acquiring biometric data, display of inputs and database information, connectivity to external security and database applications, and the like.
In an example, the control aspects of the eyepiece may comprise the following combination: a finger movement as the user action that initiates eyepiece commands; clickable icons as the command-and-control modes and/or interfaces in which inputs may be reflected; applications on the eyepiece (such as phone applications, music search, advertisement selection, etc.) as the targets of user actions; an advertisement tracking API as the communication and/or connection from the eyepiece to external systems; an external advertising application; feedback to the user; and the like. For example, a system for monitoring users' selection of applications on the eyepiece may be implemented through an API, such that this monitoring provides services to an advertisement placement facility, provides the wearer with feedback about other applications that may interest them based on the monitored behavior, and the like. Over the course of a day, the wearer may select a certain application for use and/or download, such as through a graphical user interface presenting clickable icons, and the wearer may be able to select an icon through a finger-based movement control facility (such as a camera or inertial system that takes the wearer's finger movement as a control input, in this case the selection of a clickable icon). This selection may then be monitored through the advertisement tracking API, which sends the selection, or stored selections (such as selections accumulated over a period of time), to the external advertising application. The wearer's application selections (in this case "virtual clicks") may then be analyzed to generate advertising revenue, such as by placing advertisements back to the wearer, selling the data to third-party advertising facilities, and the like. Further, the external advertising application may use this analysis to determine the wearer's preferences in application usage, and may send feedback to the wearer in the form of recommendations of applications that may interest them, preference profiles, lists of what other similar users have found interesting to download, and so on. In embodiments, while helping generate advertising revenue for third parties through external advertising applications, the eyepiece may provide services that improve the wearer's experience of the eyepiece, such as recommendations of downloads that may interest the wearer, advertisements better targeted to the wearer's possible interests, and the like.
In an example, the control aspects of the eyepiece may comprise the following combination: body movements (e.g., motion sensors sensing head movement, a camera sensing hand movement, motion sensors sensing body movement) and touch sensors or audio capture sensing devices (e.g., sensing gaming devices such as a steering wheel or a sword, sensing another player in the game, etc.) as user actions; head and hand movements as the user actions for controlling and/or initiating commands (e.g., through gesture control); a virtual reality interface as the command-and-control interface in which inputs may be reflected; an information display on the eyepiece as the application that can respond to input; a computer gaming device as the external device to be controlled through gaming applications; and playback of game content and performance, ratings, scores, and the like to the wearer as the feedback related to external devices and applications. For example, the wearer may be able to play an interactive computer game with the eyepiece (e.g., on a computer, on a computer-based gaming platform, on a mobile gaming platform), where the wearer's body movements are interpreted as control inputs, such as through body motion sensors, touch sensors, infrared sensors, an IR camera, a visible-light camera, and the like. In this way, the movement of the wearer's body can be fed into the computer game, instead of using more traditional control inputs such as a handheld game controller. For example, the eyepiece may sense the motion of the user's hand through the IR and/or visible-light cameras on the eyepiece and process it with onboard or external gesture recognition algorithms, and may sense the motion of the user's head through the motion sensors on the eyepiece (e.g., sensing the user jumping, moving back and forth, or moving side to side in response to gameplay). In one embodiment, gestures may be captured by a downward-facing camera, or by a camera that images downward using fold optics. In other embodiments, the camera may capture non-line-of-sight gestures for recognition. For example, a foveated or segmented camera may perform motion tracking and room mapping while looking forward, but have a downward-looking quadrant or hemisphere that captures gestures and user-interface control commands, with the user's hands at their sides or on their thighs, not parallel to the central axis of the display. Of course, gestures may also be tracked as described herein, such as through IMU sensors, magnetic markers, RF tags, and the like in a device (e.g., a ring controller, a watch, a pistol grip, etc.). These body-motion control inputs may then be fed into the virtual reality interface and information display application on the eyepiece to provide the user with a visual depiction of the game environment; fed into the computer gaming platform to control the gaming platform according to the user's motion; or provided to both the eyepiece's virtual reality interface and the gaming platform to create an augmented reality gaming platform; and so forth. In embodiments, control devices or interactive control elements used for sensing body movement or otherwise indicating user interaction may be removed from the computer image by a processor associated with the eyepiece. Where the sensor is not wanted as part of the game, all or part of the image of the control device may be removed from the generated gameplay imagery. For example, where a sensor is used only to detect hand/limb movement, the sensor may be removed from image generation, but where the sensor or control device is an object relevant to gameplay (such as a sword), the object itself may be depicted in the gameplay or AR environment. In embodiments, it may be desirable for the control device to be seen at a position other than where it actually is. For example, a target at which the user throws darts may be displayed at the end of a hallway in front of the user, rather than being shown in association with the dart the user throws. As a further example, if the user never actually releases the dart during gameplay, the dart acting as a control device may be shown traveling to the target based on characteristics of the user's throw. In embodiments, the computer game may operate entirely onboard the eyepiece as a local gaming application; interface with an external gaming device local to the wearer; interface with networked gaming devices (e.g., massively multiplayer online games, MMOGs); run as a combination on the eyepiece and on a gaming platform; and so forth. Where the eyepiece interfaces with and controls a local external gaming device (e.g., a gaming platform in the wearer's home), the eyepiece application portion of game execution may provide the visual environment and information display to the wearer, while the external gaming device provides the game application execution. Alternatively, the eyepiece may provide the user motion-sensing interface and supply this information to the gaming platform, with the gaming platform then providing the game's visual interface to the user. Alternatively, the eyepiece may provide the user motion-sensing interface, with this information used by both the eyepiece and the gaming platform to create an augmented reality interface that combines the visual interface and the gaming platform in the game presentation to the user. In embodiments, an AR application may have objects pass through the foreground along the sides of buildings or other structures, augmented advertisements, and the like. As the user drives past, the camera can note that an object (say, a curbside lamppost) with an augmented surface moves through the field of view at a faster rate than the background. The display system can subtract the portion of the image being augmented from the virtual layer, retaining the content behind the image. This may require strict calibration of the parallax among the user's glasses, display, and camera. In embodiments, this technique may be used to produce depth maps. Those skilled in the art will appreciate that many different divisions of labor between the processing provided by the eyepiece and the processing provided by external devices may be configured. In addition, game implementation may extend to external gaming devices across the Internet, such as by using an MMOG. The external device (whether local or across the Internet) may then provide feedback to the wearer, such as providing at least a portion of the played content (e.g., a locally provided game projection combined with content from the external device and other players), displaying instructions, scores, ratings, and the like. In embodiments, the eyepiece may provide the user environment for computer gaming, with the eyepiece, external control inputs, and external processing devices connecting to create a next-generation gaming platform.
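Two decisions in the gaming discussion above lend themselves to a small sketch: mapping a sensed body movement to a gaming-platform input, and deciding whether the sensed control device is drawn in the rendered scene (a sword is gameplay-relevant; a bare limb sensor is not). The event format, input names, and relevance table are illustrative assumptions.

```python
# Hypothetical sketch: translating sensed body movements into game-platform
# inputs, with a per-device flag for whether the control device itself
# should be depicted in the gameplay/AR environment.

RENDER_DEVICE = {"sword": True, "limb_sensor": False}   # assumed relevance table

def to_game_input(event):
    """Map one sensed movement event to a controller input plus render flag."""
    mapping = {"jump": "BUTTON_A", "lean_left": "STICK_LEFT", "swing": "ATTACK"}
    return {"input": mapping.get(event["motion"]),
            "draw_device": RENDER_DEVICE.get(event["device"], False)}

# A sword swing is both a game input and an object worth rendering:
frame = to_game_input({"motion": "swing", "device": "sword"})
```

Defaulting unknown devices to `draw_device=False` follows the paragraph's rule that sensors used only to detect limb movement are removed from image generation.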
As an alternative to detecting the wearer's body movements through sensors in direct physical contact with the wearer (e.g., motion sensors, body movement sensors, touch sensors), the eyepiece may incorporate an active remote sensing system for sensing and interpreting the wearer's body movements, such as an active 3D depth sensor that uses projected IR, sonar, RF, energy, and the like to actively sense the position of the wearer's hands, feet, etc. The active 3D depth sensor may also be used in combination with a visible-light or IR camera on the eyepiece. The combination of a camera and a 3D depth sensor can provide 3D motion capture, which, processed on the eyepiece, provides advanced gesture recognition. The active 3D depth sensor may comprise a source (e.g., an IR laser projector) and a receiving sensor (e.g., an IR sensor). In embodiments, the camera and the active 3D depth sensor may point downward relative to the eyepiece line of sight, to the side, outward, and so on, to improve the system's visibility of the user's hands, feet, etc. In embodiments, there may be multiple cameras on the eyepiece, such as one or more cameras for imaging as described herein (e.g., one facing forward, one detecting eye motion, one facing rearward), and one or more cameras for sensing the wearer's motion commands to control eyepiece functions, applications, external devices, and the like. In an example, the combination of the depth sensor and camera may be pointed to capture images and motion of the wearer's hand, where the eyepiece processor uses the inputs from the depth sensor and camera to track hand movement (such as translation and rotation of the hand, and motion of individual fingers), computes the hand's motion through motion algorithms, and controls eyepiece functions based on the detected motion according to an onboard database of command functions for detected motions. In embodiments, the interpreted hand motions may be used to control eyepiece functions or eyepiece applications, to control external devices through the eyepiece, as input to an external gaming platform, as input to an internal virtual-reality gaming medium, and so on. In embodiments, the camera, the active 3D depth sensor, and the associated algorithms may be combined with an onboard microphone or microphone array to detect sounds and motion in the surrounding environment.
The disclosure may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece comprises an optical assembly that simultaneously provides the user with a view of the surrounding environment along a forward-looking line of sight and introduces displayed content from an integrated image source into the optical assembly, the eyepiece providing an integrated camera with a view of the surrounding environment along a downward-looking line of sight, for performing user gesture recognition through an integrated gesture recognition facility. In embodiments, the gesture recognition facility may interpret motions as commands to the eyepiece. The motion may be a hand motion, arm motion, finger motion, foot motion, leg motion, and the like. The integrated camera may also be able to view the surrounding environment along the forward line of sight, with the downward line of sight used for gesture recognition. The integrated camera may have a segmented optical element for simultaneously imaging the view along the forward line of sight and the view along the downward line of sight. In addition, the eyepiece may have an active sensing system for sensing and interpreting the user's body movements, wherein the active sensing system provides an active signal through the optical assembly along the downward line-of-sight view. The active signal may be IR, sonar, RF, or similar active signals. The active sensing system may comprise an active 3D depth sensor sensing the position of at least one of the user's hand, foot, body, etc. The active sensing system may be used together with the integrated camera to further provide user gesture recognition.
In embodiments, the eyepiece may comprise a dual mode for marker location and tracking. A GPS reading may create a general marker position near a point of interest (POI), and another, second marker may then be created. The second marker may be generated by a sensor reading, image processing, image recognition, user feedback, and the like. This second marker may be used for tracking between GPS readings. In embodiments, the second marker may be used to provide the distance to or from the point of interest. The dual-marker system may provide the user with the distance, time, and direction between two points. A point may be a tourism, transportation, or commercial point of interest, and the like. This dual mode for marker location and tracking may allow the user to locate items to purchase, sites to visit, travel destinations, modes of transportation, and the like. Transportation items may include the user's automobile, a train, an aircraft, a taxi stand, a taxi, a subway, and the like. Commercial items may include a variety of items such as, but not limited to, food, entertainment, shopping, clothing, books, services, and the like. In embodiments, the items to be located may be tourist attractions, restaurants, parks, streets, and the like. There may be an ecosystem of markers, from QR codes to a broad range of communication devices (routers and switches) or passive sensors (RFID tags that can be read), all of which may be intended to forward some relevant information to the glasses, whether by letting a back-end network determine the exact position to which content should be sent, or by the marker itself being associated with particular content. A party may use two markers to help orient or triangulate the glasses, providing definite orientation and range information more readily than some single markers (especially non-visual ones) could by themselves. In embodiments, two markers may be processed together. The eyepiece may be able to recognize two markers in proximity or in the field of view and work on them either simultaneously (e.g., for triangulation) or with priority given to one of them (a paid marker may take precedence over a non-paid marker, as in an advertising scenario; a security-oriented marker may take precedence over an advertising marker; and so on). Markers may come from the glasses, but may also be placed by others, such as other glasses or other systems (an advertiser's system, a government party's system, etc.).
In embodiments, a system may comprise an interactive head-worn eyepiece worn by a user, where the eyepiece comprises an optical assembly through which the user observes the surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, and an integrated processor that generates a marker of a point of interest based on a GPS reading and stores the marker in a memory of the eyepiece, wherein the integrated processor creates a second marker related to the GPS point and stores the second marker in the memory. In embodiments, the second marker may be generated by at least one of a sensor reading of the current location, image processing, image recognition, user feedback, and the like. The second marker may be used to calculate at least one of the distance, direction, and time to the point of interest. In embodiments, the point of interest may be at least one of a tourist attraction, restaurant, park, street, and the like. In embodiments, the GPS point may be used together with the second marker to provide at least one of the distance, direction, and time to a commercial item, and the like. In embodiments, the GPS point may be used together with the second marker to provide at least one of the distance, direction, and time to a mode of transportation, tracking of a mode of transportation, and the like. In these embodiments, the mode of transportation may comprise at least one of a train, subway, automobile, and the like. In embodiments of the system, the GPS point may be used together with the second marker to provide tracking of the point of interest. In embodiments, the GPS point may be used together with the second marker to provide tracking of a commercial item. In embodiments, the user feedback may be verbal input to the eyepiece. The second marker may be generated by various means. For example, the second marker may be based on processing of still and/or video images captured by the eyepiece to obtain positional information about the subject of an image. In embodiments, the second marker may be based on data obtained from an internet search, from scanning a QR code, a bar code, an object, and the like.
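The distance, direction, and time calculations that the GPS point and second marker are said to support can be sketched with standard great-circle formulas. This is a hedged illustration of one plausible computation, not the disclosed implementation; the function names and the constant-speed ETA model are assumptions.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees,
    clockwise from north) from a current GPS fix to a stored marker."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # Haversine distance
    a = math.sin((p2 - p1) / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

def eta_seconds(distance_m, speed_mps):
    """Time to the point of interest, assuming constant speed."""
    return distance_m / speed_mps if speed_mps > 0 else float("inf")
```

A second marker generated between GPS readings could be fed through the same routine to refine the reported distance and direction.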
In various embodiments, the eyepiece may comprise earphones used to provide enhanced hearing, where the user can hear his surroundings as well as additional audio. The audio may include game content, sports commentary, and the like. In embodiments, microphones and/or earbuds may play audio binaurally or otherwise. In embodiments, a hearing aid may be used together with the eyepiece. Bone-conduction earphones may allow the user to receive audio waves transmitted to the inner ear through the skull (thereby bypassing the user's eardrum). In embodiments, such an earphone may be used against the cheekbone just in front of the user's ear, or against other bones capable of conducting audio. The user can thus remain aware of and monitor his or her surroundings while hearing the audio. In embodiments, the earphones may also employ an acoustic laser, whereby the earphone emits sound waves through the use of a laser. In embodiments, the earphones may also employ devices that allow the user to experience increased volume and/or clarity of external sounds or of the sounds the earphones generate. In various embodiments, the earphones of the eyepiece may play and/or deliver audio obtained wirelessly from radio, audio obtained over the internet, and the like. In embodiments, the eyepiece may also deliver satellite-radio audio to the user.
In various embodiments, the eyepiece may comprise RF shielding for the brain or other parts of the user's body. In embodiments, any part of the eyepiece that emits an electromagnetic field may be shielded with a barrier made of conductive or magnetic material or other materials. In embodiments, the barrier may comprise sheet metal, metal mesh, metal foam, foamed materials, and the like. The holes in the shield or mesh may be much smaller than the wavelength of the radiation being blocked or of other radiation. In embodiments, the interior of the eyepiece and/or the eyepiece housing or other parts may be coated with a metallic ink or other material to provide shielding. Such a metal may be copper, nickel, or the like in very small particle form, and may be sprayed onto the housing. In further embodiments, the RF shield may be worn by the user to prevent various frequencies from reaching his or her brain, eyes, or other parts of the body.
In embodiments, an interactive head-worn eyepiece worn by a user may comprise an optical assembly through which the user views the surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, a radio component, and a shield, where the radio component emits electromagnetic radiation and the shield blocks the radiation from a portion of the eyepiece from radiating out of the eyepiece. In further embodiments, the shield may be positioned to protect the user from the radiation. Further, the shield may be positioned to protect both the user and others from the radiation. In embodiments, the shield may shield at least the user's brain, a portion of the user's head, another part of the user's body, and the like. In embodiments, the shield may be made of at least one of a conductive material, a magnetic material, sheet metal, metal mesh, a screen, and metal foam. In embodiments described herein, the shield may comprise holes smaller than the wavelength of the particular radiation, and the holes may be smaller than the wavelength of the radiation emitted from the eyepiece. In embodiments, the shield may comprise at least one of a metallic ink, a copper ink, a nickel ink, and the like. In embodiments, the shield may be coated on the interior of the eyepiece. In embodiments, the shield may be located in at least one of the temple arms of the eyepiece, the front frame of the eyepiece, and the like. In embodiments, the shield may be worn by the user.
In one example, the control aspects of the eyepiece may comprise the following in combination: sensors such as IR, heat, force, ozone, and carbon monoxide sensors as inputs; a microphone as an additional input device; voice commands issued through actions taken by the wearer; a head-mounted display as the interface in which commands may be reflected and controlled; an instruction application providing guidance while reducing the need to use one's hands, such as during maintenance and assembly; and a visual display providing feedback to the wearer based on the wearer's actions and the sensor inputs. For example, an automotive technician may be the wearer of the eyepiece, where the technician is using the eyepiece to assist in the maintenance of a vehicle. An instruction application that guides the technician through diagnosing a problem on the vehicle via instructions shown through the eyepiece may provide hands-free instruction and access to computer-based expertise. In addition, the application may provide the technician a guide through unfamiliar procedures. The eyepiece may also monitor various sensor inputs related to diagnostics and safety, such as IR, heat, force, ozone, and carbon monoxide sensors, such that the sensor inputs may be accessed by the instruction application and/or directly by the technician. The application may also provide a microphone through which voice commands can be received; a head-mounted display for presenting information, including 2D or 3D depictions of the portion of the vehicle under repair; timely feedback on the repair and its cost; and the like. In embodiments, the eyepiece may provide hands-free virtual assistance to technicians to aid them in the diagnosis and repair of a vehicle.
In one example, the control aspects of the eyepiece may comprise the following in combination: the eyepiece entering an "active state" (such as a "shopping" activity mode), for example when the user commands the eyepiece to enter shopping mode or the eyepiece senses that it is near a shopping area, perhaps even a shopping area of interest to the wearer as derived from a preference profile, which may be further refined as the eyepiece monitors and learns the wearer's shopping preferences on its own. Continuing the example, entering the active state (such as the shopping activity state while driving) may be combined with the following: an object detector as a sensing input or sensing device; a head-worn camera and/or gaze-detection system capturing input as a user action; eye movements as user movements or actions for controlling or initiating commands; a 3D navigation interface as an interface in which commands and control modes may be reflected; an e-commerce application on board the eyepiece as the application for coordinating command input and the user interface; a navigation system controller for communicating or connecting with external systems and devices; a vehicle navigation system as the external device to be controlled and/or interfaced with; an advertising facility as an external application for processing user actions against an advertising database; and a bulls-eye or target tracking system as feedback to help the wearer understand the shopping opportunities in their line of sight while driving. For example, the wearer may enter a shopping area while driving their car, and upon detecting the presence of the shopping area (such as by GPS, or by directly viewing targets through the integrated camera) the eyepiece may enter the "shopping active state" (such as enabled and/or approved by the wearer). The eyepiece may then use the object detector, through the head-worn camera, to detect and locate shopping opportunities such as billboards, storefronts, and the like. In addition, the gaze-detection system on the eyepiece may monitor where the wearer is looking and highlight to the wearer information about the target at the gaze position, such as current in-store promotions or special events. The wearer's eye movements may also be tracked, such as for changing the target of interest or for command input (e.g., a quick nod indicating a select command, a downward eye movement commanding a pointer to additional information, etc.). For example, the user's iris or retina may be tracked to provide control input. The eyepiece may invoke the projection of the 3D navigation interface to assist the wearer with information about their surroundings, and the e-commerce application to coordinate the shopping active state, such as by taking input from the wearer, providing output to the 3D navigation interface, interfacing with external devices and applications, and the like. The eyepiece may interface with the vehicle navigation system, for example through the navigation system controller, so that the vehicle navigation system may be included in the shopping experience. Alternatively, such as when the wearer is out of the car and wishes to have directions of travel provided to them, the eyepiece may use its own navigation system (such as replacing the vehicle's system or augmenting it). As part of the shopping active state, the eyepiece may interface with an external advertising facility, such as to provide current promotions from nearby merchants, special events, pop-up advertisements, and the like. The external advertising facility may also be connected with third-party advertisers, publishers, merchant supply organizations, and the like, which may contribute to the information provided to the wearer. In embodiments, by entering the active state the wearer may be provided feedback associated with that active state, such as, for the shopping active state, feedback in the form of information associated with identified targets.
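The sensing of a nearby shopping area by GPS that triggers the shopping active state could, in one simple reading, be a proximity test against known regions, gated on the wearer's approval as the passage notes. The region data, names, and flat-earth radius test below are purely illustrative assumptions.

```python
# Hypothetical sketch: entering a "shopping" active state when a GPS fix
# falls inside a known shopping region. Region coordinates are invented.

SHOPPING_REGIONS = [
    # (name, centre_lat, centre_lon, radius_in_degrees) -- toy flat-earth test
    ("downtown_mall", 40.7128, -74.0060, 0.01),
]

def active_state_for_fix(lat, lon, user_enabled=True):
    """Return 'shopping' if the fix is inside an approved region and the
    wearer has enabled the mode change, else 'idle'."""
    if not user_enabled:
        return "idle"
    for name, clat, clon, r in SHOPPING_REGIONS:
        if (lat - clat) ** 2 + (lon - clon) ** 2 <= r ** 2:
            return "shopping"
    return "idle"
```

A preference profile could extend `SHOPPING_REGIONS` with regions the eyepiece has learned the wearer favours.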
In an example, the control aspects of the eyepiece may comprise the following in combination: receipt of an email as the trigger event; inertial motion tracking as the user action capture input device; drag-and-drop and swipe movements made with a finger as user movements or actions for controlling or initiating commands; a navigable list as the interface in which commands may be reflected and controlled; information transfer as the type of application on the eyepiece able to respond to commands and input; communication or connection from the eyepiece interface to external systems and devices, such as a payment system; iris capture and recognition as an application of an external system or device; and the like. For example, the wearer may receive a bill via email, and the email may enter the eyepiece as an "event" for the wearer, such as by triggering an operating mode of the eyepiece with a visual and/or audible alert, launching an application on the eyepiece, and so on. The wearer may react to the email event through multiple control mechanisms, such as using finger-and-hand "drag-and-drop," swipes, and the like through a hand-gesture interface (e.g., through a camera on board the eyepiece and a hand-gesture application, where the wearer drags information in the email, or the email itself, into a file, an application, another email, etc.). The wearer may call up a navigable list of bills to be paid, and so on. The user may transfer information from the email (e.g., billing information, account numbers, payment amounts) through an eyepiece application to external systems and devices, such as a payment system for paying the bill. In embodiments, the eyepiece and/or the payment system may require authentication, such as through biometric authentication, for example fingerprint capture, iris capture and recognition, and the like.
In an example, the control aspects of the eyepiece may comprise a combination in which user action capture input devices provide instruction through the eyepiece to an external display device. For example, the wearer may wish to give a presentation to a group of individuals from a display made available through the eyepiece. The wearer may be assisted in manipulating content during the presentation by using a physical 3D or 2D mouse (e.g., with an inertial motion sensor, MEMS inertial sensor, ultrasonic 3D motion sensor, accelerometer, etc.), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like. The presentation may be viewable and manipulable through the eyepiece, but may also be exported in real time, such as through an external router connected to an external display device (e.g., a computer monitor, projector, display screen, TV screen, etc.). Thus, the eyepiece may provide the wearer a way to let others see what the wearer sees and controls through the eyepiece's control facilities, allowing the wearer to export multimedia events enabled through the eyepiece to other, non-eyepiece wearers.
In one example, the control aspects of the eyepiece may comprise the combination of an event/data feed and a sensing input/sensing device, such as where a security event is implemented with an additional acoustic sensor. A security alert may be sent to a soldier while acoustic sensors serve as input devices monitoring the sound content of the surrounding environment, the direction of gunfire, and the like. For example, a security alert is broadcast to all military personnel in a specific area, and on the alert the eyepiece activates an application that monitors an embedded acoustic sensor array, which analyzes incoming sounds to identify the type of sound source and the direction the sound came from. In embodiments, other events and/or data feeds, sensing inputs, and/or sensing devices, and the like as described herein may also be employed.
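The acoustic sensor array's identification of the direction a sound came from could rest on the classic far-field time-difference-of-arrival model for a microphone pair. The sketch below assumes that model and a two-microphone geometry, neither of which the text specifies.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def arrival_angle(delay_s, mic_spacing_m):
    """Direction of arrival (degrees from broadside) for a two-microphone
    array, from the inter-microphone time delay. Simple far-field model:
    sin(theta) = c * delay / spacing."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical/physical overshoot
    return math.degrees(math.asin(s))
```

A larger array would repeat this over several microphone pairs and combine the estimates; classifying the *type* of sound source would need a separate model entirely.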
In one example, the control aspects of the eyepiece may comprise the combination of an event/data feed and a user action capture input/device, such as a camera additionally used in response to a request. A soldier may be located at a position of interest and be sent a request for photos or video from their position, such as where the request includes instructions for what to photograph. For example, a soldier is at a checkpoint, and at some central command post it may be determined that an individual of interest may attempt to pass through that checkpoint. The central command post may then instruct the eyepiece users near the checkpoint to record and upload images and video, which in embodiments may be carried out automatically, without the soldier having to manually turn on the camera. In embodiments, other events and/or data feeds, user action capture inputs and/or devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and user movements or actions for controlling or initiating commands, such as a soldier entering an "active state" while hand gestures are used for control. A soldier may enter an active state in preparation for engagement with the enemy, the soldier using hand gestures to silently command the eyepiece in a command-and-control environment. For example, the soldier may suddenly enter an area determined to be enemy territory on the basis of newly received information, and this new information places the eyepiece in a heightened alert state. In this case silence may be a requirement, and so the eyepiece transitions into a hand-gesture command mode. In embodiments, other events and/or data feeds, user movements or actions for controlling or initiating commands, and the like as described herein may also be employed.
In one example, the control aspects of the eyepiece may comprise the combination of an event/data feed and an interface in which command/control modes may be reflected, such as a user entering a particular type of environment and a virtual touch screen. A soldier may enter a weapon-system area, and a virtual touch screen may become part of how the wearer controls the weapon system. For example, a soldier enters a weapons vehicle, and the eyepiece, detecting the presence of the weapon system and the soldier's authorization to use the weapon, brings up a virtual fire-control interface with a virtual touch screen. In embodiments, other events and/or data feeds, interfaces in which commands and/or control modes may be reflected, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and an on-platform application able to respond to commands and input, such as a security event for a pilot combined with easy access to information. A squadron pilot (or whoever is responsible for the flight check of an unmanned aircraft) may receive a security event notification when they are near the aircraft before takeoff, and may bring up an application that walks them through the preflight check. For example, a drone expert preparing a nearby drone for launch is shown an interactive checklist procedure through the eyepiece. In addition, a communication channel may be opened to the drone's operator so that they can be included in the preflight check. In embodiments, other events and/or data feeds, on-platform applications able to respond to commands and/or input, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and communication or connection from the platform interface to external systems and devices, such as a soldier arriving at a location and a graphical user interface (GUI). A soldier may enter a location where they are required to interact with external devices, and the external devices may be interfaced through a GUI. For example, a soldier enters a military transport vehicle and is presented with a GUI that walks the user through an interactive interface indicating what they need to do at the different stages of the transport. In embodiments, other events and/or data feeds, communications or connections from the platform interface to external systems and devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and a useful external device to be controlled, such as a feed of instructions together with a weapon system. A soldier may be provided a feed of instructions or commands, at least one of which relates to the control of an external weapon system. For example, a soldier may operate an artillery piece, and the eyepiece not only provides them with information on the performance of, and the procedures associated with, the weapon, but also a feed of instructions, corrections, and the like associated with aiming it. In embodiments, other events and/or data feeds, useful external devices to be controlled, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and an application of a useful external device, such as a security event/feed and biometric capture/identification. A soldier may be sent a security event notification (such as through a secure feed) to capture the biometric characteristics of a particular individual (fingerprints, an iris scan, a gait profile), where the biometric characteristics are stored, evaluated, analyzed, and the like by an external biometrics application (such as one provided from a server/cloud on a secure military network). In embodiments, other events and/or data feeds, applications of external devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of an event/data feed and feedback to the soldier related to external devices and applications, such as entering an active state and the soldier being provided an informational display. A soldier may place the eyepiece into an active state, such as for mobilization, preparation, action, debriefing, and the like, and as feedback for entering the active state the soldier receives an informational display about the entered state. For example, a soldier enters the assembly stage of a mission, where the eyepiece pulls from a remote server information on the portion of the mission the soldier must complete during assembly, including assigned equipment, additional training, and the like. In embodiments, other events and/or data feeds, feedback related to external devices and/or applications, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and a user action capture input/device, such as using an inertial motion sensor with a head-tracking system. A soldier's head motion may be tracked by inertial motion sensors in the eyepiece, such as for nod control of the eyepiece, line-of-sight sensing by the eyepiece, and the like. For example, a soldier may be aiming a weapon system, and the eyepiece, sensing the gaze direction of the soldier's head through the inertial motion sensors, provides continuous aiming of the weapon. Further, the weapon system may move continuously in response to the soldier's gaze direction, and thus remain constantly ready to fire on the target. In embodiments, other sensing inputs and/or sensing devices, user action capture inputs and/or devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and user movements or actions for controlling or initiating commands, such as using an optical sensor with eye-closing movements, blinks, and the like. The state of the soldier's eye may be sensed by an optical sensor included in the optical train of the eyepiece, such as for controlling the eyepiece with eye movements. For example, a soldier may aim their rifle, where the rifle has the capability to fire on a control command from the eyepiece (such as in the case of a sniper, where initiating the command through the eyepiece may reduce the aiming error caused by manually pulling the trigger). The soldier may then issue the command to fire the weapon through the optical sensor detecting a predetermined eye movement (such as one kept in a command profile retained on the eyepiece). In embodiments, other sensing inputs and/or sensing devices, user movements or actions for controlling or initiating commands, and the like as described herein may also be employed.
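Matching a predetermined eye movement against a command profile retained on the eyepiece, as in the sniper example, might amount to comparing a stream of detected eye events against stored sequences. The event names and profile entries below are invented for illustration; they are not from the disclosure.

```python
# Hypothetical sketch: matching detected eye events (as an optical sensor
# might report them) against a stored command profile.

COMMAND_PROFILE = {
    ("blink", "blink", "close_long"): "fire_weapon_confirm",
    ("look_down", "blink"):           "show_menu",
}

def match_command(events, profile=COMMAND_PROFILE):
    """Return the command whose event sequence ends the event stream,
    or None when no stored sequence matches."""
    for pattern, command in profile.items():
        if tuple(events[-len(pattern):]) == pattern:
            return command
    return None
```

A deliberately awkward sequence for the fire command (double blink, then a long close) is one way such a profile could guard against accidental triggering.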
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and an interface in which command/control modes may be reflected, such as using a proximity sensor with a robot control interface. A proximity sensor integrated into the eyepiece may be used to sense the soldier's proximity to the robot control interface so as to activate and enable use of the robot. For example, as a soldier approaches a bomb-disposal robot, the robot automatically activates and initializes a configuration for that specific soldier (e.g., configured for the soldier's preferences). In embodiments, other sensing inputs and/or sensing devices, interfaces in which commands and/or control modes may be reflected, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and an on-platform application able to respond to commands and input, such as using an audio sensor with a music/sound application. The audio sensor may monitor ambient sound and start and/or adjust the volume of music, of the ambient sound, of noise cancellation, and the like, to help counter unwanted ambient sound. For example, a soldier is loaded onto a transport vehicle whose engine is initially off. During this time the soldier may have no duties other than to rest, so they turn on music to help them rest. When the transport's engine starts, the music/sound application adjusts the volume and/or starts additional noise-cancelling audio to help keep the music input the same as before the engine came on. In embodiments, other sensing inputs and/or sensing devices, on-platform applications able to respond to commands and/or input, and the like as described herein may also be employed.
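The music/sound application's adjustment against ambient noise could be modeled as a simple gain proportional to the measured ambient level above a quiet baseline. The baseline, gain constant, and clamping below are assumptions for illustration, not the disclosed behavior.

```python
def adjusted_volume(base_volume, ambient_db, quiet_db=40.0, gain_per_db=0.02):
    """Raise playback volume (0..1 scale) in proportion to ambient noise
    above a quiet baseline, so music stays audible when an engine starts.
    Assumed parameters: 40 dB baseline, +0.02 volume per dB above it."""
    boost = max(0.0, ambient_db - quiet_db) * gain_per_db
    return min(1.0, base_volume + boost)
```

In practice such an application would likely smooth the ambient measurement over time to avoid pumping the volume on brief noises.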
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and communication or connection from the platform interface to external systems and devices, such as using a passive IR proximity sensor with an external digital signal processor. A soldier may monitor a night scene with the passive IR proximity sensor; when the sensor indicates motion, the eyepiece initiates a connection to the external digital signal processor to help identify targets from the proximity-sensor data. In addition, an IR imaging camera may be activated to contribute additional data to the digital signal processor. In embodiments, other sensing inputs and/or sensing devices, communications or connections from the platform interface to external systems and devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and a useful external device to be controlled, such as using an acoustic sensor with a weapon system, where the eyepiece worn by the soldier senses a loud sound (such as an explosion or a gunshot) and the eyepiece initiates control of the weapon system for possible action against the target associated with the source of the sound. For example, a soldier is on guard duty and hears a gunshot. The eyepiece may detect the direction of the gunshot and direct the soldier toward the position from which the shot was fired. In embodiments, other sensing inputs and/or sensing devices, useful external devices to be controlled, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and an application of a useful external device, such as using a camera with an external application to obtain instructions. A camera embedded in a soldier's eyepiece may view a sign indicating available targets through icons, and the eyepiece may access the external application to obtain instructions. For example, a soldier is delivered to an assembly area, and on entering, the eyepiece camera views an icon, accesses the instructions externally, and provides the soldier with instructions for what to do, where the steps may be automatic such that the instructions are provided without the soldier being aware of the icon. In embodiments, other sensing inputs and/or sensing devices, applications of external devices, and the like as described herein may also be employed.
In an example, the control aspects of the eyepiece may comprise the combination of a sensing input/sensing device and feedback to the user related to external devices and applications, such as using a GPS sensor with a visual display from a remote application. A soldier may have an embedded GPS sensor that sends/streams position coordinates to a remote location facility/application, which in turn sends/streams a visual display of the surrounding physical environment to the eyepiece for display. For example, the soldier may continuously view the surroundings through the eyepiece, and through the embedded GPS sensor the eyepiece continuously streams and renders a visual overlay that gives the soldier an augmented-reality view of the surroundings even as the soldier's position changes. In embodiments, other sensing inputs and/or sensing devices, feedback related to external devices and/or applications, and the like as described herein may also be employed.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and a user movement or action for controlling or initiating a command, such as a body movement sensor (e.g., a motion sensor) used with an arm motion. A soldier may have a body movement sensor attached to their arm, where the motion of the arm conveys a command. For example, the soldier may have a motion sensor on their arm whose motion is replicated in an aircraft landing light system, so that signals conventionally made with handheld lights by landing personnel can be made larger and more visible. In embodiments, as described herein, other user action capture inputs and/or devices, user movements or actions for controlling or initiating commands, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and a command/control mode and interface in which the input can be reflected, such as a wearable sensor set and a user interface based on predictive learning. A soldier may wear a set of sensors, where data from the sensor set is continuously collected and fed through a learning-based user interface to a machine learning facility, and where the soldier may accept, reject, modify, and so on what has been learned from their actions and behaviors. For example, where a soldier generally performs the same tasks in the same physical way every Monday morning, the machine learning facility may establish a learned routine that is offered to the soldier on subsequent Monday mornings, such as prompts to clean particular equipment, fill out a certain form, play particular music, meet with a particular person, and the like. In addition, the soldier may revise the results of the learning, such as through direct editing of the routines in a learned behavior profile. In embodiments, as described herein, other user action capture inputs and/or devices, command and/or control modes and interfaces in which the input can be reflected, and the like may also be applied.
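As a toy sketch of the learned-routine idea (all names and thresholds here are illustrative assumptions, not the patent's mechanism): count how often an activity recurs in the same weekly time slot, surface it as a prompt once it has repeated often enough, and let the user veto prompts by editing the learned profile.

```python
from collections import Counter

class RoutineLearner:
    """Minimal learned-routine sketch: tally (weekday, hour, activity)
    observations and offer prompts for activities seen often enough,
    honoring any prompts the user has rejected."""

    def __init__(self, min_repeats: int = 3):
        self.min_repeats = min_repeats
        self.counts = Counter()   # (weekday, hour, activity) -> times observed
        self.rejected = set()     # prompts the user has vetoed

    def observe(self, weekday: str, hour: int, activity: str) -> None:
        self.counts[(weekday, hour, activity)] += 1

    def reject(self, weekday: str, hour: int, activity: str) -> None:
        # A "direct edit" of the learned behavior profile.
        self.rejected.add((weekday, hour, activity))

    def prompts(self, weekday: str, hour: int) -> list:
        return sorted(
            act for (day, hr, act), n in self.counts.items()
            if day == weekday and hr == hour
            and n >= self.min_repeats and (day, hr, act) not in self.rejected
        )

learner = RoutineLearner()
for _ in range(3):                         # three Mondays in a row
    learner.observe("Mon", 7, "clean equipment")
    learner.observe("Mon", 7, "fill out form")
learner.observe("Mon", 7, "play music")    # seen only once: not yet a routine
learner.reject("Mon", 7, "fill out form")  # user edits the learned profile
print(learner.prompts("Mon", 7))           # -> ['clean equipment']
```

A production system would of course use a real learning model over continuous sensor data; the tally-and-threshold loop only shows the accept/reject/modify cycle the paragraph describes.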
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and an on-platform application that can apply commands and/or respond to input, such as a finger-mounted camera and a video application. A soldier may control the direction in which the embedded camera of the eyepiece captures video through an on-board video application. For example, a soldier may view a battle scene in which they must look in one direction (such as staying alert to new developments in an engagement) while simultaneously filming in a different direction (such as the current point of engagement). In embodiments, as described herein, other user action capture inputs and/or devices, on-platform applications that can apply commands and/or respond to input, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and a communication or connection from the platform interface to external systems and devices, such as microphone and speech recognition input used with a steering wheel control interface. A soldier may be able to change handling aspects of a vehicle via a voice command, which is received by the eyepiece and delivered to the vehicle's steering wheel control interface (such as through wireless communication between the eyepiece and the steering wheel control interface). For example, the soldier is driving a vehicle on a highway, and so the vehicle has particular handling characteristics desirable for highway driving. But the vehicle also has other modes for driving in different conditions, such as off-road, in snow, in mud, in heavy rain, when pursuing another vehicle, and the like. In this example, the soldier may be able to change modes by voice command as driving conditions change. In embodiments, as described herein, other user action capture inputs and/or devices, communications or connections from the platform interface to external systems and devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and a useful external device to be controlled, such as microphone and speech recognition input used with an automobile dashboard interface device. A soldier may use voice commands to control the devices associated with a vehicle's dashboard, such as heating and ventilation, radio, music, headlights, the trip computer, and the like. For example, a soldier may be performing a task, driving a vehicle across rough terrain, where manually controlling the dashboard devices would force them to take a hand off the steering wheel. In this example, the soldier may control the dashboard devices through voice control of the eyepiece. Voice commands through the eyepiece may be particularly useful relative to voice control through a dashboard microphone system, because a military vehicle may be immersed in a very loud acoustic environment; using the microphone in the eyepiece under such conditions may thus provide substantially improved performance. In embodiments, as described herein, other user action capture inputs and/or devices, useful external devices to be controlled, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and an application of a useful external device, such as a joystick device used with an external entertainment application. A soldier may access a game controller and play games through an external entertainment application, such as a multiplayer game hosted on a network server. For example, the soldier may be experiencing downtime during deployment, and they access a joystick device that docks with the eyepiece, the eyepiece in turn docking with external entertainment equipment at the base. In embodiments, the soldier may network with other military personnel on the network. The soldier may have stored preferences, profiles, and the like associated with game play. The external entertainment application may manage game play according to the soldier's deployment, current readiness, required readiness, past history, ability rating, command post position, rank, geographic location, future deployment, and the like. In embodiments, as described herein, other user action capture inputs and/or devices, applications of external devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user action capture input/device and feedback to the user related to external devices and applications, such as an activity determination system used with a tone output or audible alert. A soldier may use the eyepiece to access an activity determination system that monitors and determines the soldier's activity state, such as when the soldier is in extreme activity, at rest, bored, anxious, or exercising, where the eyepiece may provide a tone output or audible alert when conditions exceed limits in any way (such as preset, learned, or typical limits). For example, the soldier's current health state may be monitored during combat, and when that state enters a danger level, the soldier and/or another individual (e.g., a doctor, hospital personnel, another member of the soldier's team, a command center, etc.) may be provided with an audible signal, such as an indication that the soldier has been wounded in combat. Others may thus be alerted to the soldier's injury and may be able to attend to it more efficiently. In embodiments, as described herein, other user action capture inputs and/or devices, feedback related to external devices and/or external applications, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and a command/control mode and interface in which the input can be reflected, such as a clenched fist and a navigable list. A soldier may use gestures, such as a tightly clenched fist, to call up a navigable list displayed as content projected by the eyepiece. For example, the eyepiece camera may view the soldier's hand gesture, recognize and identify the gesture against a database of predetermined gesture-to-command mappings, and execute the command. In embodiments, hand gestures may include postures of the hand, fingers, arms, legs, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, command and/or control modes and interfaces in which the input can be reflected, and the like may also be applied.
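The recognize-then-execute step can be sketched as a simple table lookup. The gesture names and command strings below are hypothetical stand-ins for the predetermined gesture-to-command database described above.

```python
# Hypothetical gesture names and commands; a real eyepiece would load
# this mapping from its gesture-to-command database.
GESTURE_COMMANDS = {
    "clenched_fist": "show_navigable_list",
    "open_palm": "dismiss_display",
    "foot_tap": "trigger_rangefinder",
}

def execute_gesture(gesture: str) -> str:
    """Look the recognized gesture up in the command table and return
    the command to execute, or a no-op for unrecognized gestures."""
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(execute_gesture("clenched_fist"))  # -> show_navigable_list
print(execute_gesture("wave"))           # -> no_op
```

Falling back to a no-op for unknown gestures keeps accidental hand movements from triggering commands.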
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and an on-platform application that can apply commands and/or respond to input, such as a nod and an information display. A soldier may call up an information display application with gestures such as a head shake, an arm motion, a leg motion, an eye motion, and the like. For example, the soldier may wish to access an application, database, network connection, or the like through the eyepiece, and may call up a display application as part of a graphical user interface with a nod of the head (such as sensed by a motion detector in the eyepiece, on the soldier's head, on the soldier's helmet, and so on). In embodiments, as described herein, other user movements or actions for controlling or initiating commands, on-platform applications that can apply commands and/or respond to input, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and a communication or connection from the platform interface to external systems and devices, such as an eye blink used with an API to an external application. A soldier may call up an application programming interface with an eye blink, a nod of the head, a movement of an arm or leg, and the like, in order to access an external application. For example, the soldier may access an external application through an API embedded in the eyepiece facility, and do so with a blink of the eyes (detected, for instance, through the optical monitoring capability of the eyepiece's optical system). In embodiments, as described herein, other user movements or actions for controlling or initiating commands, communications or connections from the platform interface to external systems and devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and an external device to be controlled, such as accessing an external rangefinder device through a tap of the foot. A soldier may have a sensor that detects the motion of the soldier's foot (such as a motion sensor on their shoe), and the soldier uses a foot motion (such as a tap of the foot) to operate the external rangefinder to determine the distance to an object (such as an enemy target). For example, the soldier may be aiming a weapons system, using both hands in the process. In this case, issuing commands to the eyepiece by way of foot actions allows "hands-free" command entry. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, useful external devices to be controlled, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and an application of a useful external device, such as making signs with the hand and a messaging application. A soldier may use signs formed with the hand to trigger the sharing of information through an external information transfer application (such as an external information feed, a photo/video sharing application, a text application, etc.). For example, a soldier opens the embedded camera with a hand signal and shares a video stream with another person, shares it to storage, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, applications of external devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a user movement or action for controlling or initiating a command and feedback to the soldier related to external devices and applications, such as a jolt of the head used with an audible alert. A soldier may wear an eyepiece equipped with an accelerometer (or a similar sensor usable to detect g-forces on the head), where, when the soldier experiences a dangerously high g-force jolt to the head, an audible alert is heard as feedback to the user, such as determined as part of an application on the eyepiece or as part of an application apart from the eyepiece. In addition, the accelerometer output may be recorded and stored for analysis. For example, the soldier may experience a head jolt produced by a nearby explosion, and the eyepiece may sense and record the sensor data associated with the jolt. Further, a danger-level jolt may trigger automatic actions by the eyepiece, such as transmitting an alert to other soldiers and/or to a command center, beginning to monitor and/or transmit the soldier's health information from other worn sensors, providing the soldier with audible instructions relevant to their possible injury, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, feedback related to external devices and/or external applications, and the like may also be applied.
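The record-then-alert flow can be sketched as follows; the g-force threshold and the action names are illustrative assumptions only, not clinical or military values.

```python
DANGER_G = 8.0  # hypothetical jolt threshold; real concussion criteria differ

class HeadJoltMonitor:
    """Minimal sketch of the accelerometer flow described above: log
    every reading for later analysis, and when a jolt exceeds the
    danger threshold, return the automatic actions to trigger."""

    def __init__(self, danger_g: float = DANGER_G):
        self.danger_g = danger_g
        self.log = []  # every reading is stored for later analysis

    def sample(self, g_force: float) -> list:
        self.log.append(g_force)
        if g_force < self.danger_g:
            return []
        # Danger-level jolt: alert the wearer, notify others, and start
        # streaming health-sensor data, as the example above describes.
        return ["audible_alert", "notify_command_center", "stream_health_sensors"]

monitor = HeadJoltMonitor()
print(monitor.sample(1.1))   # normal movement -> []
print(monitor.sample(12.5))  # blast-level jolt -> alert actions
print(len(monitor.log))      # both readings recorded -> 2
```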
In an example, the control aspect of the eyepiece may comprise a combination of a command/control mode and interface in which the input can be reflected and an on-platform application that can apply commands and/or respond to input, such as a graphical user interface used with the various applications residing on the eyepiece. The eyepiece may provide a graphical user interface presenting applications to the soldier for selection. For example, the soldier may have a graphical user interface projected by the eyepiece that offers applications in different domains, such as military, personal, civic, and the like. In embodiments, as described herein, other command/control modes and interfaces in which the input can be reflected, on-platform applications that can apply commands and/or respond to input, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a command/control mode and interface in which the input can be reflected and a communication or connection from the platform interface to external systems and devices, such as a 3D navigation eyepiece interface applied through a navigation system controller interface to an external system. The eyepiece may enter a navigation mode and connect to an external system through the navigation system controller interface. For example, a soldier conducting military maneuvers calls up a pre-loaded 3D image of the surrounding terrain through the eyepiece navigation mode, and the eyepiece automatically connects to an external system to obtain updates, current objects of interest (such as overlaid from satellite imagery), and the like. In embodiments, as described herein, other command/control modes and interfaces in which the input can be reflected, communications or connections from the platform interface to external systems and devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a command/control mode and interface in which the input can be reflected and an external device to be controlled, such as an augmented reality interface used with an external tracking device. A soldier's eyepiece may enter an augmented reality mode and pair with an external tracking device to overlay, through the augmented reality display, information relevant to the position of an object or person being tracked. For example, the augmented display mode may include a 3D map, and a person's position as determined by the external tracking device may be overlaid on the map, displaying a track as the tracked person moves. In embodiments, as described herein, other command/control modes and interfaces in which the input can be reflected, useful external devices to be controlled, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a command/control mode and interface in which the input can be reflected and an application of an external device, such as a translucent display mode used with a simulation application. The eyepiece may be placed into a translucent display mode to enhance the display of a simulation application to the soldier. For example, a soldier is preparing for a mission, and before entering the battlefield the soldier is provided a simulation of the mission environment; since there is no need for the user to view the real surrounding environment during the simulation, the eyepiece is placed into the translucent display mode. In embodiments, as described herein, other command/control modes and interfaces in which the input can be reflected, applications of external devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a command/control mode and interface in which the input can be reflected and feedback to the user related to external devices and applications, such as an audible command interface used with tone output feedback. A soldier may place the eyepiece into an audible command interface mode, and the eyepiece responds with a tone output as feedback from the system that the eyepiece is ready to receive audible commands. For example, the audible command interface may reside at least partly at an external location (such as externally on a network), and the tone is provided once the whole system is ready to accept audible commands. In embodiments, as described herein, other command/control modes and interfaces in which the input can be reflected, feedback related to external devices and/or external applications, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an on-platform application that can apply commands and/or respond to input and a communication or connection from the platform interface to external systems and devices, such as a communications application used with a network router, where the soldier may open the communications application and the eyepiece automatically searches for a network router to find a connection to the network facility. For example, the soldier is with their unit on the battlefield, and a new encampment is established. Once the communications facility is established, the soldier's eyepiece can connect over a secure wireless connection. In addition, once the communications facility is established, the eyepiece may alert the soldier even if the soldier has not yet attempted to communicate. In embodiments, as described herein, other on-platform applications that can apply commands and/or respond to input, communications or connections from the platform interface to external systems and devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an on-platform application that can apply commands and/or respond to input and a useful external device to be controlled, such as a video application used with an external camera. A soldier may dock with a deployed camera, such as one used to monitor the battlefield. For example, a mobile deployable camera may be dropped from an aircraft, and the soldier then has a connection to the camera through the eyepiece video application. In embodiments, as described herein, other on-platform applications that can apply commands and/or respond to input, useful external devices to be controlled, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an on-platform application that can apply commands and/or respond to input and an application of an external device, such as a search application on the eyepiece used with an external search application. The search application on the eyepiece may be augmented with an external search application. For example, the soldier may search for the identity of an individual being questioned, and when the search on the eyepiece yields no result, the eyepiece connects to an external search facility. In embodiments, as described herein, other on-platform applications that can apply commands and/or respond to input, applications of external devices, and the like may also be applied.
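The local-then-external fallback described above amounts to a short conditional; the index shape and the callable standing in for the external search facility are assumptions for illustration.

```python
def search_identity(query, local_index, external_search):
    """Try the on-eyepiece index first; only when it yields nothing,
    fall back to the external search facility (here a plain callable)."""
    hits = local_index.get(query, [])
    if hits:
        return ("local", hits)
    return ("external", external_search(query))

# Hypothetical local index and external facility for demonstration:
local_index = {"known_contact": ["record #17"]}
external = lambda q: [f"external record for {q}"]

print(search_identity("known_contact", local_index, external))
print(search_identity("unknown_person", local_index, external))
```

Tagging each result with its source lets the interface tell the wearer whether the answer came from on-board data or from the external facility.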
In an example, the control aspect of the eyepiece may comprise a combination of an on-platform application that can apply commands and/or respond to input and feedback to the soldier related to external devices and applications, such as an entertainment application used with performance indicator feedback. The entertainment application may serve as a relaxation facility for a soldier who needs rest but may be anxious for other reasons, with the performance feedback designed for the soldier in a given setting, such as when they need rest but must remain ready for rapid deployment, or during idle hours when attention is declining and needs to be brought back. For example, the soldier may be on a transport and about to enter an engagement. In this example, the entertainment application may be an action-thinking game to improve attention and motivation, where the performance indicator feedback is designed to maximize the soldier's expectation of thinking through problems and reaching conclusions in a fast and efficient way. In embodiments, as described herein, other on-platform applications that can apply commands and/or respond to input, feedback related to external devices and/or external applications, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a communication or connection from the platform interface to external systems and devices and an external device to be controlled, such as a processor interface on the eyepiece to an external facility used with an external projector. The eyepiece processor may connect to an external projector so that others can view content available to the eyepiece. For example, a soldier may be in the field and access content that they need to share with others who are not wearing eyepieces (such as non-military individuals). In this example, the soldier's eyepiece may be able to dock with an external projector and feed content from the eyepiece to the projector. In embodiments, the projector may be a pocket projector, a projector in a vehicle, a projector in a conference room, a remotely located projector, and the like. In embodiments, a projector may also be integrated into the eyepiece so that content can be projected externally from the integrated projector. In embodiments, as described herein, other communications or connections from the platform interface to external systems and devices, useful external devices to be controlled, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a communication or connection from the platform interface to external systems and devices and an application of an external device, such as an audio system controller interface used with an external audio system. A soldier may connect the audio portion of the eyepiece facility (e.g., music, voice playback, audio network files, etc.) to an external audio system. For example, the soldier may be able to route a communication being received through the eyepiece to a vehicle sound system so that others can hear it. In embodiments, as described herein, other communications or connections from the platform interface to external systems and devices, applications of external devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of a communication or connection from the platform interface to external systems and devices and feedback to the soldier related to external devices and applications, such as a hoist controller interface used with status feedback. A soldier may access a controlled mechanism through a digital hoist controller interface, where the mechanism provides the user with feedback about its state. For example, a soldier clearing a roadblock may have a lifting mechanism on their vehicle, and the soldier may dock directly with the lifting mechanism through the eyepiece. In embodiments, as described herein, other communications or connections from the platform interface to external systems and devices, feedback related to external devices and/or external applications, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an external device to be controlled and an application of that external device, such as a storage-enabled device used with an automated backup application. A soldier in the field may be provided with a data storage facility and an associated automated backup application. For example, the storage facility may be located in a military vehicle so that data from multiple soldiers' eyepieces can be backed up to the vehicle, especially when no network link is available for downloading to a remote backup site. The storage device may be associated with an encampment, associated with a subset of soldiers in the field (e.g., a squad), located on the soldiers themselves, and the like. In embodiments, the local storage facility may upload the backup when a network communication connection becomes available. In embodiments, as described herein, other useful external devices to be controlled, applications of external devices, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an external device to be controlled and feedback to the soldier related to external devices and applications, such as an external payment system used with feedback from that system. A soldier may access a military-administered payment system, where the system provides feedback to the soldier (e.g., receipts, account balance, account activity, etc.). For example, the soldier may make a payment to a vendor through the eyepiece, where the eyepiece exchanges data, authorizations, funds, etc. with the external payment system, and the payment system provides feedback data to the soldier. In embodiments, as described herein, other useful external devices to be controlled, feedback related to external devices and/or external applications, and the like may also be applied.
In an example, the control aspect of the eyepiece may comprise a combination of an application of an external device and feedback to the soldier related to external devices and applications, such as displaying information from an external 3D map rendering facility together with display feedback. A soldier may have 3D map information displayed through the eyepiece, where the map facility may provide feedback to the soldier according to past information sent, past requests, requests from others in the region, changes associated with the geographic area, and the like. For example, the soldier may receive a 3D map rendering from the external application, where the external application also provides the 3D map rendering to at least a second soldier in the same geographic area. The soldier may then receive feedback related to the second soldier from the external facility, such as a depiction of the second soldier's position on the 3D rendering, identity information, movement history, and the like. In embodiments, as described herein, other applications of external devices, feedback related to external devices and/or external applications, and the like may also be applied.
In embodiments, the eyepiece may provide the user with various forms of guidance in response to a medical condition. As a first example, a user may use the eyepiece to simulate, for training purposes, medical conditions that may arise in combat, in training, on duty, or off duty. The simulation may be tailored to medical professionals or to non-medical personnel.
As an example, a low-level combat soldier may use the eyepiece to view a medical simulation as part of a training module and be trained to respond to medical conditions on the battlefield. The eyepiece may provide an augmented environment in which the user views wounds overlaid on another soldier to simulate wounds common to, or likely to be found on, the battlefield. The soldier may then be prompted through the user interface to respond to the presented situation. The user may be given step-by-step instructions for a sequence of actions to provide emergency medical aid in the field, or the user may act in response to the situation, with those actions then corrected until an appropriate response is presented.
Similarly, the eyepiece may provide a training environment for medical professionals. The eyepiece may present the user with conditions or situations requiring a medical response for the purpose of training medical professionals. The eyepiece may simulate common battlefield scenarios for which its user must master the essential appropriate responses and lifesaving skills.
As an example, the user may be presented with an augmented reality of a wounded soldier, where the soldier's body has a bullet wound. The medical professional may then carry out the steps he believes are the appropriate response to the situation, select from the eyepiece's user interface the steps he believes are proper for the situation, input steps into the eyepiece's user interface, and so on. The user may carry out the response through the use of sensors and/or input devices, or he may input the steps of his response into the user interface via eye movement, hand gesture, and the like. Similarly, he may select appropriate steps presented to him through the user interface via eye movement, hand gesture, and the like. As actions are carried out and the user makes treatment decisions, additional guidance and instruction may be presented to the user according to his performance. For example, if the user is presented with a soldier with a bullet wound to the chest, and the user begins to raise the soldier into a dangerous position, the user may be warned or prompted to change his course of treatment. Alternatively, the user may be prompted with the steps for carrying out the proper procedure. In addition, a trainee may be presented with an example of the wounded soldier's medical record within the training scenario, where the user may base his decisions at least in part on the content of the record. In embodiments, the user's actions and performance may be recorded and/or filed by the eyepiece for further review and instruction after the training session is paused or otherwise stopped.
In embodiments, the eyepiece may provide the user with various forms of guidance in response to an actual medical condition in combat. As an example, when a doctor cannot be present immediately, an untrained soldier may be prompted with step-by-step lifesaving instructions for a comrade in that condition. When a comrade is wounded, the user may input the type of injury, the eyepiece may detect the injury, or a combination of the two may occur. At that point, the user may be provided with lifesaving instructions for treating the wounded soldier. Such instructions may be presented in augmented reality form as a step-by-step program of instructions for the user. In addition, the eyepiece may provide the user with enhanced visual aids regarding the location of vital organs near the wounded soldier's injury, anatomical overlays on the soldier's body, and the like. Further, the eyepiece may record video of the situation, and the recording may then be sent back to a doctor away from the field or en route to it, allowing the doctor to coach the untrained user through lifesaving skills appropriate to the battlefield. In addition, the wounded soldier's eyepiece may send vital information (such as information about the wounded soldier collected by integrated or associated sensors) to the eyepiece of the soldier giving treatment, which forwards it to the doctor, or the information may be sent straight to a doctor at a remote location, so that the treating soldier can provide medical aid to the wounded soldier according to the information collected from the wounded soldier's eyepiece.
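The step-by-step presentation described here can be sketched as a simple instruction stepper that advances when the user confirms a step; the instruction strings below are placeholder text for illustration, not a clinical protocol.

```python
# Placeholder instruction text only; real content would come from a
# vetted medical protocol for the detected injury type.
FIRST_AID_STEPS = [
    "Apply direct pressure to the wound",
    "Apply tourniquet above the wound if bleeding continues",
    "Check airway and breathing",
    "Monitor vitals until evacuation",
]

class StepByStepGuide:
    """Present one instruction at a time, advancing on confirmation."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0

    def current(self):
        # Returns None once every step has been confirmed.
        return self.steps[self.index] if self.index < len(self.steps) else None

    def confirm(self):
        if self.index < len(self.steps):
            self.index += 1
        return self.current()

guide = StepByStepGuide(FIRST_AID_STEPS)
print(guide.current())   # first instruction
guide.confirm()
print(guide.current())   # second instruction
```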
In other embodiments, as conditions on the battlefield present themselves, a trained doctor may use the eyepiece to provide an anatomical overlay of a soldier's body so that he can respond more appropriately to the situation at hand. By way of example only, and without limiting the present invention, if a wounded soldier is bleeding from a bullet wound to the lower leg, the user may be presented with an augmented reality view of the soldier's arteries so that the user can determine whether an artery has been hit and how severe the injury is. The user may be presented, through the eyepiece, with the appropriate protocol for a given wound so that he can review each step in the course of treatment. Such protocols may also be presented to the user in augmented reality, video, audio, or other formats. The eyepiece may provide the doctor with a protocol in the form of step-by-step augmented reality instructions. In embodiments, an augmented reality overlay of the wounded soldier's organs may also be presented to the user to guide the doctor through any procedure, so that the doctor does not cause additional injury to the soldier's organs in the course of treatment. Further, the eyepiece may provide the user with enhanced visual aids regarding the position of vital organs near the wounded soldier's injury, an anatomical overlay of the soldier's body, and the like.
In embodiments, the eyepiece may be used to scan the retina of a wounded soldier to retrieve his medical records in the field. This may alert the doctor to possible drug allergies, or provide other vital details of benefit during a medical procedure.
In addition, if the wounded soldier is wearing an eyepiece, the device may transmit information including the wounded soldier's heart rate, blood pressure, respiration, and the like to the doctor's glasses. The eyepiece may also help the user observe the soldier's gait to determine whether the soldier has a traumatic brain injury, and it may help the user determine the location of bleeding or injury. Such information may inform the user about possible courses of treatment, and in embodiments, an appropriate protocol, or a selection of protocols, may be displayed to the user to help him treat the patient.
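The paragraph above describes one eyepiece packaging sensor readings for display on another. As a minimal sketch of what such a message might look like, the following assumes hypothetical field names and uses a plain JSON string as the transport; none of these details are specified by the disclosure.

```python
import json

# Hypothetical sketch: package sensor readings from a casualty's eyepiece
# into a message a medic's eyepiece could display. Field names and the
# transport (a JSON string) are illustrative assumptions only.
def package_vitals(soldier_id, heart_rate_bpm, systolic_mmhg, diastolic_mmhg,
                   respiration_rate_bpm):
    reading = {
        "soldier_id": soldier_id,
        "heart_rate_bpm": heart_rate_bpm,
        "blood_pressure_mmhg": [systolic_mmhg, diastolic_mmhg],
        "respiration_rate_bpm": respiration_rate_bpm,
    }
    return json.dumps(reading)

def unpack_vitals(message):
    # The receiving eyepiece decodes the message before rendering it
    return json.loads(message)

msg = package_vitals("alpha-3", 118, 92, 60, 24)
vitals = unpack_vitals(msg)
```

In practice such a message might also be relayed onward, per the disclosure, to a doctor at a remote location unchanged.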
In other embodiments, the eyepiece may allow the user to monitor other symptoms of the patient for mental health checks. Similarly, the user may use the eyepiece to check whether the patient is exhibiting rapid eye movement, and to further provide calming treatment to the patient, such as eye movement exercises, breathing exercises, and the like. In addition, as information about the wounded soldier's vital signs and health data is gathered from the wounded soldier's eyepiece and transmitted to the doctor's eyepiece, the doctor may be provided with this information. This gives the doctor real-time data from the wounded soldier without the doctor having to determine such data himself, for example by taking the wounded soldier's blood pressure.
In embodiments, the user may be provided with a prompt from the eyepiece telling him how far an air or ground rescue is from his position in the field. This can give the doctor vital information, reminding him of the time available in the given situation and whether certain procedures should or must be attempted, and it can let him know, or remind him, that adequate rescue for the injured soldier is on the way in case he needs other sources of help.
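A rough sketch of such a distance-and-time prompt follows, assuming GPS fixes for the medic and the inbound rescue unit and a stated closing speed. The coordinates and the 240 km/h speed are made-up example values, not figures from the disclosure.

```python
import math

# Illustrative sketch only: estimate how far an inbound rescue unit is from
# the medic's position and roughly how long until arrival.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def eta_minutes(distance_km, speed_kmh):
    return 60.0 * distance_km / speed_kmh

# Two fixes one degree of longitude apart at latitude 34 N (example values)
d = haversine_km(34.0, 65.0, 34.0, 66.0)
t = eta_minutes(d, 240.0)  # helicopter at an assumed 240 km/h
```

A fielded system would of course read both fixes and the speed from live telemetry rather than constants.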
In other embodiments, the user may be provided with warnings about his own vital signs if a problem is detected. For example, if a soldier's blood pressure is dangerously high, he may be warned, so that he takes medication or, if possible, withdraws himself from combat to bring his blood pressure back to a safe level. The user may likewise be warned about other such personal data, such as his pupil size, heart rate, changes in gait, and the like, to determine whether the user is experiencing a medical problem. In other embodiments, the user's eyepiece may also alert medical personnel at another location to the user's medical condition, so that help is dispatched for the user whether or not he knows he needs such help. In addition, such data may be aggregated from multiple eyepieces to provide a commanding officer with details about his wounded soldiers, how many soldiers he has in combat, how many of them are injured, and the like.
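The threshold check described above can be sketched as follows. The limits used here are illustrative placeholders for the example, not medical guidance or values from the disclosure.

```python
# Hedged sketch: compare a user's own readings against limits and emit
# warnings. The limit values are assumed for illustration only.
LIMITS = {
    "systolic_mmhg": (90, 160),
    "heart_rate_bpm": (50, 150),
    "pupil_diameter_mm": (2.0, 8.0),
}

def check_vitals(readings):
    """Return a warning string for each reading outside its limits."""
    warnings = []
    for name, value in readings.items():
        lo, hi = LIMITS[name]
        if value < lo:
            warnings.append(f"{name} low: {value}")
        elif value > hi:
            warnings.append(f"{name} high: {value}")
    return warnings

# A soldier with elevated blood pressure but a normal heart rate
alerts = check_vitals({"systolic_mmhg": 175, "heart_rate_bpm": 88})
```

The same per-wearer results could feed the aggregate casualty counts described for the commanding officer.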
In embodiments, trained medical professionals may also use the eyepiece for medical response outside of combat. Such an eyepiece has uses similar to those described above for a doctor at headquarters, or away from headquarters but outside combat conditions. In this manner, the eyepiece may provide the user with augmented reality assistance during a medical procedure, a means of recording the procedure, a means of performing the procedure on or off a military base under the guidance of a remote commanding officer via video and/or audio, and the like. This may be useful in a number of situations where a doctor may need additional assistance. One example of such a situation may occur when a doctor is on duty during training exercises, athletics, military hikes, and the like. Such assistance may be important when the doctor is the only responder, when he is a new doctor, when he is encountering a new situation, and the like.
In certain embodiments, the eyepiece may provide guidance to the user in environments related to military transport aircraft. For example, the eyepiece may be used in such environments during training, when entering combat, on reconnaissance or rescue missions, when moving equipment, when performing maintenance aboard the aircraft, and the like. Such uses may be suitable for personnel of various grades and ranks.
For purposes of illustration, a user may receive audio and visual information through the eyepiece while on a transport aircraft and entering a training exercise. This information may provide the user with details about the training mission, such as battlefield conditions, weather conditions, mission instructions, maps of the area, and the like. The eyepiece may present virtual-reality combat scenes to ready the user for combat. The eyepiece may also record the user's responses and actions by various means. Such data collection may allow the user to receive feedback on his performance. Further, the eyepiece may then change the simulation based on results obtained during the training exercise, either altering the simulation while it is under way or altering future simulations for the user or for all users.
In embodiments, when a military transport aircraft is going into action, the eyepiece may provide guidance and/or interaction to the user aboard the aircraft. The user may receive audio and visual information about the mission while aboard. The user may be presented with checklists to ensure that he has the proper materials and equipment for the mission. In addition, instructions for properly securing equipment and safety harnesses may be presented along with information about the aircraft, such as the locations of emergency exits and safety equipment such as oxygen canisters. The user may be presented with instructions, such as when to rest before the mission and when to take medication administered for that purpose. The eyepiece may provide the user with noise cancellation for pre-mission rest, and may then alert the user when his rest period ends and further mission preparation is to begin. Additional information may be provided, such as maps of the combat area, the number of vehicles and/or personnel in the combat area, weather conditions in the combat area, and the like. The device may provide links to other soldiers so that briefings and combat preparations can include soldier interaction, where a commanding officer can be heard by subordinates, and the like. Further, the information for each user may be formatted to suit his particular needs. For example, a commanding officer may receive higher-level or more classified information that need not be provided to lower-ranking officers.
In embodiments, a user may use the eyepiece on a military transport aircraft during reconnaissance or rescue missions, where the eyepiece captures and stores various images and/or video of places of interest as the aircraft flies over each region; these may be used to gather information about war zones, potential terrain, and the like. The eyepiece may thus be used to detect the movement of people and vehicles on the ground, and to detect enemies to be defeated or friendly forces to be rescued or assisted. The eyepiece may provide the ability to apply tags to maps or images of regions that have been overflown and searched, with specific color coding for regions that have already been searched and regions that still need to be searched.
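The color-coded search overlay described above can be sketched as a grid of map cells tagged by search status, with one color per status. The statuses and colors chosen here are assumptions for the example.

```python
# Illustrative sketch of a color-coded search overlay. Statuses and colors
# are assumed values, not part of the disclosure.
COLORS = {"searched": "green", "in_progress": "yellow", "unsearched": "red"}

def make_grid(rows, cols):
    """Start with every map cell marked as not yet searched."""
    return {(r, c): "unsearched" for r in range(rows) for c in range(cols)}

def mark(grid, cell, status):
    grid[cell] = status

def overlay_colors(grid):
    """Translate each cell's search status into its display color."""
    return {cell: COLORS[status] for cell, status in grid.items()}

grid = make_grid(2, 2)
mark(grid, (0, 0), "searched")
colors = overlay_colors(grid)
```

A real overlay would key the cells to geographic tiles rather than row/column indices, but the tagging logic is the same.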
In embodiments, a user on a military transport aircraft may be provided with instructions and/or checklists for the equipment to be inventoried, the quantities to be moved and their placement, and special handling instructions for the various pieces of equipment. The user may also be given warnings when articles are near a vehicle during loading or unloading, to ensure safety.
For maintenance and safety of a military transport aircraft, the user may be provided with pre-flight checks of the aircraft's correct operation. The pilot may be alerted if correct maintenance has not been completed before a mission. In addition, a graphical overview or list of the aircraft's history may be provided to aircraft operators to track the aircraft's maintenance history.
In certain embodiments, the eyepiece may provide guidance to the user in environments related to military fighter aircraft. For example, the eyepiece may be used in such environments during training, when going into action, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training in military fighter combat. The user may be presented with an augmented reality scenario of simulated combat conditions in a specific military jet or aircraft. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data.
In embodiments related to actual combat, the user may be presented with information about friendly and non-friendly aircraft in and around his display. The user may be presented with information about enemy aircraft, such as maximum speed, maneuverability, and range. In embodiments, the user may receive information about, and be warned of, the presence of ground hazards. The eyepiece may be synchronized to the user's aircraft and/or its instruments, so that the pilot can see urgent warnings and additional information about the aircraft not normally displayed in the cockpit. Further, the eyepiece may display the number of seconds to a target area, the time until an imminent threat such as a missile launched from a vehicle, or the time remaining to eject. The eyepiece may suggest maneuvers for the pilot to perform based on the surrounding environment, potential threats, and the like. In embodiments, the eyepiece may detect and display friendly aircraft even when those aircraft are operating in a stealth mode.
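The "seconds to target area" readout mentioned above can be sketched, under stated assumptions, as a straight-line calculation from distance and closing speed. The numbers are illustrative; a real system would take both quantities from the aircraft's avionics.

```python
# Rough sketch of a time-to-target countdown. Inputs are assumed example
# values, not avionics data.
def seconds_to_target(distance_m, closing_speed_mps):
    """Time until reaching the target area at the current closing speed."""
    if closing_speed_mps <= 0:
        return None  # not closing on the target; no countdown to show
    return distance_m / closing_speed_mps

secs = seconds_to_target(18000.0, 300.0)  # 18 km away, closing at 300 m/s
```

The same arithmetic, run against a threat's position and speed, could drive the missile-threat countdown the paragraph also describes.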
In embodiments, the user may be provided with pre-flight checks of a fighter plane's correct operation. The pilot may be alerted, through links with maintenance records, the aircraft's computer, and the like, if correct routine maintenance has not been completed before a mission. The eyepiece may allow the pilot to review the aircraft's maintenance history, along with charts and diagrams of that history.
In certain embodiments, the eyepiece may provide guidance to the user in environments related to military helicopters. For example, the eyepiece may be used in such environments during training, when going into action, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training in military helicopter operation in combat or high-pressure situations. The user may be presented with an augmented reality scenario of simulated combat conditions in a given aircraft. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data.
During training and/or combat, the user's eyepiece may be synchronized to the aircraft to provide alerts about important statistics and maintenance of the aircraft. A passenger may review flight plans, security procedures, and emergency procedures as he boards the aircraft. Such programs may show how to board the aircraft safely, how to operate the doors for entering and leaving the aircraft, the location of rescue equipment, and other such information. In embodiments, the eyepiece may present to the user the position and/or orientation of threats, such as those that may pose a danger in the typical course of helicopter flight. For example, the user may be presented with the positions of low-altitude threats (such as drones or other helicopters) and the positions of ground threats. In embodiments, noise-canceling headphones may be provided along with the eyepiece and a multi-user interface, allowing communication during flight. In the event that the helicopter goes down, the user's eyepiece may transmit position and helicopter information to commanding officers and rescue teams. Further, during low-altitude missions, using the eyepiece's night vision may allow the user to switch off the helicopter's powerful spotlight in order to search for or locate the enemy without being detected.
In embodiments, as in the examples described herein, the eyepiece may assist in tracking aircraft maintenance to determine whether correct routine maintenance has been performed. Further, as with the other aircraft and vehicles mentioned herein, augmented reality may be used to provide assistance with aircraft maintenance and operation.
In certain embodiments, the eyepiece may provide guidance to the user in environments related to military drone aircraft or robots. For example, the eyepiece may be used in such environments during reconnaissance, on capture and rescue missions, in combat, in situations posing particular risk to humans, and the like.
In embodiments, the eyepiece may provide the user with a video feed of the drone's surroundings. Near-real-time video of each region of interest may be displayed with a delay of up to a few seconds. Gathering such information can give soldiers knowledge of the number of enemy soldiers in a region, the layout of buildings, and the like. Further, data gathered from the drone and/or robot and transmitted to the eyepiece may be used to collect information about the position of a person of interest who is to be captured or rescued. By way of illustration, a user outside a safe house or bunker may use a drone and/or robot to send back video or data feeds about the positions, number, and activity of the people inside, in preparation for a capture or a rescue.
In embodiments, using the eyepiece in conjunction with a drone and/or robot may allow a commanding officer to gather battlefield data during a mission, make changes to plans, and issue various instructions to teams based on the data gathered. Further, the eyepiece's controls and associations may allow the user to deploy weapons on the drone and/or robot through the eyepiece's user interface. The data feed transmitted from the drone and/or robot may provide the user with information about which weapons to deploy and when to deploy them.
In embodiments, data gathered from a drone and/or robot may allow the user to approach potentially dangerous situations. For example, this may allow the user to investigate biological spills, bombs, mines, foxholes, and the like, providing the user with data about the situation and environment while keeping the user out of direct harm.
In certain embodiments, the eyepiece may provide guidance to the user in environments related to military naval vessels. For example, the eyepiece may be used in such environments during training, when entering combat, on search and rescue missions, when performing post-disaster cleanup, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
In embodiments, the eyepiece may be used in training to allow users to prepare the various combinations of skills needed to perform their duties aboard ship. Training may include simulations that test the user's ability to navigate, to control the vessel, and/or to perform various tasks under combat conditions. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data.
In embodiments, the eyepiece may allow the user to assess a situation by providing an augmented reality view of potential threats to the vessel beyond the horizon. Such threats may be indicated by dots, icons, or other means. Once the eyepiece detects a specific threat, instructions for preparing to engage the enemy may be sent to the user through the eyepiece. Further, users may view maps or video of a harbor where they will dock, with enemy positions provided. In embodiments, the eyepiece may allow the user to synchronize with the vessel and/or its weapon systems, and may guide the user's use of navigation equipment during combat. The eyepiece may remind the user where international and national waters lie.
In embodiments where search and rescue is needed, the eyepiece may be used to track currents and/or to mark waters that have recently been searched. In embodiments where currents are tracked, this may provide the user with information about the potential or drift-adjusted position of a person of interest to be rescued. Similarly, the eyepiece may be used in environments where the user must investigate his surroundings. For example, the user may be alerted to significant changes in water pressure and/or water movement, where such significant changes can signal approaching seismic activity and/or impending disaster. Alerts about shifts in the earth's mantle, earthquakes, tsunami threats, and the like may be sent to the user through the eyepiece. Such alerts may be provided by synchronizing with devices aboard the vessel that track the movement of ocean water, currents, changes in water pressure, the rise or fall of surrounding water levels, and the like.
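The pressure-change alert described above amounts to anomaly detection against a rolling baseline. A hedged sketch follows; the window size, the 10% threshold, and the readings are all assumed values for illustration.

```python
from collections import deque

# Hedged sketch: flag water-pressure readings that deviate sharply from a
# rolling baseline. Window, threshold, and readings are assumptions.
class PressureMonitor:
    def __init__(self, window=5, threshold=0.10):
        self.history = deque(maxlen=window)
        self.threshold = threshold  # fractional change that triggers an alert

    def add(self, reading_kpa):
        """Record a reading; return True if it deviates from the baseline."""
        alert = False
        if self.history:
            baseline = sum(self.history) / len(self.history)
            if abs(reading_kpa - baseline) / baseline > self.threshold:
                alert = True
        self.history.append(reading_kpa)
        return alert

m = PressureMonitor()
calm = [m.add(p) for p in [101.0, 101.2, 100.9, 101.1]]  # ordinary variation
spike = m.add(130.0)  # abrupt change, as might precede a seismic event
```

The same structure could watch current speed or water level; only the sensor feed and threshold would differ.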
In embodiments where a military vessel is deployed for post-disaster cleanup, the eyepiece may be used to detect contaminated regions, the rate at which contamination is spreading, and predictions about its depth and where the contamination will stop. In embodiments, the eyepiece may be used to detect the concentration of pollutants in contaminated air, in parts per million by volume (ppm), and changes in that concentration, in order to determine changes in the extent of the contamination.
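One simple way to turn ppm readings into a prediction of where contamination stops, as the paragraph describes, is to fit a trend to readings taken along the plume and extrapolate to zero concentration. The linear model and the readings below are assumptions for the sketch, not a method stated in the disclosure.

```python
# Illustrative sketch: fit a linear trend to ppm-by-volume readings taken at
# increasing distances from the source, then extrapolate to zero ppm as a
# crude estimate of where the contamination stops. Data are made up.
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def zero_crossing_km(slope, intercept):
    """Distance at which the fitted ppm trend reaches zero."""
    return -intercept / slope

distances = [0.0, 1.0, 2.0, 3.0]        # km from the spill source
ppm = [400.0, 300.0, 200.0, 100.0]      # measured ppm by volume
slope, intercept = linear_fit(distances, ppm)
edge_km = zero_crossing_km(slope, intercept)
```

Real dispersion rarely decays linearly, so a fielded system would likely fit a decay curve instead; the extrapolation idea is the same.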
In embodiments, the eyepiece may provide the user with scheduled checks of the correct operation of the vessel and the equipment aboard it. Further, the vessel's operators may be alerted if correct routine maintenance has not been completed before deployment. In embodiments, the user may be able to review the vessel's maintenance history and the status of the vessel's vital functions.
In embodiments, the eyepiece may provide various forms of guidance to the user in the environment of a submarine. For example, the eyepiece may be used in such environments during training, when going into action, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training in submarine operation in combat or high-pressure situations. The user may be presented with an augmented reality scenario of simulated combat conditions in a particular submarine, and the like. The training program may be based on the user's rank, so that his rank determines the types of scenarios presented to him. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data. In embodiments, the eyepiece may also train users in maintaining the submarine, operating the submarine, correct security procedures, and the like.
In a combat environment, the eyepiece may be used to provide the user with information about the user's depth, the positions of enemies, and friendly and/or enemy objects on the surface. In embodiments, such information may be delivered to the user in a visual presentation, by audio, and the like. In embodiments, the eyepiece may be synchronized to the submarine's equipment and instruments, and/or may use the submarine's equipment and status to gather data from GPS, sonar, and the like, in order to collect various information such as the positions of other objects, submarines, and the like. The eyepiece may display to a soldier security procedures, mission details, and instructions regarding the presence of enemies in the region. In embodiments, the device may communicate or synchronize with the vessel and/or its weapon systems so as to instruct the user when using such equipment and to provide displays related to the equipment. Such displays may include visual and audio data related to the equipment. As a further example, the device may use enhanced visual images and/or audio for the user to show potential threats and places of interest, and to display information that might not be obtainable with a periscope, or to display it alongside the periscope view, such as the positions of enemies beyond the field of view, national and international waters, threat levels, and the like.
The eyepiece may also be used in the maintenance of a submarine. For example, it may provide the user with pre-voyage checks of the vessel's correct operation, and it may remind him of correct routine maintenance operations that were not performed or not completed before a mission. Further, the user may be provided with a detailed history, a review of maintenance performed, and the like. In embodiments, the eyepiece may also guide the user through scheduled submarine maintenance by providing augmented reality or other assistance in performing such maintenance.
In embodiments, the eyepiece may provide various forms of guidance to the user in the environment of vessels in port. For example, the eyepiece may be used in such environments during training, when going into action, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training with vessels in port during combat, under attack, or in high-pressure situations. The user may be presented with an augmented reality scenario simulating combat conditions as seen in a specific harbor and aboard such vessels. The training program may display data on various harbors around the world and the surrounding terrain, data on the number of allied or enemy vessels that may be in a harbor at a given time, local refueling stations, and the like. The training program may be based on the user's rank, so that his rank determines the types of scenarios presented to him. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data. In embodiments, the eyepiece may also train the user in shipboard maintenance, the use of maintenance machinery, correct shipboard security procedures, and the like.
In a combat environment, the eyepiece may be used to provide the user with information related to the harbor where the user will dock or has docked. The user may be provided with the positions of, or other visual representations of, enemy and/or friendly vessels in the harbor. In embodiments, the user may receive warnings about approaching aircraft and enemy vessels, and the user may synchronize with the vessel and/or its weapon systems so as to be instructed in using the equipment while simultaneously being provided with information about, and/or displays of data on, that equipment. Such data may include the quantity and effectiveness of specific munitions, and the like. The eyepiece may display to a soldier security procedures, mission details, and instructions regarding the presence of enemies in the region. Such displays may include visual and/or audio information.
The eyepiece may also be used in the maintenance of vessels. For example, it may provide the user with pre-voyage checks of the vessel's correct operation, and it may remind him of correct routine maintenance operations that were not performed or not completed before a mission. Further, the user may be provided with a detailed history, a review of maintenance performed, and the like. In embodiments, the eyepiece may also guide the user through scheduled vessel maintenance by providing augmented reality or other assistance in performing such maintenance.
In other embodiments, a user may use the eyepiece or other equipment to obtain biometric information about persons near the harbor. Such information may provide a person's identity and let the user know whether the person is a threat or a person of interest. In other embodiments, the user may scan articles or containers being imported into the harbor to find potential threats in shipments and the like. The user may detect dangerous substances based on density, or based on various other information gathered by sensors associated with the eyepiece or equipment. The eyepiece may record information or scan documents to determine whether a document has been forged or altered in some way. This can help the user check individual credentials, and it may be used to check the documentation associated with specific goods in order to alert the user to potential threats from the goods, or to problems that may be related to them, such as inaccurate manifests, forged documents, and the like.
In embodiments, the eyepiece may provide various forms of guidance to the user when using a tank or other land vehicle. For example, the eyepiece may be used in such environments during training, when going into action, for surveillance, for troop transport, when performing maintenance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training in the use of a tank or other land vehicle during combat, under attack, or in high-pressure situations. The user may be presented with an augmented reality scenario simulating combat conditions as seen in a tank and/or while operating a tank. The training program may test the user on correct equipment and weapon use, and the like. The training program may be based on the user's rank, so that his rank determines the types of scenarios presented to him. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data. In embodiments, the eyepiece may also train users in maintaining the tank, operating the tank, and the correct security procedures to use while in the tank, while mounting the vehicle, and the like.
In a combat environment, the eyepiece may be used to provide the user with information and/or visual presentations related to the positions of enemy and/or friendly land vehicles. In embodiments, the user may receive warnings about approaching aircraft and enemy vehicles, and the user may synchronize with the tank and/or its weapon systems so as to be instructed in using the equipment while simultaneously being provided with information about, and/or displays of data on, that equipment. Such data may include the quantity and effectiveness of specific munitions, and the like. The eyepiece may display to a soldier security procedures, mission details, and instructions regarding the presence of enemies and friendly forces in the region. Such displays may include visual and audio information. In embodiments, the user may be streamed a 360-degree view of the surroundings outside the tank, which may be achieved by synchronizing the eyepiece with cameras or other devices enabling such a view. Video and/or audio feeds may be provided to as many users inside or outside the tank or vehicle as needed. This may allow users to monitor the vehicle and static threats. The eyepiece may communicate with the vehicle, and with the various vehicles, aircraft, and equipment described herein or otherwise apparent to those skilled in the art, to monitor vehicle statistics such as armor damage, engine status, and the like. The eyepiece may further provide GPS for navigation purposes, and may make use of black silicon or other technologies described herein for detecting enemy forces and for navigating the environment at night, in poor viewing conditions, and the like.
In addition, the eyepiece may be used for surveillance in a tank or land vehicle environment. In embodiments, the user may be able to synchronize to cameras or other devices to obtain a 360-degree field of view for gathering information. Night vision and/or SWIR as described herein, and the like, may be used for further information where necessary. The user may use the eyepiece to detect heat signatures and survey the environment for potential threats, and may inspect soil density and the like to detect roadside bombs, vehicle tracks, various hazards, and the like.
In embodiments, the eyepiece may be used to facilitate troop transport using a tank or other land vehicle. For example, the user may be provided with inventories of the articles and individuals to be transported, where such inventories may be visual, interactive, and the like. The user may be able to track and update the inventories of articles, for example to keep track of articles in transit. The user may be able to view maps of the surrounding region, scan credential documents and files for personnel identification, identify and track individuals and related articles in transit, review individual route and mission information during transport, and the like.
The eyepiece may also be used in vehicle maintenance. For example, it may provide the user with pre-departure checks of the correct operation of the tank or other vehicle, and it may remind him of correct routine maintenance operations that were not performed or not completed before a mission. Further, the user may be provided with a detailed history, a review of maintenance performed, and the like. In embodiments, the eyepiece may also guide the user through scheduled vehicle maintenance by providing augmented reality or other assistance in performing such maintenance.
In embodiments, the eyepiece may provide various forms of guidance to the user in urban or suburban environments. For example, the eyepiece may be used in such environments during training, when going into action, during surveillance, and the like. Such uses may be suitable for personnel of various grades and ranks.
As an example, a user may use the eyepiece for training in urban or suburban environments during combat, under attack, in high-pressure situations, when interacting with local personnel, and the like. The user may be presented with an augmented reality scenario simulating the combat conditions seen in such environments. The training program may test the user on correct equipment and weapon use, and the like. The training program may be based on the user's rank, so that his rank determines the types of scenarios presented to him. The user's responses and actions may be recorded and/or analyzed for additional information, to provide critique to the user, and to change the training routine based on past data. In embodiments, the user may review interactive scenes of urban and suburban settings that include the layout of actual buildings and regions of potential combat. Before entering a region, the user may be provided with climate and weather information, and may be notified of the number of people typically in the region at a given time or at that time of day, in preparation for an attack or other engagement. In addition, the user may be provided with the positions of individuals in, around, and on top of the buildings in a given area, so that the user is prepared before entering the environment.
In urban and suburban environments, the eyepiece or other equipment may also allow the user to survey local personnel. The user may collect face, iris, voice, fingerprint, and palm print data from persons of interest. The user may scan such data without being noticed, from a distance of 0-5 meters from the POI, from greater distances, or while right beside the POI. In embodiments, the user may use the eyepiece to see through smoke and/or destroyed environments to mark and record the presence of vehicles in the region, and to record images of the surroundings, the population density, the layout of various buildings, and the paths through a region as marked at various times of day, and the like, for future use (such as in planning operations). In addition, the user may collect and receive facts about the specific local inhabitants with whom soldiers interact.
In combat, the user may use the eyepiece or other device in an urban/suburban environment. The device may allow the user to locate and destroy hostile targets using geo-location and a laser rangefinder. In embodiments, it may provide an overview of the surrounding environment and buildings. It may display the positions of marked individuals in the user's vicinity, such as members of enemy or friendly forces or of the user's group. The user may use the eyepiece or other device to keep in touch with his headquarters, and to view or listen to instructions from a commanding officer, where such instructions may be given after the officer views or listens to data from the user's environment. In addition, the eyepiece may allow the user to give orders to other members of his group. In embodiments, the user may collect biometric data from nearby individuals, and record such information and/or retrieve information about them for use in combat. The user may link with other soldiers' devices to monitor and use the various equipment carried by those soldiers. In embodiments, the eyepiece may also warn the user of an upcoming ground transition or protrusion, or alert him to the edge of a building when he is on a rooftop. The user may be able to view a map overlay of the environment and of the members of his group, and he may detect nearby warning signals and warn others of possible enemies in the vicinity. In embodiments, the user may use the eyepiece to communicate a plan of action with other group members. In addition, the user may use the eyepiece to detect enemies located in dark tunnels and in other areas where enemies may be positioned.
The eyepiece may also be used in desert environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various use scenarios that may be encountered in environments such as deserts. As an example, when in action or in training, the user may use the eyepiece in combat, surveillance, and training to correct his vision so that he can see through a sandstorm. In addition, in a training mode, the eyepiece may simulate the poor visibility of a sandstorm and other desert hazards for the user. In combat, the eyepiece may assist the user, in the various ways described above, in seeing or detecting the presence of enemies in a sandstorm. In addition, the user may be alerted to, and/or may be able to see, the difference between a dust cloud raised by a vehicle and one raised by the wind, so that he is warned of a potential approaching enemy.
In embodiments, the user may use the eyepiece to detect terrain hazards and environmental hazards. For example, the user may use the eyepiece to detect the edges of dunes, sand fences, and the like. The user may also use the eyepiece to detect sand density in order to detect various hazards, such as holes in the ground, cliffs, and buried devices such as mines and bombs. The user may be presented with a map of the desert in order to view the locations of such hazards. In embodiments, the user may be provided with a means of monitoring his vital signs and warning him when he is endangered by extreme environmental conditions (such as daytime heat, nighttime cold, temperature swings, or dehydration). Such warnings and monitoring may be shown in a user interface displayed in the eyepiece and/or provided by audio.
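The vital-sign monitoring and warning described above can be sketched as a simple threshold check. The following Python fragment is an illustrative sketch only; the signal names, units, and safe ranges are assumptions for demonstration and do not appear in the disclosure.

```python
# Sketch of vital-sign monitoring: each reading is compared against a safe
# range, and alert strings are produced for the eyepiece UI or audio channel.
# Signal names and thresholds below are illustrative assumptions.

SAFE_RANGES = {
    "core_temp_c": (35.0, 38.5),   # hypothermia / heat-stress bounds
    "heart_rate_bpm": (50, 160),
    "hydration_pct": (60, 100),    # illustrative hydration index
}

def check_vitals(readings: dict) -> list:
    """Return a list of alert messages for out-of-range vital signs."""
    alerts = []
    for name, value in readings.items():
        lo, hi = SAFE_RANGES[name]
        if value < lo:
            alerts.append(f"WARNING: {name} low ({value} < {lo})")
        elif value > hi:
            alerts.append(f"WARNING: {name} high ({value} > {hi})")
    return alerts

if __name__ == "__main__":
    # Overheated and dehydrated wearer: two alerts, heart rate is in range.
    print(check_vitals({"core_temp_c": 39.2,
                        "heart_rate_bpm": 120,
                        "hydration_pct": 55}))
```

In a deployed system such checks would feed the audio or visual warning paths the paragraph above describes, rather than printing to a console.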
In embodiments, the user may be presented with a map of the desert to view the positions of his group, and he may use the eyepiece to detect nearby signals or obtain warnings of possible hostile forces, where such warnings may be displayed on the map or given as audio alerts from an earphone. In such embodiments, the user may have an advantage over his enemy, because he may have the ability to determine the positions of his group and of the enemy in sandstorms, in buildings, in vehicles, and the like. The user may view a map of his position in which regions through which the user has recently traveled are shown in one color and new regions are shown in another color. In this way, or by other means, the device may keep the user from getting lost and/or keep him moving in the correct direction. In embodiments, the user may be provided with a weather satellite overlay to alert him to sandstorms and hazardous weather.
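The two-color traversal map described above can be sketched as a visited-cell grid. The Python fragment below is an illustrative sketch; the cell size and color names are assumptions chosen for demonstration, not values from the disclosure.

```python
# Sketch of the two-color traversal overlay: grid cells the wearer has
# recently crossed are drawn in one color, unvisited cells in another,
# helping the wearer keep a consistent heading.

class TraversalOverlay:
    def __init__(self, cell_size_m: float = 10.0):
        self.cell_size_m = cell_size_m
        self.visited = set()          # set of (col, row) grid cells

    def _cell(self, x_m: float, y_m: float) -> tuple:
        return (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))

    def record_position(self, x_m: float, y_m: float) -> None:
        """Mark the cell containing the current position as traversed."""
        self.visited.add(self._cell(x_m, y_m))

    def color_for(self, x_m: float, y_m: float) -> str:
        """Color to render this location: traversed vs. new ground."""
        return "amber" if self._cell(x_m, y_m) in self.visited else "green"

if __name__ == "__main__":
    overlay = TraversalOverlay()
    overlay.record_position(3.0, 7.0)   # wearer walks through cell (0, 0)
    print(overlay.color_for(5.0, 5.0))  # prints "amber": already traversed
    print(overlay.color_for(25.0, 5.0)) # prints "green": new ground
```

A set of coarse grid cells, rather than raw GPS tracks, keeps the overlay cheap to query per rendered frame.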
The eyepiece may also be used in wilderness environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various use scenarios that may be encountered in environments such as the wilderness.
As an example, the user may use the eyepiece in preparing for wilderness training. For example, the user may use the eyepiece to simulate wilderness environments of varying degrees. In embodiments, the user may experience very thick trees and bushes with dangerous animals around him, while in other training environments he may face the challenge of having fewer places to hide from the enemy.
In combat, the user may use the eyepiece for various purposes. The user may use the eyepiece to detect freshly broken branches, indicating the recent presence of an enemy. In addition, the user may use the eyepiece to detect dangerous cliffs, caves, changes in terrain, recently moved or disturbed dust, and the like. As an example, by detecting the presence of recently disturbed dust (which may be detectable if it has a density or heat signature different from the surrounding dust and leaves, or which may be detected by other means), the user may be alerted to a trap, bomb, or other hazardous device. In each of the environments described herein, the user may use the eyepiece to communicate with his group through a user interface or by other means, so that communication remains silent and/or undetected by the enemy in enclosed environments, in open environments sensitive to echoes, and the like. Also, in each environment, the user may use the night vision described herein to detect the presence of an enemy. The user may also view trajectory map overlays and/or mountain trajectory maps in the eyepiece, so that the user can check a path before encountering a potentially hazardous region and/or locations where enemies may be positioned. In each of the environments described herein, the eyepiece may also amplify the user's hearing for the detection of potential enemies.
In embodiments, the user may use the eyepiece in a wilderness environment under search-and-rescue conditions. For example, the user may use the eyepiece to detect soil and leaf movement in order to determine whether the ground has been disturbed, for tracking human footprints and for finding buried bodies. The user may view a map display of regions marked as already covered by air forces and/or other group members, so that the user is directed from regions already searched toward regions not yet searched. In addition, the user may use the eyepiece's night vision to see through trees, undergrowth, bushes, and the like for the detection of humans and/or animals. Further, by using the eyepiece to detect the presence of freshly broken twigs, the user, when on a surveillance and/or rescue mission, may detect the presence or recent presence of a person of interest. In embodiments, the user may also view trajectory map overlays and/or mountain trajectory maps in the eyepiece, so that the user can check a path before encountering a potentially hazardous region and/or situation.
In other embodiments, the user may employ the eyepiece for living off the land and in wilderness survival situations. As an example, when foraging for food, the user may use the eyepiece to track animal droppings and movement. In addition, the user may use the eyepiece to detect soil moisture and to detect the presence and location of water systems. In embodiments, the eyepiece may also amplify the user's hearing to detect potential prey animals.
The eyepiece may also be used in arctic environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various use scenarios that may be encountered in environments such as the arctic. For example, when in training, the eyepiece may simulate the visual and audio white-out weather conditions that the user may encounter in an arctic environment, so that the user can adapt to operating under such stress. In addition, the eyepiece may provide the user with a program that simulates various situations and scenarios based on extreme cold, where the program may track and display data related to the user's predicted heat loss. Further, the program may be adapted to simulate conditions that the user might experience in such a heat-loss situation. In embodiments, the program may simulate the user's inability to properly control his limbs, which may manifest as reduced weapon accuracy. In other embodiments, the user may be provided with help information and with instructions on matters such as burrowing into snow for warmth, as well as various survival skills for arctic conditions. In other embodiments, the eyepiece may be synchronized to a vehicle so that the vehicle responds as it would when operating in a particular environment, for example one with arctic conditions, ice, and snow. Accordingly, the vehicle may respond correspondingly for the user, and the eyepiece may also simulate the visuals and audio of such an environment for the user.
In embodiments, the user may use the eyepiece in combat. The eyepiece may allow a soldier to see through white-out weather conditions. The user may call up overlay maps and/or audio providing information on building trenches, terrain hazards, and the like, allowing the soldier to move safely through the environment. The eyepiece may alert the user upon detecting a rise or fall in snow density, letting him know when the ground beneath the snow surface has changed, such as to indicate possible trenches, holes, or other hazards, objects buried in the snow, and the like. In addition, under conditions in which it is difficult to see, the user may be provided with the positions of his group members and of the enemy, whether or not snow has obstructed his field of view. The eyepiece may also provide the user with heat signatures showing animals and individuals in the arctic environment. In embodiments, a user interface in the eyepiece may show the soldier his vital signs and warn him when he is endangered by the extreme environmental conditions around him. In addition, the eyepiece may help the user operate a vehicle in icy conditions by providing the user with prompts from the vehicle regarding transmission slip, wheel slip, and the like.
The eyepiece may also be used in jungle environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various use scenarios that may be encountered in environments such as jungles. For example, the eyepiece may be used in training to provide the user with information on which plants are edible, which are poisonous, and which insects and animals may put the user at risk. In embodiments, the eyepiece may simulate the various noises and settings that the user might encounter in the jungle, so that such an environment is less likely to distract him when in combat. In addition, when in combat or in an actual jungle environment, the user may be provided with a graphic overlay or other map showing the region around him and/or helping him track where he has come from and where he must go. This may alert him to friendly and enemy forces in the region, and it may sense movement in order to alert the user to nearby animals and/or insects. Such alerts may help the user survive by avoiding attacks and by foraging for food. In other embodiments, the user may be provided with augmented reality data, such as by a graphic overlay method, that allows the user to compare creatures and/or animals against those previously encountered, helping the user distinguish which are safe to eat, which are poisonous, and so on. By having information that a particular creature is not a threat, the user may make weapon-deployment decisions when on a covert mission or in a silent mode.
The eyepiece may also be used in connection with special forces missions. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various use scenarios that may be encountered in connection with special forces missions. In embodiments, the eyepiece may be used for purposes specific to covert missions. For example, the user may communicate completely silently with his group through a user interface that each member sees on his eyepiece. The user may share information and may navigate the user interface by eye movement, by a controller device, and the like. As the user gives instructions and/or navigates the user interface and particular data concerning the information to be conveyed, other users may also see that data. In embodiments, each user may insert, through the user interface, questions to be answered by the mission leader. In embodiments, the user may speak or initiate other audio that all users can hear through their eyepieces or other devices. This may allow users at various positions on the battlefield to send plans of action, instructions, questions, shared information, and the like, and may allow them to do so without being detected.
In embodiments, the eyepiece may also be used for military firefighting. As an example, the user may use the eyepiece to run simulations of firefighting scenarios. The device may use augmented reality to simulate a fire and the structural damage to a building as time passes, and it may also reproduce realistic scenes by other means. As mentioned herein, a training program may monitor the user's progress and/or alter scenarios and training modules according to the user's actions. In embodiments, the glasses may be used in actual firefighting. The eyepiece may allow the user to see through smoke by the various means described herein. The user may view, download, or otherwise access the layout of a burning building, container, aircraft, vehicle, or structure. In embodiments, the user may have an overview map or other display showing where each group member is located. The eyepiece may monitor the equipment the user wears, or other equipment, during firefighting. The user may see his oxygen supply level in his eyepiece and be alerted when he should withdraw to obtain more oxygen. The eyepiece may send notifications from the user's device to a command post outside the structure in order to deploy new personnel into or out of the fire scene, and to give status updates and alerts of possible dangers to firefighters. The user may have his vital signs displayed to determine whether his temperature is too high, whether he has lost too much oxygen, and the like. In embodiments, the eyepiece may be used to analyze beam density, heat signatures, and the like to determine whether there are cracks in beams or moldings, and to notify the user of the structural integrity of a building or other setting. The eyepiece may provide an automatic warning when structural integrity is compromised.
In embodiments, the eyepiece may also be used for maintenance purposes. For example, the eyepiece may provide the user with a task and/or usage checklist for the proper operation of an article to be used. If proper maintenance has not been entered in the article's database, it may so remind the operator. It may provide the user with a virtual maintenance and/or performance history in order to determine the safety of an article, or the necessary measures to be taken for safety and/or performance. In embodiments, the eyepiece may run augmented reality programs and the like for training the user in weapon maintenance and upkeep, and may be used in technicians' courses on new and/or advanced equipment. In embodiments, the eyepiece may be used in the maintenance and/or repair of various articles, such as weapons, vehicles, aircraft, equipment, and the like. The user may use the eyepiece to view visual overlays on an article and/or audio instructions so that the user can perform maintenance without needing a hand-held manual. In embodiments, video, still images, 3D and/or 2D images, animated images, audio, and the like may be used for such maintenance. In embodiments, the user may view overlays on an article and/or videos of various images showing the user which parts to remove, in what order and how to remove them, and which parts are to be added, replaced, repaired, enhanced, and the like. In embodiments, such a maintenance program may be an augmented reality program or the like. In embodiments, the user may use the eyepiece to connect with a machine or piece of equipment in order to monitor its operation and/or vital statistics, to help repair it, and/or to provide maintenance information. In embodiments, the user may use the eyepiece to suggest a sequence of next actions during maintenance, and the eyepiece may send the user information about how and/or whether such actions could damage the machine, help repair the machine, affect the likelihood that the machine will run after a given step, and the like. In embodiments, the eyepiece may be used for the maintenance of all articles, machines, vehicles, equipment, aircraft, and the like to which it may be applied as mentioned herein, or that are otherwise encountered in a military environment.
The eyepiece may also be used where the user is in an environment whose spoken language is to some degree unfamiliar to him. As an example, a soldier may use the eyepiece and/or device to obtain near-real-time translation of those speaking around him. Through the device's earphone, he may hear a translation, in his native language, of the person speaking to him. In addition, he may record and translate remarks made by prisoners of war and/or other detainees. In embodiments, the soldier may have a user interface that can translate phrases and provide the translation to the user through the earphone, on the eyepiece as a text image, or by other means. In embodiments, the eyepiece may be used by linguists to provide an experienced linguist with supplemental information about the dialect spoken in a particular region, or about which dialect is being spoken by people near him. In embodiments, a linguist may use the eyepiece to record speech samples for further comparison and/or study. Other experts may use the eyepiece to perform speech analysis, monitoring changes in voice, tone, stuttering, and the like to determine whether a speaker is experiencing anger or shame, is lying, and the like. Even if the listener and the speaker speak different languages, this may still give the listener the speaker's underlying intent.
In embodiments, the eyepiece may allow the user to read body language and/or facial expressions, or other biometric data, from another person. For example, the user may use the device to analyze a person's pupil dilation, blink rate, voice changes, body movement, and the like to determine whether that person is lying, is hostile, is under stress, may pose a threat, and the like. In embodiments, the eyepiece may also collect data such as facial expressions in order to detect, and to warn the user, whether a speaker is lying or may be making unreliable statements, is hostile, and the like. In embodiments, when interacting with a group or with other individuals, the eyepiece may provide the user with alerts warning of potentially threatening individuals who may be disguised as non-combatants, ordinary citizens, or other individuals. User alerts may be audio and/or visual, and may appear in a user interface in the user's eyepiece, be overlaid on the user's vision, and/or be associated with the individual under scrutiny in the user's line of sight. Such surveillance may collect data invisibly from a distance while the user uses the eyepiece and/or device as described herein; it may be carried out at very close range in a disguised or unobtrusive manner; or it may be carried out with the knowledge and/or consent of the individual under suspicion.
The eyepiece may also be used when dealing with bombs and other hazardous environments. As an example, the eyepiece may provide the user with alerts about changes in soil density near a roadside, which may alert the user and/or group to a buried bomb. In embodiments, similar approaches may be employed in various environments, such as testing snow density in an arctic environment to determine whether bombs or other explosives may be present. In embodiments, the eyepiece may provide density calculations to determine whether luggage and/or transported articles tend to have an unexpected density, or a density falling outside the particular range expected for the articles being transported. In embodiments, the eyepiece may provide similar density calculations and, if a density is found to fall within the range expected for an explosive device or other weapon, provide a warning. Those skilled in the art will recognize that bomb detection may also be accomplished via chemical sensors and/or other means known in the art, which may be used by the eyepiece in various embodiments. In embodiments, the glasses may be used in bomb-disposal operations. The user may be provided with augmented reality or other audio and/or visual overlays to obtain instructions on how to remove a particular type of bomb that is present. Similar to the maintenance programs described above, the user may be provided with instructions for disarming the bomb. In embodiments, if the bomb type is unknown, a user interface may provide the user with instructions for safe handling and for the next steps that might be taken. In embodiments, the user may be warned of a nearby potential bomb and may be presented with instructions, suited to the situation and to the user's skill level, for handling the situation safely, such as how to flee the bomb area safely, how to exit a vehicle safely, how close to the bomb it is safe for the user to remain, how to disarm the bomb, and the like. In embodiments, the eyepiece may also provide the user with training in such hazardous environments, and the like.
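The density screening described above can be sketched as a range check on a measured density. The Python fragment below is a hedged illustration only; the material names and density bands are hypothetical placeholders, not calibrated detection values from the disclosure.

```python
# Sketch of density screening: a measured density is flagged when it falls
# inside a band characteristic of explosive material, or outside the range
# expected for the material being scanned. All ranges are illustrative.

EXPECTED_G_CM3 = {
    "roadside_soil": (1.2, 1.7),
    "luggage_contents": (0.1, 0.9),
}
EXPLOSIVE_G_CM3 = (1.5, 1.9)   # hypothetical signature band

def screen_density(material: str, measured: float) -> str:
    """Classify a density reading as clear, unexpected, or explosive-like."""
    lo, hi = EXPECTED_G_CM3[material]
    if EXPLOSIVE_G_CM3[0] <= measured <= EXPLOSIVE_G_CM3[1]:
        return "ALERT: density in explosive signature range"
    if not (lo <= measured <= hi):
        return "WARNING: unexpected density for " + material
    return "clear"

if __name__ == "__main__":
    print(screen_density("luggage_contents", 1.6))  # falls in explosive band
    print(screen_density("roadside_soil", 1.4))     # normal soil: clear
```

The explosive-band test is checked before the expected-range test so that a reading satisfying both conditions raises the more severe alert.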
In embodiments, the eyepiece may detect various other dangers, such as biological leaks, chemical spills, and the like, and provide the user with warnings of unsafe conditions. In embodiments, the user may also be provided with various instructions for mitigating a situation in such an environment and/or for becoming safe under such conditions and keeping others safe. Although bomb scenarios have been described, they are intended to illustrate that the eyepiece may be used similarly to guard against and to contain various dangerous and/or hazardous situations, and/or to provide instructions and the like when such dangers are encountered.
In embodiments, the eyepiece may be used in general fitness and training settings. The eyepiece may provide the user with information such as the miles he has traveled while running, hiking, walking, and the like. The eyepiece may provide the user with information such as the number of exercises performed, the calories burned, and the like. In embodiments, the eyepiece may provide the user with virtual instruction related to correctly performing a particular exercise, and it may provide the user with additional exercises as desired or as appropriate. In addition, the eyepiece may provide a soldier with a user interface, or other means, in which fitness benchmarks are shown as meeting, or failing to meet, the requirements of his particular program. Further, the eyepiece may provide data on the quantity and type of exercise needed for the user to meet such requirements. Such requirements may be oriented toward special forces qualification, basic training, and the like. In embodiments, real railings, obstacles, and the like may be combined with virtual obstacles to block or challenge the user during a workout.
Although particular embodiments and use scenarios have been described herein, such descriptions are not intended to be limiting. Rather, it is intended that the eyepiece may be used in any example apparent to those skilled in the art. It is also contemplated that an adapted use of the eyepiece mentioned for a particular environment may be used in various other environments, even if not specifically mentioned in connection therewith.
In embodiments, the user may access and/or otherwise manipulate libraries of information stored on a secure digital (SD) card, a mini SD card, or other memory, carried remotely on a tactical network, or stored by other means. The library may be part of the user's equipment and/or it may be remotely accessible. The user's equipment may include a DVR or other means of storing information collected by the user, together with recorded data and/or feeds that may be delivered on demand to other locations. In embodiments, the library may include images of local threats, information and/or images of individuals listed as threats, and the like. A library of threats may be stored onboard on a mini SD card or other device. In embodiments, it may be carried remotely on a tactical network. In addition, in embodiments, a library of information may include programs and other information useful in the maintenance of military vehicles, or the data may be of any kind or concern information of any type. In embodiments, libraries of information may be used together with the devices so that data can be transmitted and/or sent to and from storage media and user devices. As an example, data may be sent between a user's eyepiece and a storage library so that he can view images of local persons of interest. In embodiments, data may be sent to and from libraries included in a soldier's kit or located remotely, and data may be sent to and from the various devices described herein. In addition, data may be sent between the various devices described herein and the various libraries described above.
In embodiments, military simulation and training may be employed. As an example, battlefield simulation and training may be adapted from the game scenarios commonly used for entertainment. Various devices, such as the eyepieces described herein, may be used for such purposes. Near-field communication may be used in such simulations to change personnel, present dangers, change tactics, communicate with the scene, and for various other purposes. Such information may be posted to share information where it is needed and to provide instructions and/or information. Various scenarios, training modules, and the like may be run on a user's device. By way of example, and not as a limitation on the uses of such training, a user's eyepiece may display an augmented reality combat environment. In embodiments, the user may act and react in such an environment as he would in actual combat. The user may advance or fall back according to his performance. In embodiments, the user's actions may be recorded so that feedback can be provided based on his performance. In embodiments, the user may be provided with feedback whether or not his performance is recorded. In embodiments, information posted as described above may be password- or biometrically protected and/or encrypted, and may be available immediately or only after a particular period of time. Such information, stored in electronic form, may be updated immediately for changed orders and for updates that may be desirable.
Near-field communication or other means may also be used in training environments, and for maintenance, to post and share information and to provide instructions and/or information where it is needed. As an example, information may be posted in classrooms, in laboratories, in maintenance depots, in repair shops, or anywhere else such training and instruction is needed. User devices, such as the eyepieces described herein, may enable such communication and reception. Information may be shared via augmented reality, in which the user, upon encountering a particular region, receives a notification of such information there. Similar to what is described herein, near-field communication may be used in maintenance. As an example, information may be posted exactly where it is needed, such as in a maintenance depot or repair shop, or associated with the article to be repaired. More specifically, but not as a limitation of the invention, repair instructions may be posted under the hood of a military vehicle and made visible through the use of a soldier's eyepiece. Similarly, various instructions and training information may be shared with various users in any given training situation, such as training for combat and/or for the maintenance of military equipment. In embodiments, information posted as described above may be password- or biometrically protected and/or encrypted, and may be available immediately or only after a particular period of time. Such information, stored in electronic form, may be updated over time for changed orders and for updates that may be desirable.
In embodiments, applications applied to the present invention may be used for facial recognition, or for sparse facial recognition. Such sparse facial recognition can rule out possibilities using one or more facial features when identifying a person of interest. Sparse facial recognition may have automatic obstruction masking and error and angle correction. In embodiments, by way of example and not as a limitation of the invention, the eyepieces, flashlights, and devices described herein may enable sparse facial recognition. This may work similarly to human vision, rapidly excluding from all image vectors each non-matching region, or the whole profile, immediately through sparse matching. This may make false positives nearly impossible. In addition, this may simultaneously use multiple images to expand the vector space and boost accuracy. This can work with multiple databases, or with multiple target images, according to availability or operational requirements. In embodiments, the device may manually or automatically identify the one or more particular distinct features that minimally reduce accuracy. As an example, accuracy may fall in various ranges, and it may be at least 87.3% for the nose, 93.7% for the eyes, and 98.3% for the mouth and chin. In addition, angle correction by facial reconstruction may be used, and in embodiments, facial reconstruction may achieve correction up to an angle of 45 degrees. This may be further enhanced with 3D image mapping techniques. In addition, blurred-region masking and replacement may be used. In embodiments, blurred-region masking and replacement may be achieved at 97.5% and 93.5% for sunglasses and scarves, respectively. In embodiments, an ideal input image may be 640 by 480. A target image at less than 10% of the input resolution, whether because of long distance or atmospheric obscuration, may still be reliably matched. In addition, the particular ranges mentioned above may be greater or lesser in various embodiments.
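The region-by-region exclusion performed by the sparse matching described above can be sketched as follows. This Python fragment is an illustrative sketch only; the feature vectors, region names, and distance threshold are hypothetical and not taken from the disclosure.

```python
# Sketch of sparse-matching exclusion: each enrolled face is a set of
# per-region feature vectors (e.g. nose, eyes), and a candidate is discarded
# as soon as any single region fails to match, so non-matching profiles are
# excluded without scoring the whole face.

def region_distance(a, b) -> float:
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def sparse_match(probe: dict, gallery: dict, threshold: float = 1.0) -> list:
    """Return the gallery IDs surviving region-by-region exclusion."""
    survivors = []
    for person_id, regions in gallery.items():
        for region, vec in probe.items():
            if region_distance(vec, regions[region]) > threshold:
                break                      # exclude this profile immediately
        else:                              # every region matched
            survivors.append(person_id)
    return survivors

if __name__ == "__main__":
    gallery = {
        "poi_1": {"nose": [0.1, 0.2], "eyes": [0.9, 0.4]},
        "poi_2": {"nose": [5.0, 5.0], "eyes": [5.0, 5.0]},
    }
    probe = {"nose": [0.15, 0.25], "eyes": [0.85, 0.35]}
    print(sparse_match(probe, gallery))    # → ['poi_1']
```

The early `break` on the first failed region is what makes the exclusion cheap: most candidates are rejected after a single region comparison rather than a full-face score.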
In embodiments, the devices and/or networks described herein may be adapted for the identification and/or tracking of friends and/or allies. In embodiments, facial recognition may be used to positively identify friends and/or friendly forces. In addition, real-time network tracking, and/or real-time blue force and red force network tracking, may allow the user to know where his allies and/or friendly forces are. In embodiments, there may be a visual separation range between blue forces and red forces, and/or between forces identified by various markers and/or means. In addition, the user may geo-locate an enemy and share the enemy's position in real time. The positions of friendly forces may likewise be shared in real time. Devices for such applications may be the biometric collection glasses described herein, eyepieces, other devices, and devices known to those skilled in the art.
In embodiments, the devices and/or networks described herein may be used in medical diagnosis. By way of example, such devices may enable a healthcare provider to make a remote diagnosis. Further, and by way of example, when a battlefield medic arrives on the scene, or works from a remote location, the medic may use a device such as a fingerprint sensor to immediately retrieve a soldier's medical history, allergies, blood type, and other time-sensitive medical data so as to deliver the most effective treatment. In embodiments, such data may be retrieved through face recognition, iris recognition, or the like of the soldier, enabled via the eyepiece described herein or another device.
In embodiments, a user may share various data over the networks and devices described herein. By way of example, a 256-bit AES-encrypted video wireless transceiver may share video bidirectionally between teams and/or with a vehicle's computer. In addition, sets of biometric data — enrollment, identification, and verification of potential persons of interest, a person of interest's biometric data, and the like — may be shared locally and/or remotely over a wireless network. Such identification and verification of a potential person of interest may be completed, or assisted, by data shared locally and/or remotely over a wireless network. The biometric recognition systems and devices described herein are likewise network-enabled to share data. In embodiments, data may be shared with, from, and/or among various devices, individuals, vehicles, locations, units, and so on; there may be intra-unit and extra-unit communication and data sharing. Data may be shared via, with, from, and/or among the following: existing communications assets; mesh or other networks; military-specification ultra-wideband transceiver caps with 256-bit encryption; military-specification cables; removable SD and/or micro-SD memory cards; a Humvee; PSDS2; unmanned aerial vehicles; WBOTM or other relays; combat radios; mesh-networked computers; devices such as, but not limited to, the various devices described herein; 3G/4G-networked bio-phone computers; digital dossiers; tactical operations centers; command posts; DCGS-A; BAT servers; individuals and/or groups of individuals; and any of the eyepieces and/or devices described herein and/or known to those skilled in the art.
In embodiments, the devices described herein or other devices may include a viewing pane by which a squad and/or team leader may, by reversing the view, project an image onto any surface for briefing a combat team. The transparent or other viewing pane may be rotated 180 degrees, or some other number of degrees, into a projection mode to share data with a team and/or multiple individuals. In embodiments, devices including but not limited to monocular and binocular NVG devices may interoperate with all or nearly all tactical radios in use, allowing users to share live video, S/A (situational awareness), biometric data, and other data in real time or otherwise. Such binocular and monocular devices may be standalone VIS, NIR, and/or SWIR binoculars or monoculars as described above, may include color day/night vision and/or a digital display, and may dock with a compact, encrypted, wireless-enabled computer and a tactical radio. Various data may be shared in real time or near real time over combat radios, mesh networks, and long-range tactical networks. In addition, data may be organized into digital dossiers; data on a person of interest (POI) may be so organized whether or not the POI is enrolled. In embodiments, shared data may be compared, manipulated, and so on. Although specific devices have been mentioned, any device may share information as described herein and as will be appreciated by those skilled in the art.
In embodiments, biometric data, video, and various other types of data may be collected by various devices, methods, and apparatus. For example, fingerprints and other data may be collected from weapons of war, from terrorist activity and/or crime scenes, and from other objects; such collection may be captured by video or other means. Pocket bio-cameras, flashlights with embedded still/video cameras, and the various other devices described herein may collect video, record, monitor, and gather identifying biometric imagery as described herein. In embodiments, various devices may record, collect, identify, and verify data — and identify distinguishing marks and environmental data — using biometric data related to the face, fingerprints, latent fingerprints, palm prints, iris, voice, pocket litter, scars, tattoos, and the like. Data may be geo-located and marked with a date/time stamp. A device may capture EFTS/EBTS-compliant images suitable for matching and filing by any biometric matching software, and may perform video scanning and potential matching against onboard or remote iris and face databases. In embodiments, various biometric data may be captured and/or compared and/or organized into an electronic record for a database. In embodiments, imaging and detection systems may provide biometric scanning and may allow face tracking and iris recognition of multiple subjects; subjects moving into or out of a crowd at speed may be identified immediately, and such images and/or data may be stored and/or analyzed locally and/or remotely. In embodiments, a device may perform multi-modal biometric recognition — for example, collecting and identifying combinations of face and iris, iris and latent fingerprint, and various other combinations of biometric data — and may record video, voice, gait, fingerprints, latent fingerprints, palm prints, latent palm prints, and other distinguishing marks and/or movements.

In embodiments, biometric data may be archived, with manual entry of partial data supplementing a given image capture. Data may be automatically geo-located, time/date-stamped, and filed in a digital dossier under a locally or network-assigned GUID. In embodiments, a device may record slap and rolled prints from a fingerprint-sensitive scanner, full four-finger slaps and rolls, palm prints, fingertips, and fingerprints. In embodiments, an operator may collect POIs while vetting local forces and verify them against an onboard or remote database. In embodiments, a device may access web portals and biometric-enabled watch-list databases, and/or may include existing biometric pre-screening software for POI enrollment. In embodiments, biometrics may be matched and filed by any approved biometric matching software for sending and receiving secure, non-persistent voice, video, and data. A device may integrate and/or otherwise analyze biometric content. In embodiments, biometric data may be collected into standard biometric image and data formats, which may be cross-referenced for near-real-time or real-time data communication with the Department of Defense Biometric Authoritative database or other databases. In embodiments, a device may use algorithms for detection, analysis, and the like relating to fingerprints, palm prints, iris, and face images. In embodiments, a device may illuminate an iris or a latent fingerprint for simultaneous integrated analysis. In embodiments, a device may capture specific images with high-speed video under unstable conditions and, with an intuitive tactical display, promote rapid dissemination of situational awareness; real-time situational awareness may be provided to a command post and/or tactical operations center. In embodiments, the device may allow every soldier to be a sensor, observing and reporting. Collected data may be marked with the date, time, and geographic position of collection. Biometric images may be NIST/ISO-compliant, including ITL 1-2007. In addition, in embodiments, a laser rangefinder may assist biometric capture and target localization. Threat libraries may be stored on an onboard mini-SD card or loaded remotely over a tactical network. In embodiments, devices may transmit encrypted data wirelessly between devices with in-band and/or ultra-wideband transceivers. A device may perform onboard matching of a potential POI against an embedded database, or securely against databases on the battlefield network. In addition, a device may capture specific images with high-speed video under all environmental conditions, and biometric profiles may be uploaded, downloaded, and searched in seconds or less. In embodiments, a user may, with the device, geo-locate a POI from a safe distance using visual biometrics and positively identify the POI using robust sparse recognition algorithms for the face, iris, and the like. In embodiments, a user may fuse and print visual biometrics onto a single integrated display — with enhanced target highlighting and enrollment and alert views — without alerting the POI. Such a display may be on various devices, such as an eyepiece, a handheld device, and the like.
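The passage above notes that each capture is automatically geo-located, time/date-stamped, and filed in a digital dossier under a locally or network-assigned GUID. A minimal sketch of such a dossier entry follows; the field names are invented for illustration and are not drawn from any EFTS/EBTS schema.

```python
import json
import uuid
from datetime import datetime, timezone

def make_dossier_entry(modality, image_ref, lat, lon):
    """Build one digital-dossier record for a biometric capture.

    All field names are illustrative assumptions; a real system would
    follow an EFTS/EBTS-compliant layout.
    """
    return {
        "guid": str(uuid.uuid4()),                      # locally assigned file GUID
        "modality": modality,                           # e.g. "iris", "latent_fingerprint"
        "image_ref": image_ref,                         # reference to the captured image
        "geo": {"lat": lat, "lon": lon},                # automatic geo-location
        "utc": datetime.now(timezone.utc).isoformat(),  # time/date stamp
    }

entry = make_dossier_entry("iris", "capture-0042", 31.2, 65.7)
line = json.dumps(entry)  # one JSON line per record in the archive
```

Storing each record as a self-describing line keyed by GUID is one simple way to support the later comparison, manipulation, and remote sharing the passage describes.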
In embodiments, while local nationals are being screened at a controlled checkpoint and/or station, an operator may unobtrusively collect facial and iris biometrics and enroll, identify, and verify POIs against a watch list. In embodiments, biometric collection and identification may be performed at a crime scene; for example, at a bombing or other crime scene, the operator may rapidly collect biometric data from all potential POIs. Data may be collected, geo-tagged, and stored in a digital dossier for comparison against past and future crime scenes. In addition, during house and building searches, biometric data may be collected from POIs in real time; displaying such data may let the operator know whether to release, detain, or arrest a potential POI. In other embodiments, unobtrusive data collection and identification may be performed in a street environment or the like — for example, as the user moves through a market and mingles with local residents, collecting biometrics, geographic position, and/or environmental data with minimal visible impact. Biometric data may also be collected from casualties to identify whether they are POIs. In embodiments, a user may identify known or unknown POIs among casualties or others by facial identification, iris identification, fingerprint identification, visible-marking identification, and the like, and may keep electronic records updated with such data.
In embodiments, a laser rangefinder and/or inclinometer may be used to determine the position of a person of interest, an improvised explosive device (IED), another item of interest, or the like. The various devices described herein may include a digital compass, an inclinometer, and a laser rangefinder to provide the geographic position of POIs, targets, IEDs, items of interest, and so on. The geographic positions of POIs and/or items of interest may be transmitted, and such data may be shared among individuals over a network, tactical network, or the like. In embodiments, a device may allow an optical array and laser rangefinder to be used in uncontrolled environments to simultaneously geo-locate and range multiple POIs under continuous observation of a group or crowd on the battlefield. Further, in embodiments, a device may include a laser rangefinder and designator to simultaneously range and paint a target through continuous observation of one or more targets. In addition, in embodiments, a device may be soldier-worn, handheld, or the like, and may locate enemies on the battlefield by target geo-location with an integrated laser rangefinder, digital compass, inclinometer, and GPS receiver. In embodiments, a device may include an integrated digital compass, inclinometer, MEMS gyroscope, and GPS receiver to record and display a soldier's position and line-of-sight direction. In addition, the various devices may include an integrated or other GPS receiver for positional and directional accuracy, an IMU, a 3-axis digital or other compass, a laser rangefinder, a gyroscope, an accelerometer, and/or an inclinometer based on micro-electro-mechanical systems (MEMS). The various devices and methods described herein may enable a user to locate enemies and POIs on the battlefield and share such information with friendly forces over a network or otherwise.
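As a sketch of how the integrated sensors described above could combine to geo-locate a POI — the GPS receiver giving the observer's position, the digital compass a bearing, the inclinometer an elevation angle, and the laser rangefinder a slant range — the following uses a flat-earth, small-distance approximation; a fielded system would use proper geodesic math.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate for ranges of a few km

def locate_target(obs_lat, obs_lon, bearing_deg, elevation_deg, slant_range_m):
    """Estimate target (lat, lon) from observer pose and laser range.

    bearing_deg is measured clockwise from true north; elevation_deg is
    the inclinometer angle above the horizontal.
    """
    ground_range = slant_range_m * math.cos(math.radians(elevation_deg))
    north_m = ground_range * math.cos(math.radians(bearing_deg))
    east_m = ground_range * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(obs_lat))))
    return obs_lat + dlat, obs_lon + dlon

# A POI lased at 1,000 m due north of the observer, level line of sight:
lat, lon = locate_target(34.0, 69.0, bearing_deg=0.0, elevation_deg=0.0,
                         slant_range_m=1000.0)
```

The resulting coordinates are what would be pushed over the tactical network so friendly forces see the same POI position.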
In embodiments, users may be networked together in a mesh or other network using communications and geographic position. Each user may be provided with a pop-up or other situational map of all users or of nearby users, giving the user knowledge of where friendly forces are located. As described above, enemy positions may also be found: enemy positions may be tracked and presented in a pop-up or other situational map of the enemy, again giving the user knowledge of where forces are located. Friendly and enemy positions may be shared in real time, and the user may be provided with a map depicting such positions. Such maps of friendly forces, enemy positions, and/or their numbers, and combinations thereof, may be displayed for viewing in the user's eyepiece or on other devices.
In embodiments, devices, methods, and applications may provide hands-free, wireless, visually maintained, and/or audio-enhanced maintenance and repair instructions. Such applications may include RFID sensing for parts location and related equipment. In examples, a user may use a device for augmented-reality-guided field repair; such field repair may be guided by hands-free, wireless maintenance and repair instructions. Devices such as eyepieces, projectors, monoculars, and the like, and/or other devices described herein, may display images of the maintenance and repair procedure. In embodiments, such images may be still and/or video, animation, 3-D (three-dimensional), 2-D (two-dimensional), and so on. In addition, the user may be provided with voice and/or audio annotation of the procedure. In embodiments, the application may be used in high-threat environments where working undetected is a security consideration. Augmented-reality images and video may be projected onto, or otherwise overlaid on, the real object the user is working on, or into the user's field of view of the object, to provide video, diagrams, text, or other instruction for the procedure to be performed. In embodiments, libraries of procedures may be accessed from a body-worn computer, or downloaded over a cable or wirelessly from a remote device, database, and/or server. Such procedures may be used for actual maintenance or for training purposes.
In embodiments, the devices, methods, and descriptions found herein may be used in an inventory status notification system. In embodiments, such a tracking system may allow scanning from distances of up to 100 meters, handling more than 1,000 simultaneous links at a 2 Mb/s data rate. When viewing or near the inventory, the system may provide annotated audio and/or visual information relevant to inventory tracking. In embodiments, the equipment may include the eyepieces, monoculars, binoculars, and/or other devices described herein, and inventory tracking may use SWIR, SWIR color, and/or night-vision technology, body-worn wired or wireless computers, wireless UWB security tags, RFID tags, helmet/hard-hat readers and displays, and the like. In embodiments, and by way of example only, a user may receive visual and/or audio information about the inventory, such as which items are to be destroyed or transferred, the quantity of items to be destroyed or transferred, where items are to be transferred or discarded, and so on. Such information may highlight or otherwise visibly mark and annotate the items in question, and may be displayed on the user's eyepiece, projected onto the items, shown on a digital or other display or monitor, and so on. The items in question may be tagged with UWB and/or RFID tags, and/or an augmented-reality program may be used to provide visual and/or other indications, so that the various devices described herein can provide the user with the information needed for inventory tracking and management.
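A toy sketch of the kind of tag-to-annotation lookup such a system implies: each UWB/RFID tag maps to a disposition record, and scanning a tag yields the overlay text shown in the eyepiece. The field names, statuses, and tag identifiers are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InventoryItem:
    tag_id: str                 # UWB/RFID tag identifier
    description: str
    quantity: int
    disposition: str            # e.g. "hold", "transfer", "destroy"
    destination: Optional[str] = None

def overlay_text(tag_id, inventory):
    """Annotation displayed when the reader scans tag_id."""
    item = inventory.get(tag_id)
    if item is None:
        return f"{tag_id}: unregistered tag"
    note = f"{item.description} x{item.quantity}: {item.disposition.upper()}"
    if item.destination:
        note += f" -> {item.destination}"
    return note

inventory = {
    "uwb-0017": InventoryItem("uwb-0017", "155mm charge", 40, "transfer", "Depot B"),
    "uwb-0018": InventoryItem("uwb-0018", "expired ration", 200, "destroy"),
}
print(overlay_text("uwb-0017", inventory))  # 155mm charge x40: TRANSFER -> Depot B
```

Whether the string is spoken as audio, drawn in the eyepiece, or projected onto the item itself is a presentation choice; the lookup is the same.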
In embodiments, the SWIR, SWIR color, monocular, night-vision, body-worn wireless computer, eyepiece, and/or other devices described herein may be used when firefighting. In embodiments, the user may have enhanced visibility through smoke, and the position of each individual may be shown to the user on an overlay map or other map by the user's device, so that the user knows the positions of firefighters and/or others. The device may provide a real-time display of the positions of all firefighters, and provide hotspot detection of areas below and above a temperature of 200 degrees Celsius without triggering false alarms. A map of the facility may also be provided by the device, displayed on the device, projected from the device, and/or overlaid on the user's line of sight by augmented reality or otherwise, to help guide the user through the structure and/or environment.
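The 200 °C hotspot threshold mentioned above can be sketched as a simple pass over a thermal sensor's temperature grid; the grid layout and return format are assumptions for illustration, not this disclosure's detection method.

```python
def find_hotspots(temps_c, threshold_c=200.0):
    """Return (row, col, temp) of thermal pixels at or above the threshold.

    temps_c is a 2-D grid of temperatures in degrees Celsius, as might
    come from a thermal imaging sensor.
    """
    return [
        (r, c, t)
        for r, row in enumerate(temps_c)
        for c, t in enumerate(row)
        if t >= threshold_c
    ]

frame = [
    [24.0, 25.5, 310.0],   # a dangerously hot region in the corner
    [23.8, 180.0, 26.1],   # warm but below the alarm threshold
]
print(find_hotspots(frame))  # [(0, 2, 310.0)]
```

A real system would additionally filter over time and space to avoid the false alarms the passage mentions; a single hot pixel in one frame is not yet a hotspot.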
The systems and devices described herein may be configured for any software and/or algorithm to meet mission-specific needs and/or system upgrades.
With reference to Figure 73, the eyepiece 100 may interface with a "bio-print flashlight" 7300, such as one that includes a biometric acquisition sensor for recording an individual's biometric signatures and biometric data and that has the form factor of a typical handheld flashlight. The bio-print flashlight may interface with the eyepiece directly, such as through a wireless connection directly from the bio-print flashlight to the eyepiece 100, or, as shown in the embodiment depicted in Figure 73, through an intermediate transceiver 7302 interfacing wirelessly with the bio-print flashlight and through a wired or wireless interface from the transceiver to the eyepiece (e.g., where the transceiver facility is wearable, such as on a belt). Although other mobile biometric-device transceivers are not shown in the figure, one skilled in the art will appreciate that any of the mobile biometric devices may communicate with the eyepiece 100 indirectly through the transceiver 7302, communicate with the eyepiece 100 directly, or operate independently. Data may be transferred from the bio-print flashlight to memory — eyepiece memory, transceiver-facility memory, memory in a removable memory card 7304 that is part of the bio-print flashlight, and so on. As described herein, the bio-print flashlight may include an integrated camera and display. In embodiments, the bio-print flashlight may be used as a stand-alone device without the eyepiece, where data are stored internally and information is provided on the display; in this way, civilian personnel may use the bio-print flashlight more easily and safely. The bio-print flashlight may have a range for capturing certain types of biometric data, such as a range of 1 meter, 3 meters, 10 meters, and so on. The camera may provide monochrome or color images. In embodiments, the bio-print flashlight may provide a covert biometric-data-collection flashlight-camera that geo-locates, monitors, and collects environmental and biometric data for rapid onboard or remote biometric matching. In an example use scenario, a soldier may be assigned night sentry duty. The soldier may use the bio-print flashlight simply as a typical flashlight, but where an unsuspecting individual is illuminated by the device and biometrics are acquired as part of a data-collection and/or biometric-identification procedure.
Referring now to Figure 76, a 360° imager uses digital foveation to direct a small fovea to any given region within the pixel set, thereby delivering a high-resolution image of the designated area. Embodiments of the 360° imager may feature a continuous 360° × 40° panoramic FOV (field of view) together with a simultaneous and independent ultra-high-resolution foveal field of view with 10x (ten-times) optical zoom. The 360° imager may include two 5-megapixel sensors, 30 fps (frames-per-second) imaging capability, and an image acquisition time of <100 ms. The 360° imager may include a gyro-stabilized platform with independently stabilized image sensors. The 360° imager may have only one moving part and two imaging sensors, allowing reduced image-processing bandwidth in a compact optical-system design. The 360° imager may also feature low angular resolution and high-speed video processing, and may be sensor-agnostic. The 360° imager may be used in a single device, on a moving vehicle with a gyro-stabilized platform, mounted on a traffic light or telephone pole, or on a robot, aircraft, or other location allowing persistent surveillance. Multiple users may view the environment imaged by the 360° imager independently and simultaneously. For example, video captured by the 360° imager may be displayed in the eyepiece to allow all recipients of the data — such as all the occupants of a combat vehicle — to have real-time 360° situational awareness. The panoramic 360° imager may recognize a person at 100 meters, and the foveal 10x (ten-times) zoom may be used to read a license plate at 500 meters. The 360° imager allows persistent recording of the environment and features an independently controllable foveal imager.
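One way to picture the relation between the steerable fovea and the fixed panorama is as a direction-to-pixel mapping over the 360° × 40° field of view. The equirectangular layout and sensor dimensions below are assumptions for illustration, not the imager's actual optical design.

```python
def panorama_pixel(azimuth_deg, elevation_deg, width_px=4000, height_px=1000):
    """Map a viewing direction into pixel coordinates of the panorama.

    Assumes an equirectangular 360 x 40 degree layout with elevation
    spanning -20..+20 degrees around the horizon.
    """
    if not -20.0 <= elevation_deg <= 20.0:
        raise ValueError("direction outside the 40-degree vertical field")
    x = int((azimuth_deg % 360.0) / 360.0 * width_px) % width_px
    y = int(round((20.0 - elevation_deg) / 40.0 * (height_px - 1)))
    return x, y

# Point the fovea at whatever sits due south on the horizon:
print(panorama_pixel(180.0, 0.0))  # (2000, 500)
```

In a fielded system the same direction would also steer the MEMS mirror, so a user tapping a region of the panorama gets the zoomed foveal view of that region.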
Figure 76A depicts the assembled 360° imager and Figure 76B depicts a cutaway view of the 360° imager. The 360° imager includes a capture mirror 7602, objective lens 7604, beam splitter 7608, lenses 7610 and 7612, MEMS mirror 7614, sensor total field of view 7618, panoramic image lens 7620, folding mirror 7622, foveal sensor 7624, and foveal image lens 7628. Images collected with the 360° imager may additionally be geo-located and time- and date-stamped. Other sensors may be included in the 360° imager, such as a thermal imaging sensor, an NIR sensor, or a SWIR sensor. The MEMS mirror 7614 is the reflective prism of a unique single-viewpoint hemispherical capture system, allowing high and uniform resolution. The imager design enables a scanning accuracy of <0.1°, foveal distortion of <1%, foveal acquisition at 50% MTF at 400 lp/mm, and foveal capture in <30 milliseconds.
The 360° imager may be part of a network with a wireless or physical link back to a TOC or database. For example, a user may wirelessly view images from the 360° imager on a display having a 360° imager driver, or view them over a wired connection such as a military-specification cable. The display may be a combat radio or a mesh-networked computer networked to headquarters. Data from a database, such as a Department of Defense authoritative database, may be accessed using a removable memory storage card or through a network connection, such as via a combat radio or mesh-networked computer.
Referring now to Figure 77, a coincident multi-field-of-view camera may be used for imaging. The feed from the coincident multi-field-of-view camera may be sent to the eyepiece 100 or any other suitable display device. In one embodiment, the coincident multi-field-of-view camera may be a fully integrated, 3- or 4-coincident-field-of-view SWIR/LWIR imaging and target designation system allowing simultaneous wide, medium, and narrow fields of view for surveillance, with each sensor having VGA or SXVGA resolution for day or night operation. The lightweight, gimbal-mounted sensor array may be inertially stabilized and geo-referenced, enabling highly accurate sensor positioning and target designation under all conditions with its NVG-compatible laser-designator capability. Its unique multiple simultaneous fields of view enable wide-area surveillance in the visible, near-infrared, short-wave-infrared, and long-wave-infrared regions. When coupled with output from a digital compass, inclinometer, and GPS receiver, it also allows a high-resolution narrow field of view for more accurate target identification and designation with point-to-grid coordinates.
In one embodiment of the coincident multi-field-of-view camera, there may be separate, steerable, coincident fields of view, such as 30°, 10°, and 1°, with automatic tracking of one or more POIs, face and iris recognition, onboard matching, and wireless communication via 256-bit-AES-encrypted UWB with laptops, combat radios, or other networked or mesh-networked devices. The camera may be networked to CPs, TOCs, and biometric databases, and may include a 3-axis, gyro-stabilized, high-dynamic-range, high-resolution sensor delivering the ability to see in conditions from blinding sunlight to extremely low light. Identifiers (IDs) may be stored and analyzed immediately, locally or in remote storage. The camera may feature accurate "find and fix" geo-location of POIs and threats to distances of >1,000 meters; a networked, integrated, 1550-nanometer eye-safe laser rangefinder, GPS, 3-axis gyroscope, 3-axis magnetometer, accelerometer, and inclinometer; electronic image enhancement and added electronic stabilization to assist tracking; full-motion (30 frames-per-second) color video recording; ABIS, EBTS, EFTS, and JPEG 2000 compatibility; and MIL-STD-810 compliance for operation in extreme environments. The camera may be mounted via a gimbaled ball system that integrates uncooperative biometric collection and identification with laser ranging and POI geo-location for standoff biometric capture, such as at chokepoints, checkpoints, and facilities. Multi-modal biometric recognition includes collecting and identifying face and iris, and recording video, gait, and other distinguishing marks or movements. The camera may include the ability to geo-locate all POIs and collected data and mark them with time, date, and position. The camera promotes rapid dissemination of situational awareness to network-enabled squads, CPs, and TOCs.
In another embodiment of the coincident multi-field-of-view camera, the camera features 3 separate color VGA SWIR electro-optic modules providing coincident 20°, 7.5°, and 2.5° fields of view, and 1 LWIR thermal electro-optic module, for wide-area imaging with precision pointing at POIs and targets in an ultra-compact configuration. The 3-axis, gyro-stabilized, high-dynamic-range, color VGA SWIR cameras deliver the ability to see in conditions from blinding sunlight to extremely low light, and to see through fog, smoke, and haze without "blooming." Geo-location may be obtained by integrating a micro-electro-mechanical-systems (MEMS) 3-axis gyroscope and 3-axis accelerometer with GPS-receiver and magnetometer data. An integrated 1840-nanometer eye-safe laser rangefinder and target designator, GPS receiver, and IMU provide "find and fix" precision geo-location of POIs and threats to a distance of 3 kilometers. The camera displays, and also stores on its "camcorder-on-a-chip," full-motion (30 frames-per-second) color video, saving it to a solid-state removable drive for in-flight use or post-operation remote-access review. Electronic image enhancement and added electronic stabilization assist tracking, ranging, and designation for geo-location of POIs and targets. Thus the eyepiece 100, by displaying the feed from the coincident multi-field-of-view camera, delivers an unobstructed "vision" of threats. In some embodiments of the eyepiece 100, a see-through, flip-up/flip-down electro-optic display system may also display sensor imagery, moving maps, and data while providing an unobstructed view of the soldier's own weapon. In one embodiment, the flip-up/flip-down electro-optic display system may snap into the NVG mount of any standard MICH or PRO-TECH helmet.
Figure 77 depicts an embodiment of the coincident multi-field-of-view camera, including a laser rangefinder and designator 7702, total-internal-reflection mirror 7704, mounting ring 7708, total-internal-reflection mirror 7710, total-internal-reflection mirror 7714, anti-reflection honeycomb ring 7718, 1280x1024 SWIR 380-1600 nanometer sensor 7720, anti-reflection honeycomb ring 7722, 1280x1024 SWIR 380-1600 nanometer sensor 7724, anti-reflection honeycomb ring 7728, and 1280x1024 SWIR 380-1600 nanometer sensor 7730. Other embodiments may include additional TIR lenses and a FLIR sensor.
With reference to Figure 78, a flight eye is depicted. The feed from the flight eye may be sent to the eyepiece 100 or any other suitable display device. The flight eye may include multiple independent SWIR sensors arranged in a folded imager array with multiple FOVs. The flight eye is a low-profile surveillance and target designation system that can achieve continuous imaging of an entire battlefield in a single low-altitude pass, with each sensor at VGA to SXGA resolution, by day or night, through fog, smoke, and haze. Its modular design allows selective, fixed resolution changes for any element, from 1° telephoto to 30°, for wide-angle imaging of any region of the array's coverage. The resolution of each SWIR imager is 1280x1024 with 380-1600 nanometer sensitivity. A multi-DSP array board "stitches" all the images together and automatically culls overlapping pixels for a seamless image. A coincident 1064-nanometer laser designator and rangefinder 7802 is mounted coincident with any of the imagers without obstructing its FOV.
With reference to Figure 106, the eyepiece 100 may operate in conjunction with an internal software application 7214 of the eyepiece, developed in association with an eyepiece application development environment 10604, where the eyepiece 100 may include a projection facility adapted to project images onto a see-through or translucent lens, so that the wearer of the eyepiece can view the surrounding environment as well as the displayed image provided by the internal software application 7214. A processor, which may include memory and an operating system (OS) 10624, may host the internal software application 7214, control interfacing between eyepiece command-and-control and the software application, control the projection facility, and so on.
In embodiments, the eyepiece 100 may include an operating system 10624 running on a multimedia computing facility 7212 and hosting the internal software application 7214, where the internal application 7214 may be a software application developed by a third party 7242 and offered for download to the eyepiece 100, such as from an app store 10602, a 3D AR eyepiece app store 10610, a networked third-party application server 10612, or the like. The internal application 7214 may interface with eyepiece facilities and processes, such as through the API 10608 in conjunction with input devices 7204, external devices 7240, external command-and-control facilities 10630, the eyepiece's control processing facilities 10634 and computing facilities 7232, and the like. The internal application 7214 may use network communications connections 10622 — such as the Internet, a local area network (LAN), a mesh network with other eyepieces and mobile devices, a satellite communications link, a cellular network, and so on — in connection with use of the eyepiece 100. The internal application 7214 may be purchased through an app store, such as app store 10602, the 3D AR eyepiece app store 10610, and so on. The internal application 7214 may be provided through the 3D AR eyepiece store 10610, such as an internal software application 7214 developed specifically for the eyepiece 100.
The eyepiece application development environment 10604 may be used by software developers to create new eyepiece applications (e.g., 3D applications), modify a base application to create a new 3D version of that base application, and so on. The eyepiece application development environment 10604 may include a 3D application environment that, once a completed application is loaded onto or otherwise made available to the eyepiece, provides the developer with access to the control schemes, UI parameters, and other specifications available on the eyepiece. The eyepiece may include the API 10608, designed to facilitate communication between applications and the eyepiece computing system. An application developer working in the development environment can then concentrate on developing an application with specific functionality, without being concerned with the details of how to interact with the eyepiece hardware. The API also allows developers to more directly modify existing applications to create 3D applications for use on the eyepiece 100. In embodiments, the internal application 7214 may utilize the networked server 10612 in a client-server configuration, a hybrid client-server configuration (e.g., with the internal application 7214 running partly locally on the eyepiece 100 and partly on the application server 10612), a configuration in which the application is hosted entirely on the server and downloaded from the server, and the like. Networked data storage 10614 may be provided in association with the internal application 7214, such as in association with the application server 10612, purchased applications, and the like. In embodiments, the internal application 7214 may interact with a sponsor facility 10614, a marketplace 10620, and the like, such as to provide advertisements in combination with the execution of the internal application 7214, to provide marketplace content to the user of the eyepiece 100, and so on.
In embodiments, software and/or applications may be developed for use with the eyepiece or as a supplement to it. Applications for the eyepiece may be developed via an open-source platform, a closed-source platform, and/or a software development kit. The software development kit for developing eyepiece applications, and the software developed with it, may be open source or closed source. Applications may be developed to be compatible with Android, Apple, or other platforms. Applications may be sold through, or downloaded from, an application store associated with the eyepiece, an independent application store, and the like.
For example, the integrated processor of the eyepiece may run at least one software application and process content for display to the user, and the integrated image source may introduce the content into the optics assembly of the eyepiece. The software application may provide interactive 3D content to the user through interaction with at least one of the control and sensor facilities of the eyepiece.
In embodiments, the eyepiece may be used for a variety of applications. The eyepiece may be used for consumer applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: travel applications, educational applications, video applications, exercise applications, personal-assistant applications, augmented reality applications, search applications, local search applications, navigation applications, movie applications, facial recognition applications, place recognition applications, character recognition and tagging applications, text applications, instant messaging applications, e-mail applications, to-do-list applications, social networking applications, and the like. Social networking applications may include applications such as Facebook, Google+, and the like. In embodiments, the eyepiece may be used for enterprise applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: billing applications, customer relationship management applications, business intelligence applications, human resource management applications, forms automation applications, office productivity applications, Microsoft Office, and the like. In embodiments, the eyepiece may be used for industrial applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: advanced product quality planning software applications, production part approval software applications, statistical process control applications, professional training applications, and the like.
With reference to Figure 107, the eyepiece application development environment 10604 may be used for the development of applications to be presented to the application store 10602, the 3D AR eyepiece application store 10610, and the like. The eyepiece application development environment 10604 may include a user interface 10702, access to control schemes 10704, and the like. For example, a developer may use menus and dialog boxes in the user interface to access the control schemes 10704 available for selection, so that the application developer can choose among schemes. The developer may select a template scheme for general application operation, and may also have individual controls that can be selected to override the template scheme at certain times for performing particular functions within the application. The developer may also use the user interface 10702 and the control schemes to develop applications with field-of-view (FOV) control, such as through a FOV interface. The FOV interface may provide a way to mediate between a FOV presented on two displays (one for each eye) and a single display. In embodiments, a 3D application for the eyepiece may be designed in a single display view, because the API 10608 provides the interpretation that determines which display is used for which content, although the developer may be able to select a specific eye's display for certain content. In embodiments, the developer may manually select and/or inspect what is shown in each eye, such as through the user interface 10702.
The eyepiece may have a software stack 10800 as depicted in Figure 108. The software stack 10800 may have a head-mounted hardware platform layer 10818, an interface/API wrapper 10814 to the platform layer, a libraries layer 10812, an application layer 10801 for development, and the like. The application layer 10801 may in turn comprise consumer applications 10802, entertainment applications 10804, enterprise applications 10808, and other similar applications 10810. In addition, hardware 10820 associated with the execution or development of the internal application 7214 may also be incorporated into the software stack 10800.
In embodiments, the user experience may be optimized by ensuring that the augmented image is in focus relative to the surrounding environment, and that the content is shown at a display brightness setting appropriate to the given ambient light and the displayed content.
In one embodiment, the eyepiece optics assembly may comprise an electro-optic module, also referred to as a display, for delivering content to each eye in a stereoscopic manner. In some cases a stereoscopic view is not desirable. In embodiments, for certain content only one display may be turned on, or only one electro-optic module may be included in the optics assembly. In other embodiments, the brightness of each display may be varied so that the brain ignores the darker display. Automatic brightness control of the image source may control the brightness of the displayed content according to the brightness of the environment. The rate at which the brightness changes may depend on the change in the environment. The rate of brightness change may be matched to the adaptation of the eye. The displayed content may be turned off for a period of time after a sudden change in ambient brightness. The displayed content may dim as the environment darkens, and brighten as the environment brightens.
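The rate-limited automatic brightness control described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, the normalized [0, 1] brightness scale, and the rate constants are all assumptions for illustration, not values from the patent; the asymmetric rates reflect the text's point that brightening can track the environment quickly while dimming should be slow enough to match the eye's adaptation.

```python
def step_display_brightness(current, ambient, dt,
                            max_rate_up=0.5, max_rate_down=0.05):
    """Move display brightness toward the ambient level, no faster than
    a permitted rate: quick when brightening, slow when dimming, since
    the eye adapts to darkness far more slowly than to light.

    current, ambient -- normalized brightness in [0, 1]
    dt               -- elapsed time in seconds
    max_rate_*       -- assumed maximum change per second
    """
    delta = ambient - current
    limit = (max_rate_up if delta > 0 else max_rate_down) * dt
    # Clamp the step so brightness never outruns the permitted rate.
    step = max(-limit, min(limit, delta))
    return min(1.0, max(0.0, current + step))
```

Called once per update tick, this makes the displayed content dim gradually when the environment suddenly darkens, rather than tracking the ambient drop instantly.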
When entering a dark environment from a bright environment, the human eye needs a period of time to adapt to the dark. During this time, the eye has only limited visibility in the dark surroundings. In some situations, such as in security or law enforcement, a person may move from a bright environment into a dark one and need to determine quickly which activities or objects in the dark environment are important. However, a person's eyes can require up to 20 minutes to adapt fully to a dark environment. During this time the person's vision of the environment is compromised, which can create dangerous situations.
In some cases, bright lights such as flashlights can be used to illuminate the dark environment. In other cases, it is possible to cover a person's eyes for a period of time before entering the dark environment, allowing the eyes to partially adapt before entry. However, in situations where bright lights cannot be used in the dark environment and covering the person's eyes beforehand is infeasible, there is a need for a method of assisted viewing that reduces the time during which a person's vision is compromised in the transition from bright to dark.
Night-vision goggles and binoculars are known for providing images of dark environments. However, these devices provide an image of constant brightness and thus do not allow the user's eyes to adapt to the dark, so the device must be used continuously in the dark environment. As a result, these devices do not take advantage of the fact that people can see well in a dark environment once their eyes have fully adapted to the dark.
United States Patent 8094118 provides a method for adjusting the brightness of a display in correspondence with the brightness of the surrounding environment in order to save power. That method is directed to perceived display brightness, and is not concerned with the adaptation of the user's eyes during a transition from a bright environment to a dark one. In addition, the method does not assist the user in viewing the environment.
Therefore, there is a need for a method of assisting a person who moves from a bright environment into a dark environment during the period in which the person's eyes are adapting to the dark.
A head-mounted display device with see-through capability provides a clear view of the scene in front of the user while also providing the ability to display images, where the user sees a combined image composed of the see-through view and the overlaid displayed image. The present disclosure provides a method for giving the user an assisted view of the environment during the transition from a bright environment to a dark one. The method uses a camera on the head-mounted display device that adjusts its capture conditions rapidly, so that images of the dark environment can be captured and displayed to the user. The brightness of the displayed image is then gradually reduced, so that the user's eyes can adapt to the dark environment.
Figure 154 is a chart (from data of Hecht and Mandelbaum in the book "The Eye", Vol. 2, edited by Davison, H., Academic Press, London, 1962, Chapter 5, "Dark Adaptation and Night Vision", by Pirenne, M.H.) showing a typical dark adaptation curve for the human eye, where the shaded region represents 80% of the studied population. The curve shows the minimum illuminance that can be perceived at a particular time, starting at time 0 in a bright-light environment and entering a dark environment immediately thereafter; the minimum perceivable illuminance was determined by presenting spots of light of different illuminance to a region of a person's eye and having the person report which spots could be seen after different times in the dark. As can be seen from the curve, the human eye adjusts over time, and progressively dimmer spots of light can be seen over a period of approximately 25 minutes. As annotated in the chart of Figure 154, two mechanisms in fact contribute to the dark adaptation of the human eye: the cones in the eye (photopic vision) adjust quickly under brighter conditions, while the rods (scotopic vision) adjust relatively slowly. As a result, moving from a brighter condition to a darker condition requires a considerable time for adjustment, depending on how dark the environment is. During the dark adaptation period, the person can be nearly blind.
Table 2 provides typical brightness values for general lighting conditions in two units: lux and lamberts. The illuminance of outdoor lighting conditions spans 9 orders of magnitude between bright sunlight and an overcast moonless night. Brightness values for indoor lighting conditions are also provided for comparison.
Table 2 shows typical illumination levels, from the website http://www.engineeringtoolbox.com/light-level-rooms-d_708.html.
Typical illumination level      Lux        Lamberts
Sunlight                        107,527    10.7527
Full daylight                   10,752     1.0752
Overcast day                    1,075      0.1075
Very dark day                   107        0.0107
Twilight                        10.8       0.00108
Deep twilight                   1.08       0.000108
Full moon                       0.108      0.0000108
Quarter moon                    0.0108     0.00000108
Starlight                       0.0011     0.00000011
Overcast night                  0.0001     0.00000001
Supermarket                     750        0.075
Ordinary office                 500        0.05
Classroom                       250        0.025
Warehouse                       150        0.015
Dark public area                35         0.0035
Table 3 provides perceived brightness values (in units of brils) when the lighting condition changes from one to which the eye is fully adapted to a darker condition. The changes of lighting condition shown are related according to the brightness values in lamberts from Table 2. The example at the bottom of Table 3 gives an interpretation of the bril: a perceived brightness of 1 bril is approximately the brightness provided by a quarter moon on a clear night, or 0.000001 lambert. The perceived brightness in the human visual system as related to a change in lighting conditions is given by an equation provided in United States Patent 8094118, reproduced for reference as Equation 3 below.
B = λ (L / La)^σ    (Equation 3)

where

σ = 0.4 log10(La) + 2.92

λ = 10^2.0208 × La^0.336

in which La is the luminance (in lamberts) to which the eye is adapted and L is the luminance after the change.
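Equation 3 can be sketched in code as follows. This is a minimal illustration; the function name is an assumption, and the placement of the exponents is a reconstruction from the garbled typography, checked against the worked perceived-brightness values the text gives for the example scenarios (e.g., daylight to dark room yields roughly 0.000007 bril).

```python
import math

def perceived_brightness(L, La):
    """Perceived brightness in brils per Equation 3:
    B = lambda * (L / La)**sigma, with
    sigma  = 0.4 * log10(La) + 2.92
    lambda = 10**2.0208 * La**0.336

    L  -- luminance after the change (lamberts)
    La -- luminance the eye is adapted to (lamberts)
    """
    sigma = 0.4 * math.log10(La) + 2.92
    lam = 10 ** 2.0208 * La ** 0.336
    return lam * (L / La) ** sigma

# Eyes adapted to daylight (1.0 lambert) suddenly viewing a dark
# room (0.0035 lambert): perceived brightness ~ 7e-6 bril.
```

With these exponents the function reproduces the ~0.000007 bril figure quoted in Exemplary Scenario 1, which is why this reading of the garbled superscripts is adopted.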
From the other examples shown in Table 3, it is easy to see that many situations encountered in real life involve changes of lighting condition that produce a perception of darkness. Table 3 shows various changes of lighting condition and the perceived brightness when each change occurs. In many of these examples, the brightness perceived on first moving from the bright condition to the darker condition is lower than the brightness a quarter moon provides to a fully dark-adapted eye. Moving from daylight into a warehouse or a dark public area is especially problematic: the eye is effectively blind for a period of time until it adapts to the new lighting conditions. The invention described here provides a method for assisting the human eye during the transition from a bright condition to a darker condition while the eye is adapting to the darker condition.
Table 3 illustrates the perceived brightness levels when the environment changes from bright to dark, using Equation 3 and the brightness values from Table 2:
Figure 155 provides measurement data on the rate of dark adaptation, adapted from the paper by Spillman, L., Nowlan, A.T., and Bernholz, C.D., "Dark adaptation in the presence of waning background luminances" (Journal of the Optical Society of America, Vol. 62, No. 2, February 1972). Figure 155 shows measured threshold increments for a background illuminance that decreases linearly in log units over time. The background changed by 7 log units in 3.5 minutes ( ), 7 minutes (Δ), 14 minutes (○), and 21 minutes (◇), and in 3.5 minutes without pre-exposure (■). The arrows indicate the time of background extinction. The curves largely coincide with the usual dark-adaptation threshold recorded in the absence of any background illuminance (×), except after the steepest background slopes, where the threshold becomes constant.
The data in Figure 155 are based on measurements of the minimum illuminance level (threshold) at which a bright spot can be detected by the human eye, a threshold that decreases progressively as the eye becomes dark-adapted (more sensitive) while the lighting conditions change from 0.325 lambert (partly cloudy day) to complete darkness. The different curves in the chart of Figure 155 are for lighting conditions in which the change from bright to dark occurs at different linear rates (rather than the immediate change shown in Figure 154). Curves toward the left of the chart show faster dark adaptation, under conditions in which the change from bright to dark occurs more rapidly. As supported by the data in Figure 154 and shown by Figure 155, the typical time to adapt to the dark when moving directly from bright to complete darkness is about 15 minutes. The data in Figure 155 also show that the brightness can be reduced linearly over a period of 14 minutes with only a small loss in dark-adaptation time: the adaptation time increases from 15 minutes for an immediate change to 19 minutes for a 14-minute ramp. The present invention provides a method for presenting a displayed image of the dark environment whose brightness decreases progressively over time, so that the user is provided with a viewable image of the dark environment while the user's eyes are still allowed to adapt to it. The method uses a camera that adjusts rapidly to the dark environment so that it can capture images of the dark environment. The captured images are presented to the user on a see-through head-mounted display, where the brightness of the images is reduced progressively over time, so that the user's eyes can adapt to the dark and the user can progressively see the environment through the see-through capability of the display.
Figure 156 is an illustration of a head-mounted display device 15600 with see-through capability. The head-mounted display device 15600 includes a see-through display 15602, one or more cameras 15604, and electronics 15608, where the electronics 15608 may include one or more of: a processor, a battery, a GPS sensor, a direction sensor, data storage, a wireless communication system, and a user interface.
In one embodiment, the head-mounted display device 15600, having at least one camera 15604 or 15610, is used to provide an enhanced view of a dark environment in the see-through display 15602 while the user's eyes are adapting to that environment. The camera 15604 or 15610 may use an auto-exposure system to adjust capture settings such as gain, ISO, resolution, or pixel binning automatically and very rapidly. In certain embodiments, the lens of the camera 15604 or 15610 can be changed to improve image capture in the dark environment. The brightness of the image presented in the see-through display 15602 can be adjusted over time to match the adaptation of the eye and any change in photochromic materials that may be associated with the head-mounted display device 15600. In this way, fast-changing photochromic materials are not needed; photochromic materials that lighten with transition times on the order of minutes are well suited to embodiments of the invention. In any case, the field of view of the displayed image of the environment should match the field of view of the head-mounted display device 15600, to provide an easily interpreted display of the dark environment in an augmented reality mode.
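The auto-exposure adjustment of gain, ISO, and pixel binning described above can be sketched as follows. This is an illustrative sketch only: the function name, the lux thresholds, and the specific ISO/binning values are assumptions, not values from the patent; the pattern shown (lower light leads to higher ISO and coarser binning, trading resolution and noise for sensitivity) is what the text describes.

```python
def choose_capture_settings(scene_lux):
    """Pick low-light capture settings from a measured scene
    illuminance. Thresholds and values are illustrative assumptions."""
    if scene_lux > 100:      # ordinary indoor/outdoor lighting
        return {"iso": 100, "binning": 1}
    if scene_lux > 1:        # twilight levels
        return {"iso": 1600, "binning": 2}
    # moonlight/starlight: maximum sensitivity, 4x4-summed pixels
    return {"iso": 6400, "binning": 4}
```

Because such a lookup runs in well under a second, the camera can present a usable image of the dark scene long before the user's eyes have adapted.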
The invention provides the head-mounted display device 15600 with one or more cameras 15604 or 15610, in which captured images of the scene in front of the user can be displayed over a range of brightness over time. The camera 15604 or 15610 and its associated auto-exposure system can adapt to changes in ambient brightness much faster than the user's eyes can, typically in under 1 second. In one embodiment, the camera 15604 or 15610 captures images of the scene in front of the user, and when the brightness of the scene changes rapidly from bright to dark, the captured images of the scene are displayed to the user in the see-through display 15602. The brightness of the displayed image is reduced over time, so that immediately after moving into the dark environment a bright image of the scene is presented to the user, and the brightness is then reduced over time at a rate that allows the user's eyes to adapt to the dark. Figure 157 illustrates a chart of the brightness of the displayed image provided to the user over time, where t1 is the time at which the brightness of the environment changes from bright to dark. Capture of images of the environment can begin at or before time t1. After t1, the brightness of the displayed image is reduced until time t2, at which the user's eyes have adapted to the dark environment. After time t2, the brightness of the displayed image is held constant at a level at which the user can observe the environment in see-through mode. In other embodiments of the invention, the brightness of the displayed image after time t2 is 0, so that the user observes the dark environment only in see-through mode. In further embodiments of the invention, the image content of the displayed image after t2 changes from the captured image of the environment in front of the user to another image, or to information such as augmented reality information (for example, instructions or directions). In another embodiment of the invention, if the environment is darker than a predetermined level, the brightness of the displayed image of the environment is reduced to a level that is maintained after time t2, thereby providing a version of night vision that responds to rapid changes in ambient lighting and provides longer-term night vision when conditions are too dark for the eyes to adapt to the task at hand. The dark level at which night-vision imaging is provided after time t2 can be selected by the user in an operating-mode setting, where tasks that require more detail in the environment to be discerned use a setting that provides a brighter displayed image of the environment during the night-vision mode.
In a preferred embodiment, the brightness of the displayed image of the environment is reduced at a rate corresponding to the rate at which the user's eyes adapt to the dark environment, for example a 14-minute change from a bright image to a dark image or to no image, corresponding to the curves shown in Figure 155. In this way, the user is temporarily provided with an image of the environment while the user's eyes adapt to the dark, yet the time to adapt to the dark environment is not significantly longer than the time to adapt in the absence of a displayed image.
In a further embodiment of the invention, when the user enters a dark environment, the lens of the camera 15604 or 15610 is changed to provide improved low-light image capture capability. In this case, the camera 15604 or 15610, or another light-sensitive detector in the electronics 15608, detects the change from a bright environment to a dark environment, where the brightness of the environment is detected by an auto-exposure sensor in the electronics 15608 or by detecting a reduction in the pixel code values from the image sensor in the camera 15604 or 15610. The lens of the camera 15604 or 15610 is then changed to increase its light-gathering capability or to enable the camera 15604 or 15610 to capture infrared images. For example, light-gathering capability can be increased by changing to a lens with a lower f-number. As another example, infrared image capture can be enabled in the camera 15604 or 15610 by removing an infrared cut filter in the lens assembly, by moving lens elements relative to one another to refocus, or by changing one or more of the lens elements to infrared lens elements. In another embodiment, the image sensor of the camera 15604 or 15610 is changed to enable infrared image capture.
Figure 158 shows a flow chart of a method of the invention. In step 15802, the user moves from a bright environment into a dark environment. In step 15804, the camera 15604 (or another light-sensitive detector in the electronics 15608) detects the change of the lighting conditions in the environment to a dark condition. In step 15808, the capture settings of the camera 15604 or 15610 are adjusted using the auto-exposure system to enable image capture, in particular video image capture, in the dark environment. In step 15810, images of the environment are captured by the camera 15604 or 15610 and presented in the see-through display 15602 at a first brightness level, where the first brightness level of the displayed image approximates the brightness perceived in the user's see-through view immediately before the environment changed from the bright lighting condition to the dark one. Then, in step 15812, the brightness of the displayed image of the environment is reduced over time, so that the user's eyes can adapt to the dark while the user can still view an image of the environment. The reduction in brightness over the time period can be linear, or nonlinear as shown in Figure 157. The time period over which the image brightness is reduced can correspond to the change of lighting conditions in the environment. Depending on how dark the environment is, in step 15812 the brightness of the displayed image can be reduced to 0 or maintained at a predetermined level to provide a version of night vision.
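The brightness ramp of steps 15810-15812 can be sketched as a function of time since the bright-to-dark transition. This is a sketch under stated assumptions: the function name, the start and end levels, and the choice of a ramp that is linear in log-brightness are illustrative; the 14-minute duration echoes the linearly decreasing log background illuminance of Figure 155.

```python
import math

def displayed_brightness(t_minutes, b_initial=1.0, b_final=1e-4,
                         ramp_minutes=14.0):
    """Brightness of the displayed camera image, t minutes after the
    environment goes dark. Ramps linearly in log-brightness from
    b_initial down to b_final over ramp_minutes, then holds b_final
    (which could instead be 0 for pure see-through operation)."""
    if t_minutes >= ramp_minutes:
        return b_final  # eyes assumed dark-adapted; hold the floor level
    frac = t_minutes / ramp_minutes
    log_b = (1 - frac) * math.log10(b_initial) + frac * math.log10(b_final)
    return 10 ** log_b
```

A linear ramp in log units matches the experimental protocol of Figure 155 better than a linear ramp in luminance, since the eye's threshold data are reported on a log scale.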
Exemplary scene 1
A police officer working in daylight (about 1.0 lambert) breaks down a door that opens into a dark room whose degree of darkness (about 0.0035 lambert) is similar to that of many restaurants, as the data in Table 2 demonstrate. When the door is opened, the officer will perceive the dark room at 0.000007 bril, or 10000X darker than the illumination provided by a quarter moon, as the data in Table 3 show. In effect, he will be unable to see anything in the dark room. According to the curves in Figure 155, about 1 minute will pass before the officer can see anything in the dark room (which is at 0.0035 lambert = 0.54 log millilambert). This is a dangerous situation, because the eyes of any people in the dark room have adapted to the dark, so they can see the officer. If the officer is wearing a head-mounted display device with a camera and a see-through display as described herein, an image of the dark room can be presented to the officer for approximately 1.5 minutes, during which the officer's eyes are adapting to the dark. After this time, the officer can view the dark room through the see-through display. The see-through display can still be used to send instructions or other information to the officer (such as in an augmented reality presentation system) while the officer views the dark room through it. Thus, the head-mounted display device of the invention provides the officer with immediate vision in the dark room, limited only by the low-light capability of the camera.
As long as the field of view presented in the displayed image closely matches the corresponding portion of the officer's field of view, and there is only a limited lag between capture and display of the video images, the officer can move about the dark room easily using only the displayed image. As the officer's eyes adapt to the dark room, the brightness of the displayed image is reduced over time.
The camera can be a suitable standard digital camera with good low-light performance, operating in a high-ISO, binning mode to provide video imaging down to partial-moonlight illumination levels. A short-wave infrared camera, or a camera with visible + near-infrared imaging capability (such as a camera with the infrared cut filter removed), can be used to provide imaging down to darker levels. As indicated by the data shown in Figures 154 and 155, under very dark conditions it may be necessary to provide an image to the user for up to 25 minutes, at which point the user's eyes will have fully adapted to the dark.
Exemplary scene 2
A soldier inside a lighted house (illuminance 0.025 lambert = 1.40 log millilambert) opens the door and walks out into a night lit by a full moon (illuminance 0.00001 lambert = -2 log millilambert). As can be seen from the figures in Table 3, the darkness perceived when the soldier first steps into the night is effectively complete darkness, with a perceived brightness of 0.000001 bril (1000000X darker than a quarter-moon night appears once the eyes have fully adapted). The curves in Figure 155 show that, for this change in illumination, the soldier's eyes will need about 2 minutes before objects can be seen under the darker condition. As in the previous example, this can be a dangerous situation because the soldier is effectively blind for 2 minutes. The invention provides a see-through head-mounted display that captures images of the environment and displays them to the soldier to eliminate the period of blindness. In this case, the brightness of the image can be reduced over a period of 3-4 minutes, so the soldier's eyes can adapt to the dark, and at the end of this time the soldier can operate with the head-mounted display in see-through mode or augmented reality mode.
Immediate visibility can be provided as soon as the image is shown on the display. As the user's eyes adapt to the dark condition, the transition to see-through viewing is provided by gradually reducing the brightness of the displayed image.
This technique can also be used to compensate for photochromic lenses associated with the head-mounted display device, which may not lighten very quickly.
In alternative embodiments, the image presented to the user can be 2D, captured by a single camera 15610 (in which case the image presented to both of the user's eyes is identical), or 3D, captured by stereo cameras 15604 (in which case the images presented to the user's eyes provide different perspectives of the scene). As is known to those skilled in the art, other methods for producing stereo images are also possible, such as using a lens with a split pupil, or light-field imaging using a lens with a microlens array.
The displayed image can also be shifted to a different color (such as red or green) to help the eyes adapt to the dark more quickly, as is commonly done in night-vision binoculars.
In embodiments, the augmented reality (AR) eyepiece of the invention is adapted to determine and/or compensate for the vergence of the user's eyes. Vergence is the simultaneous rotation of the user's eyes in opposite directions about a vertical axis, moving their respective optical axes to obtain or maintain binocular vision. When a person looks at a nearer object, the person's eyes rotate their respective optical axes inward, toward the nose, in a coordinated motion called convergence. To look at a farther object, the person's eyes rotate their respective optical axes outward, away from the nose, in a coordinated motion called divergence. When the person fixates at infinity or at a very great distance, the person's eyes diverge until their respective optical axes are substantially parallel to each other. Vergence operates together with the accommodation of the eye to allow a person to maintain a clear image of an object as the object moves relative to the person. Vergence compensation becomes important when a virtual image (i.e., an AR image), such as a label or other information, is to be placed near a real image, to cover a real image, or to be superimposed upon the real image of an object, so that the placement of the virtual image is correct with respect to the real image. The methods of the invention for vergence compensation and/or determination are described herein and are collectively referred to as vergence methods.
The vergence methods may comprise determining the distance of an object of interest from the user of the AR eyepiece and subsequently using this distance to determine the vergence angle, that is, the angle at which the optical axes of the user's eyes intersect when the user looks at the object. The vergence angle is then used to determine the correct placement of the AR image relative to the object, which may be in front of the object, behind it, or at a position coincident with it. For example, in a first group of vergence method embodiments, a single autofocusing digital camera having an output signal is fitted at some convenient position on the AR eyepiece, for example in the nose-bridge region or near one of the temples. The output of the camera is provided to a microprocessor in the AR eyepiece and/or transmitted to a remote processor. In either case, the signal relating to the camera's autofocus capability is used to determine the distance of objects the user can perceive when looking straight ahead. This distance, together with the interpupillary distance of the user's eyes, is used to determine the vergence and the correct placement of the virtual images (e.g., labels) desired for those objects. The distance and/or vergence angle can also be used to determine the degree of focus at which virtual objects can be correctly observed by the user. Optionally, additional information about a specific user's vergence characteristics can be entered, kept in memory associated with the microprocessor, and used to adjust the vergence determination.
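The geometry behind the first group of vergence method embodiments can be sketched as follows: given the object distance from the autofocus signal and the user's interpupillary distance, the vergence angle is the angle between the two optical axes when both eyes fixate the object. This is an illustrative sketch, not part of the disclosure; the 63 mm default interpupillary distance and the symmetric, straight-ahead fixation geometry are assumptions.

```python
import math

def vergence_angle_deg(object_distance_m: float, ipd_m: float = 0.063) -> float:
    """Full vergence angle (degrees) between the two optical axes when both
    eyes fixate an object straight ahead at the given distance, assuming the
    object lies on the midline between the eyes."""
    return 2.0 * math.degrees(math.atan((ipd_m / 2.0) / object_distance_m))

# Vergence shrinks toward zero (parallel axes) as distance grows.
near = vergence_angle_deg(0.5)    # roughly 7.2 degrees at 0.5 m
far = vergence_angle_deg(100.0)   # well under 0.05 degrees at 100 m
```

Because the angle falls off quickly with distance, vergence-based placement matters most for content overlaid on nearby objects.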
In a second group of vergence method embodiments, an electronic rangefinder, independent of any camera, is integrated at some convenient position on the AR eyepiece, for example in the nose-bridge region or near one of the temples. In these embodiments, the output of the electronic rangefinder is used in the same manner as the output of the autofocusing camera described in connection with the first group of vergence method embodiments.
In a third group of vergence method embodiments, the AR eyepiece comprises multiple distance-measuring devices, which may be autofocusing cameras and/or electronic rangefinders. The devices may all be aimed in the same direction for determining the distance of an object, or one or more of the devices may be aimed differently from the others so that information about the distances to a variety of objects is available. The output from one or more of the devices is input and analyzed in the same manner as the output of the autofocusing camera described in connection with the first group of vergence method embodiments.
In a fourth group of vergence method embodiments, one or more distance-measuring devices are used in the manner discussed above. Additionally, the AR eyepiece comprises one or more eye-tracking devices configured to track the movement and/or viewing direction of one or both of the user's eyes. The output of the eye-tracking devices is provided to the microprocessor in the AR eyepiece or may be transmitted to a remote processor. This output is used to determine the direction in which the user is looking and, when eye-tracking information from both eyes is available, to determine the vergence of the user's eyes. This direction and vergence information (if available) is then used, alone or together with distance information determined by the distance-measuring devices, to determine the placement and, optionally, the degree of focus of one or more virtual images related to one or more objects the user may be viewing.
In a fifth group of vergence methods, one or more distance-measuring devices are aimed in directions away from directly in front of the AR eyepiece user. The distance to an object detected by the rangefinder device is used to display a virtual image for that object in the manner described above. While the user may or may not be aware of the virtual image when looking straight ahead, the user will be aware of the virtual image when looking in the direction of the object to which the virtual image relates.
A calibration sequence can be used with any of the vergence method embodiments. The calibration sequence can employ mechanical calibration steps, electronic calibration steps, or both. During the calibration sequence, the user's interpupillary distance can be determined. Also, the user can be asked to view a series of real or virtual objects over a range of real or virtual distances (e.g., from near to far), while the vergence of the eyes is measured mechanically, electronically, or both. The information from this calibration sequence can then be used by the AR eyepiece during use for vergence, focus, and/or virtual image placement determinations. The calibration sequence is preferably used when the user first puts on the AR eyepiece, but it can also be run at any time the user considers recalibration useful. Information related to the user and obtained during the calibration sequence can be stored so that it is available whenever the AR eyepiece identifies its user as that specific user, for example by any of the techniques described herein.
It should be noted that some distance-measuring devices use distance determination methods in which the information received from the device's sensor is mapped onto a rectilinear or non-rectilinear grid in a spatial representation. The information from each segment of the grid is compared with the others to determine ranging distances. In the vergence method embodiments, the raw sensor information, the mapped information, the calculated distances, or any combination of these can be used for the placement and/or focusing of one or more virtual images.
It will be understood that the vergence method embodiments include virtual image placement for one or for both of the user's eyes. In some embodiments, one virtual image is provided to the user's left eye and a different virtual image is provided to the user's right eye. This allows, for example, one or more virtual images to be provided to one eye while information obtained from the other eye is used for correction. In situations where multiple images are placed before the user, whether the images are the same or different, the placement can be simultaneous, at different times, or interleaved in time, for example with the images shown at one or more predetermined flicker rates (e.g., 30, 60, and/or 180 Hz), in which the image for the left eye is displayed while the image for the right eye is not, and vice versa. In some embodiments, virtual images are shown only to the person's dominant eye, while in other embodiments virtual images are shown only to the person's non-dominant eye. In some embodiments in which time-interleaved images are used, virtual images for various objects located at various distances from the user are displayed in the manner described above; when the user looks from the real image of one object to the real image of another, only the virtual image corresponding to the real image of the object being viewed will be perceived by the user's brain. For example, by using a focusing mechanism operated at high speed (e.g., 30 to 60 Hz), such as a piezoelectric actuator attached to the LCOS or a variable-focus lens inserted in the optical path, one or more of the same or different virtual images can be placed in more than one depth plane for one or both of the user's eyes.
In some embodiments, the focal distance of the virtual image can be adjusted to provide the user with the illusion that the virtual image is at a desired distance. Such adjustment is particularly useful when images are being presented to both of the user's eyes and the relative lateral positions of the two images are adjusted for vergence. The adjustment can be achieved, by way of example, by adjusting the length of the optical path of the image display or by using one or more variable lenses; in some embodiments of the invention this can be accomplished, for example, by raising or lowering the LCOS panel.
In various embodiments, the invention provides methods for conveying depth cues through augmented reality virtual objects or virtual information, which can communicate a broad range of perceived depths to a broad range of individuals having different eye characteristics. These depth cue method embodiments of the invention convey depth perception for virtual objects or virtual information through differences in the lateral positioning of the augmented reality images provided to an individual's two eyes, or through the differences in vergence those lateral differences produce. An advantage of these methods is that the lateral shift of the augmented reality image can differ for different parts of the image, so that the perceived depth differs for those parts. Moreover, the lateral shift can be achieved by image processing applied to the respective parts of the augmented reality image. Through these methods the user can experience the full range of perceived depths the individual is physically capable of focusing on, from close up to infinity, regardless of the individual's age.
To better understand these depth cue method embodiments of the invention, it is useful to recall the following. In some aspects of augmented reality, a head mounted display is used to add images or virtual information related to virtual objects to the view of the scene the user is seeing. To add further effect to the perception of the augmented reality, it is useful to place the virtual objects or virtual information at perceived depths within the scene. As an example, a virtual label (such as the name of a building) can be placed on an object in the scene. If the label and the building are perceived as being at the same depth in the scene, the perceived association of the label with the building is enhanced. Head mounted displays with see-through capability are well suited to providing augmented reality information (such as labels and objects) because they give the user a clear view of the environment. However, for augmented reality information to be valuable, it must be readily associated with objects in the environment, and thus the positioning of the augmented reality information relative to objects in the see-through view is important. While horizontally and vertically positioning the augmented reality information is relatively straightforward when the head mounted display has a camera that can be calibrated to the see-through view, depth positioning is more complicated. United States Patent 6690393 describes a method for positioning 2D labels in a 3D virtual world. However, that method is not suited to a see-through view in which the majority of the image the user sees is not provided digitally and the 3D positions of objects are therefore unknown. United States Patent 7907166 describes a robotic surgery system using a stereo viewer, in which remote graphical illustrations are overlaid onto a stereo image of the operating site. However, similar to the method described in United States Patent 6690393, this system uses captured images which are then manipulated to add the illustrations, and it is thus not suited to the particular case of a see-through display in which the majority of the image is not provided digitally and the relative positions of the objects the user sees are unknown. Another prior art approach to augmented reality is to adjust the focus of the virtual object or virtual information so that the user experiences differences in focal depth, which provide depth cues to the user. As the user refocuses his/her eyes between objects in the scene and the virtual object or virtual information, the user perceives an associated depth. However, the range of depths that can be associated with focus is limited by the accommodation the user's eyes can achieve. This accommodation is limited in some individuals, as most eyes lose their range of accommodation with age, particularly in older individuals. In addition, the range of accommodation differs depending on whether the user is nearsighted or farsighted. These factors make results that rely on focus cues unreliable for a large population of users of different ages and different eye characteristics. Consequently, there remains a need, beyond the prior art, for methods that can be widely used to associate depth information with augmented reality.
Some of the depth cue method embodiments of the invention are described in this and the following paragraphs in connection with Figures 109 through 121. A head mounted display with see-through capability provides a clear view of the scene in front of the user while simultaneously providing the ability to display images, so that the user sees a combined image made up of the see-through view and the overlaid displayed image. The methods use the see-through display to show 3D labels and other 3D information that help the user interpret the environment around the user. Stereo image pairs of the 3D labels and other 3D information are presented to the user's left and right eyes, positioning the 3D labels and other 3D information at different depths perceived by the user within the scene. In this way, the 3D labels and other 3D information can be more easily associated with the see-through view and the surrounding environment.
Figure 109 is an illustration of a head mounted display device 109100 with see-through capability, a particular version of the augmented reality eyepiece 100 shown in Figure 1 and described throughout this disclosure. The head mounted display device 109100 can include a see-through display 109110, a stereo camera 109120, electronics 109130, and a rangefinder 109140. The electronics can include one or more of the following: a processor, a battery, a global positioning sensor (GPS), a direction sensor, data storage, a wireless communication system, and a user interface.
Figure 110 is an illustration of the scene in front of the user as seen by the user in the see-through view. A number of objects at different depths are shown in the scene for use in the discussion. In Figure 111, several of the objects in the scene have been identified and labeled. However, the labels are presented in a two-dimensional (2D) manner, either by presenting the labels to only one of the user's eyes or by presenting labels at identical positions in the images presented to each eye, so that the labels coincide when properly viewed. Labels of this kind make it more difficult to associate the labels with the objects, especially when there are foreground and background objects, because the labels all appear to lie at the same perceived depth.
To make it easier to associate labels or other information with the desired objects or aspects of the environment, it is useful to present the labels or other information as three-dimensional (3D) labels or other 3D information, so that the information is perceived by the user at different depths. This can be done by presenting the 3D labels or other 3D information in overlaid images to the user's two eyes, with lateral shifts in position between the overlaid images, so that the overlaid content has perceived depth. To those skilled in the art of stereo imaging, this lateral shift between images is known as disparity; it causes the user to change the relative pointing of his/her eyes to visually align the images, and this produces the perception of depth. The images with disparity are images of the 3D labels or other 3D information that are overlaid onto the see-through view of the scene the user sees. By providing 3D labels with large disparity, the user must point the optical axes of his/her eyes slightly inward to bring the labels in the stereo images into alignment, which provides the perception that the labels are located close to the user. 3D labels with little or no disparity can be visually aligned by the user's eyes while looking straight ahead, which provides the perception that the 3D labels are located far away.
Figures 112 and 113 show a stereo image pair of the 3D labels to be applied to the see-through view shown in Figure 110. Figure 112 is the image of the 3D labels displayed to the user's left eye, and Figure 113 is the image of the 3D labels displayed to the user's right eye; together, Figures 112 and 113 provide the stereo image pair. Within this stereo pair, the lateral positions of the 3D labels differ between the images shown in Figure 112 and Figure 113. Figure 114 shows the images of Figures 112 and 113 overlaid. For clarity in Figure 114, the 3D labels from Figure 112 are shown in gray and the 3D labels from Figure 113 are shown in black. In the foreground of Figure 114, the 3D labels from Figure 113 are positioned to the left of the 3D labels from Figure 112 with relatively large disparity. In the background of Figure 114, the 3D labels from Figure 113 coincide with the 3D labels from Figure 112, being positioned on top of them with no disparity. In the middle region of the scene shown in Figure 114, the 3D labels from Figures 112 and 113 have intermediate disparity. This relative disparity of the 3D labels, as presented to the left and right eyes, corresponds to the depth perceived by the user. By selecting depths for the 3D labels that agree with the depths of the objects with which the 3D labels are associated in the scene, the user can easily understand the connection between the 3D labels and the objects or other aspects of the environment seen in the see-through view. Figure 115 shows the see-through view of the scene with the 3D labels displayed with their disparity. However, when viewing in real life, the user would change the pointing of his/her eyes so that the 3D labels in each left/right set coincide, and it is this that provides the user with the perception of depth.
The calculation of disparity is known to those skilled in the art. The equation relating disparity to distance is given by Equation 1:
Z=Tf/d
where Z is the distance from the stereo cameras to the object, T is the separation distance between the stereo cameras, f is the focal length of the camera lenses, and d is the disparity distance on the camera sensors between the images of the same object in the scene. Rearranging to solve for disparity, the equation becomes Equation 2:
d = Tf/Z
For example, for stereo cameras separated by 120 millimeters, with 7 millimeter focal length lenses used in combination with an image sensor having a 2.2 micron center-to-center pixel pitch, representative distances (given in meters) and the corresponding disparities, expressed as the number of pixels an object is seen to be shifted when one display image is viewed relative to the other, are given for a selection of distances in Table 1.
Table 1

Distance (meters)   Disparity (pixels)
1                   381.8
2                   190.9
10                  38.2
50                  7.6
100                 3.8
200                 1.9
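Equation 2 with the sensor parameters quoted above (120 mm baseline, 7 mm focal length, 2.2 micron pixel pitch) reproduces the entries of Table 1 directly. A minimal sketch, with the unit conversion from meters of sensor-plane disparity to pixels made explicit:

```python
def disparity_pixels(distance_m, baseline_m=0.120, focal_length_m=0.007,
                     pixel_pitch_m=2.2e-6):
    """Equation 2, d = T*f/Z, with the sensor-plane disparity d converted
    from meters to pixels by dividing by the pixel pitch."""
    return (baseline_m * focal_length_m / distance_m) / pixel_pitch_m

# Reproduces Table 1: e.g. 1 m -> 381.8 px, 10 m -> 38.2 px, 200 m -> 1.9 px.
table = {z: round(disparity_pixels(z), 1) for z in (1, 2, 10, 50, 100, 200)}
```

The inverse relationship between distance and disparity is why label placement is most sensitive to distance errors for nearby objects.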
Note that in the prior art the disparity values of stereo images are sometimes described by numbers running from negative to positive, where zero disparity is used to define objects at a selected distance from the observer, at which distance the observer perceives the image to be in the middle ground. To account for this shifted zero, the equations listed above must be adjusted. When disparity values are described in this way, the disparities of near objects and far objects can be identical in magnitude but opposite in sign.
Figure 116 shows an illustration of a stereo image pair captured by the stereo camera 109120 on the head mounted display device 109100. Because these images are captured from different perspectives, they have disparities corresponding to distance from the head mounted display device 109100. In Figure 117, the two images from Figure 116 are overlaid to show the disparity between the images of the stereo pair. This disparity matches the disparity seen in the 3D labels displayed for the objects in Figures 114 and 115. Thus, the 3D labels are intended to be perceived as being positioned at the same depths as the objects with which they are associated. Figure 118 shows an illustration of the 3D labels as overlays, as seen by the user on the see-through views seen with the left eye and the right eye.
Figure 119 is a flow chart of a depth cue method embodiment of the invention. In step 119010, the electronics 109130 in the head mounted display device 109100 use GPS to determine the GPS position of the head mounted display device 109100. In optional step 119020, the electronics 109130 use an electronic compass to determine the direction of the field of view. This makes it possible to determine the position and direction of the field of view, so that objects in and near the field of view can be located relative to the user's field of view by comparing the GPS position of the head mounted display device 109100 with a database of the GPS positions of other objects, either stored on the head mounted display device 109100 or reached over a wireless connection to another database. In step 119030, objects of interest are identified relative to the user's field of view by the electronics 109130 analyzing a database stored on the device 109100 or by communicating wirelessly with another device. In step 119040, the distances to the objects of interest are determined by comparing the GPS position of the head mounted display device 109100 with the GPS positions of the objects of interest. In step 119050, labels or other information concerning the names of the objects of interest are then generated, together with disparities corresponding to the distances to the objects of interest, to provide the 3D labels with depths as perceived by the user. Figure 111 shows examples of labels that include the name, distance, and a description of objects of interest in the user's field of view. In step 119060, the 3D labels for the objects of interest are displayed with disparity to the user's left and right eyes to provide the 3D labels at the desired depths.
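Step 119040 compares the GPS fix of the device with the GPS fix of an object of interest to obtain a distance. The flow chart does not specify a formula; a common choice for two latitude/longitude fixes is the haversine great-circle distance, sketched here as an assumed implementation rather than the patent's own method:

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in meters between two GPS fixes
    given in decimal degrees (assumes a spherical Earth, radius 6371 km)."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km.
d = gps_distance_m(40.0, -105.0, 41.0, -105.0)
```

The resulting distance would then feed the disparity calculation of step 119050 (Equation 2) to place the label at the matching perceived depth.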
Figure 120 is a flow chart of another depth cue method embodiment of the invention, in which steps similar to those in Figure 119 are numbered with the same figure numbers used in Figure 119. In step 120140, the distances and directions of objects of interest relative to the user's field of view are determined by the electronics 109130 on the device, alone or in combination with a wirelessly connected other device. In step 120160, 3D labels are displayed with disparity to the user's left and right eyes to provide the 3D labels at the desired depths, and, in addition, the 3D labels are provided in the portions of the user's field of view corresponding to the directions to the objects of interest. Figure 111 shows an example in which the label for a distant object of interest is provided toward the rear of the user's field of view and oriented in the direction of that distant object of interest, shown in this example as a label indicating that a town lies 10 miles away in that direction. This feature provides visual cues within the 3D information that make it easy for the user to navigate to objects of interest. It should be noted that the 3D labels can be provided in front of other objects in the see-through field of view.
Figure 121 is a flow chart of another depth cue method embodiment of the invention. In this embodiment, a distance measuring device 109140, such as a rangefinder, is used to determine the distances to objects of interest in the scene. In step 121010, one or more images of the scene adjacent to the head mounted display device 109100 are captured with the stereo camera 109120. Alternatively, a single camera can be used to capture the one or more images of the scene. The one or more images of the scene can be images of different spectral types; for example, the images can be visible light images, ultraviolet light images, infrared light images, or hyperspectral images. The one or more images are analyzed in step 121020 to identify one or more objects of interest, where the analysis can be done by the electronics 109130 or the images can be transmitted wirelessly to another device for analysis. In step 121030, the distances to the objects of interest are determined using the distance measuring device 109140. In step 121040, disparities correlated with the distances to the objects of interest are determined. In step 121050, labels or other information for the objects of interest are determined. In step 121060, the 3D labels or other 3D information for the objects of interest are displayed.
Figure 122 is a flow chart of another depth cue method embodiment of the invention. In this embodiment, the distances to objects in the scene are measured directly with the stereo camera to obtain a depth map of the scene. In step 122010, the stereo camera 109120 is used to capture one or more stereo image sets of the scene adjacent to the head mounted display device 109100. The one or more stereo image sets of the scene can be of different spectral image types; for example, the stereo images can be visible light images, ultraviolet light images, infrared light images, or hyperspectral images. The one or more stereo image sets are analyzed in step 122020 to identify one or more objects of interest, where the analysis can be done by the electronics 109130 or the one or more stereo image sets can be transmitted wirelessly to another device for analysis. In step 122030, the images within the one or more stereo image sets are compared to determine the disparities of the one or more objects of interest. In step 122040, labels or other information related to the one or more objects of interest are determined. In step 122050, the 3D labels and/or 3D information for the one or more objects of interest are displayed.
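The flow chart does not spell out how step 122030 compares the stereo images to determine disparity; one conventional approach it could use is block matching, where a patch from the left image is slid along the corresponding scanline of the right image and the shift minimizing the sum of absolute differences (SAD) is taken as the disparity. A toy single-scanline sketch, purely illustrative and not the patent's own algorithm:

```python
def patch_disparity(left_row, right_row, x, patch=3, max_d=20):
    """Find the horizontal disparity of the patch centered at x in the left
    scanline by minimizing the sum of absolute differences (SAD) against
    shifted patches in the right scanline."""
    ref = left_row[x - patch : x + patch + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_d + 1):
        if x - patch - d < 0:
            break  # candidate patch would run off the left edge
        cand = right_row[x - patch - d : x + patch + 1 - d]
        cost = sum(abs(a - b) for a, b in zip(ref, cand))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# Synthetic scanlines: the feature in the right image sits 4 pixels left
# of its position in the left image, i.e. a disparity of 4.
left = [0] * 10 + [9, 7, 9] + [0] * 10
right = [0] * 6 + [9, 7, 9] + [0] * 14
```

Real stereo pipelines add subpixel refinement and consistency checks, but the distance recovery itself is then just Equation 1 applied to the matched disparity.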
In various embodiments, the invention can provide display content placement using camera focus information, such as by operating in conjunction with an integrated camera having an autofocus determination facility, where information relating to the distance to real-world objects in the surrounding environment is extracted from the autofocus determination facility by the integrated processor, and where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view can comprise two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement position of the content comprises a placement position for each of the two separately controllable fields of view. The content can comprise two separate images, where the two separate images are to be placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by extracting a placement value from a table of placement values corresponding to distances to real-world objects. The integrated processor can calculate the placement position.
In various embodiments, the invention can provide display content placement using rangefinder information, such as by using a rangefinder integrated with and operating in conjunction with the eyepiece to determine the distance to real-world objects in the surrounding environment, where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view can comprise two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement position of the content comprises a placement position for each of the two separately controllable fields of view. The content can comprise two separate images, where the two separate images are to be placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by a placement value extracted from a table of placement values corresponding to distances to real-world objects. The integrated processor can calculate the placement position.
In various embodiments, the invention can provide display content placement using multiple distance determination sensors, such as by utilizing multiple integrated distance determination sensors operating to determine the distance to real-world objects in the surrounding environment, where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view can comprise two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement position of the content comprises a placement position for each of the two separately controllable fields of view. The content can comprise two separate images, where the two separate images are to be placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by a placement value extracted from a table of placement values corresponding to distances to real-world objects. The integrated processor can calculate the placement position. In various embodiments, the multiple integrated distance determination sensors can be camera sensors, rangefinders, and the like.
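The table-of-placement-values option mentioned in the paragraphs above can be sketched as a lookup with linear interpolation between table entries. The table contents here are hypothetical, reusing the disparity fall-off of Table 1 as an example placement value; the disclosure does not specify the table's contents or interpolation scheme:

```python
import bisect

# Hypothetical placement table: (distance_m, lateral_offset_px) pairs,
# sorted by distance, mirroring Table 1's fall-off with distance.
PLACEMENT_TABLE = [(1, 382), (2, 191), (10, 38), (50, 8), (100, 4), (200, 2)]

def placement_offset(distance_m):
    """Look up (with linear interpolation, clamped at the table ends) the
    lateral placement offset for content at the given object distance."""
    dists = [d for d, _ in PLACEMENT_TABLE]
    i = bisect.bisect_left(dists, distance_m)
    if i == 0:
        return PLACEMENT_TABLE[0][1]
    if i == len(dists):
        return PLACEMENT_TABLE[-1][1]
    (d0, p0), (d1, p1) = PLACEMENT_TABLE[i - 1], PLACEMENT_TABLE[i]
    t = (distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```

A lookup table like this lets the integrated processor avoid recomputing the optics-dependent mapping on every frame, at the cost of interpolation error between entries.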
In various embodiments, the invention can provide display content placement using a combination of distance determination sensors and tracking of the user's eyes, such as by utilizing multiple integrated sensors (e.g., cameras, rangefinders) together with eye tracking information from an eye tracking facility included in combination with the optical assembly of the eyepiece, to establish the position of an object relative to the field of view (e.g., the angle to the object, the distance to the object). In various embodiments, the invention can utilize other facilities related to the placement of content in the field of view of the optical assembly, such as the position and placement of images in the user's peripheral vision, calibration sequences, grid-assisted positioning and/or calibration, interleaving of images to each eye for images at different positions, and the like.
In various embodiments, the invention can provide display content control during movement of the eyepiece, such as through an integrated movement detection facility adapted to detect movement of the head-worn eyepiece while it is being worn by the user, where the integrated processor determines the type of movement and reduces the appearance of the displayed content according to the type of movement. The type of movement can be vibration, rapid movement, and the like. The reduction in appearance can be the elimination of the displayed content, a reduction in the brightness of the displayed content, a reduction in the contrast of the displayed content, a change in the focus of the displayed content, and the like.
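The mapping from detected movement type to a reduction in appearance, as listed above, can be sketched as a simple policy function; the motion-type names and scaling factors are illustrative assumptions, not values from the disclosure:

```python
def reduce_presentation(motion_type, display_state):
    """Return a new display state with content appearance reduced according
    to the detected head-motion type (hypothetical policy values)."""
    state = dict(display_state)       # leave the caller's state untouched
    if motion_type == "vibration":
        state["brightness"] *= 0.5    # dim the content during shake
        state["contrast"] *= 0.7      # and soften its contrast
    elif motion_type == "rapid_movement":
        state["visible"] = False      # eliminate the content entirely
    return state

calm = {"brightness": 1.0, "contrast": 1.0, "visible": True}
```

Keeping the policy in one place like this makes it straightforward to tune or extend the per-motion-type behavior on the integrated processor.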
Near field communication (NFC) allows short-range wireless data exchange between an NFC reader and a passive NFC device, where the NFC reader acts as the "initiator" of the communication (providing power for the exchange) and the passive NFC device acts as the "target" (harvesting power from the RF field of the NFC reader in order to provide data back to the reader). One example of this configuration is an NFC reader reading identification information from a tag, such as a clothing tag. Note that NFC is also compatible with radio frequency identification (RFID) technology. The NFC wireless exchange of data can also be two-way, if the two electronic devices both include NFC readers and are brought very close to one another. Examples of this configuration include two NFC-enabled smartphones exchanging information between them (e.g., exchanging electronic business cards), an NFC-enabled smartphone exchanging information with an NFC-enabled point of sale (POS) device (e.g., for an electronic funds transfer, such as with the GOOGLE Wallet mobile payment system), two NFC-enabled mobile gaming devices exchanging information, and the like. Applications of NFC technology can include electronic funds transfer, mobile payment, file sharing, electronic business card exchange, mobile gaming, social networking connections, ticket purchase, boarding pass check-in, POS, coupon collection and/or redemption, ID cards, keycards, keys for cars or hotels, and the like. NFC technology has a practical range of about 4 centimeters (about 20 centimeters in theory), and thus the initiator must be very close to the target for communication to occur.
In one example, a user may store credit card information in their NFC-enabled smartphone, thereby allowing them to make an electronic money payment at a retail store by placing their smartphone very close to an NFC-enabled POS device (again, such as implemented in the GOOGLE Wallet mobile payment system). In this way, the user does not need to pull out a physical credit card to transact, because the credit card information is read from the smartphone by the POS device over the NFC connection. However, the user still has the inconvenience of having to take their smartphone out of their pocket or purse, hold it up near the POS device, and then put the smartphone away again.
The present invention provides a scheme for realizing NFC-enabled wireless transactions by providing the user with an NFC watch device (such as worn on the user's wrist), where the NFC watch can then always be conveniently raised toward another NFC-enabled device for data exchange. Although the invention describes embodiments of an NFC "watch", this is not intended to be limiting in any way; one skilled in the art will recognize that alternative implementations realizing the spirit of the invention are possible, such as embodiment as a bracelet, a watch chain, a ring, and the like. Embodiments of the NFC watch may include a stand-alone NFC device, an NFC relay device, and the like, where the NFC relay device is configured to communicate with both an NFC target device (e.g., an NFC-enabled POS device) and a second communicating device (e.g., the user's smartphone). For example, in the case where the NFC watch is a stand-alone NFC device, it may contain within it the information to be exchanged (e.g., credit card information). In the case where the NFC watch is an NFC relay device, the watch does not contain the information to be exchanged; rather, the information to be exchanged is stored in another electronic device with which the NFC relay device communicates, such as a smartphone, a mobile computing device, a personal computer, and the like.
In embodiments where the NFC watch acts as an NFC relay device, the user may leave their personal device (e.g., smartphone) in their pocket or purse and bring only the NFC watch near another NFC-enabled device for data exchange, where the NFC watch provides the communication between this other NFC-enabled device and the user's personal device. For example, a user may wear the NFC watch on their wrist and place the smartphone containing their credit card information in their pocket. When they approach an NFC-enabled POS device to make an electronic payment, the user may now leave the smartphone where it is and bring only their NFC watch near the POS device, where the NFC watch and the user's smartphone communicate over some non-proximity communications link (e.g., Bluetooth, WiFi, etc.). The NFC watch reads the user's credit card information from the smartphone and transmits the data to the POS device. With this configuration, the user never needs to take out their smartphone at all, because they only need to bring their NFC watch near the POS device to pay electronically, while keeping all of their personal and financial information centrally located in their smartphone.
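The relay flow just described (phone → mid-range link → watch → NFC → POS) can be sketched as follows. This is a minimal sketch under assumed interfaces: the class and method names are invented for illustration and do not correspond to any real payment or Bluetooth API.

```python
# Hypothetical sketch of the NFC relay: the watch fetches payment data from
# the phone over a mid-range link (e.g., Bluetooth) and forwards it to the
# POS over NFC, so the phone never leaves the user's pocket.

class Smartphone:
    """Holds the user's payment information; stays in the pocket."""
    def __init__(self, credit_card):
        self._credit_card = credit_card

    def midrange_request(self, query):
        # Stand-in for the non-proximity link (Bluetooth, WiFi, etc.).
        return self._credit_card if query == "payment_info" else None

class PosTerminal:
    """NFC-enabled point of sale; records what arrives over NFC."""
    def __init__(self):
        self.received = None

    def nfc_receive(self, payload):
        self.received = payload
        return "payment accepted"

class NfcRelayWatch:
    """Bridges the mid-range link to the phone and the NFC link to the POS."""
    def __init__(self, phone):
        self.phone = phone

    def relay_payment(self, pos):
        info = self.phone.midrange_request("payment_info")  # phone <-> watch
        return pos.nfc_receive(info)                        # watch <-> POS

phone = Smartphone({"number": "4111111111111111", "holder": "USER"})
pos = PosTerminal()
watch = NfcRelayWatch(phone)
watch.relay_payment(pos)  # returns "payment accepted"
```

Note the design point the text emphasizes: the payment information lives only on the phone; the watch merely passes it through.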
In embodiments, and referring to Figure 207, the NFC watch 20702 may provide the general functions of a typical watch, such as a face 20704 for the display of the time and date 20708, function buttons 20710, an embedded controller for watch functions, and the like. In addition, however, the NFC watch may provide communications facilities, such as near-field communications to NFC-enabled devices, mid-range communications (e.g., Bluetooth) to nearby electronic devices, longer-range communications (e.g., WiFi) to electronic devices in the vicinity, and the like. In embodiments, the antennas 20712A-B for near-field communication may be provided as an antenna 20712A in the watchband (e.g., with an NFC loop antenna), an antenna 20712B in the watch body (e.g., with an NFC "stamp" antenna), and the like. In the case where the antenna 20712A is located in the watchband, the user may operably hold the watchband portion of the NFC watch near the NFC-enabled target device for data exchange. In the case where the antenna 20712B is located in the watch body, the user may operably hold the watch-body portion of the NFC watch near the NFC-enabled target device for data exchange. In embodiments, the watch display 20704 may also provide a control interface 20718 enabling the user to enter and/or select information, such as a credit card number, a verification code, transfer data, and the like in an electronic money exchange, where the control interface 20718 may include a display, control buttons, a 2D control pad, a touch screen, and the like.
Referring to Figure 208, an example use scenario may include a user 20802 wearing an NFC watch 20702A as a stand-alone NFC device, where the NFC watch 20702A is held up to an NFC-enabled POS device 20804 for paying for a purchase over the NFC communications link 20804A. In this case, the NFC watch 20702A contains the payment information to be exchanged with the NFC-enabled POS device 20804. In embodiments, the information contained in the NFC watch 20702A may have been previously entered through a wired or wireless connection to a computing facility (e.g., a mobile computing device, a smartphone, a personal computer), through a network connection (e.g., a local network connection, a WiFi connection), manually via the control interface 20718, and the like.
Referring to Figure 209, an example use scenario may include a user 20902 wearing an NFC watch 20702B as an NFC relay device in wireless communication 20908A with a smartphone 20908 in the user's pocket, where the information for the data exchange is contained in the user's smartphone 20908. Without the NFC watch, the user would have to take the smartphone out of their pocket and hold it near the NFC-enabled POS device for the data exchange. Through use of the invention, the user 20902 may leave the smartphone 20908 in their pocket and only raise the NFC watch 20702B to the NFC-enabled POS device 20804, where the NFC watch 20702B communicates 20804A with the NFC-enabled POS device 20804, thereby enabling the transfer of information between the smartphone 20908 and the NFC-enabled POS device 20804 through the two communications channels 20804A and 20908A established via the NFC watch 20702B. In this configuration, the smartphone 20908 need not be NFC-enabled, because the smartphone 20908 only needs a communications link to the NFC watch 20702B via the mid-range communications link 20908A (e.g., utilizing Bluetooth or the like).
In embodiments, the NFC watch may communicate over non-NFC mid-range communications links with a plurality of other electronic devices, such as with a personal computer, a mobile computer, a mobile communications device, a navigation device, a body-worn electronic device, augmented reality eyewear, a head-worn electronic device, a home entertainment device, a home security device, a home automation device, a local network connection, a personal network connection, and the like. For example, the NFC watch may communicate with an eyepiece, such as an eyepiece including optics enabling a see-through display in which content provided from an integrated processor can be presented, and where aspects of the eyepiece can be controlled through one or more complex control techniques involving sensors, cameras, tactile interfaces, accessory devices, and the like. In embodiments, the NFC watch may pass transaction details to the eyewear for presentation in the eyewear. The eyewear control system may serve as the interface for any interaction required in the transaction.
In embodiments, the NFC watch may provide computing resources, such as a microcontroller, memory, I/O facilities independent of the communications links (e.g., a memory card, a wired connection), a wireless connection to a local network for updates, programmability, and the like. For example, the NFC relay device may provide memory storing a history of purchases, preferences, customer IDs, incentives, loyalty program data for merchandise, personal profiles, sales offers, redemption codes, and the like.
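One way the on-watch memory above might be organized is as a small record store for purchases and loyalty data. The structure and field names below are assumptions made for illustration; the patent does not specify a data layout.

```python
# Hypothetical on-watch store for purchase history and loyalty data.

class WatchStore:
    def __init__(self):
        self.purchases = []    # history of purchases
        self.preferences = {}  # user preferences
        self.loyalty = {}      # customer IDs, loyalty program data

    def record_purchase(self, merchant, amount):
        """Append one purchase record to the on-watch history."""
        self.purchases.append({"merchant": merchant, "amount": amount})

    def total_spent(self, merchant):
        """Sum spending at one merchant, e.g., to drive incentives."""
        return sum(p["amount"] for p in self.purchases
                   if p["merchant"] == merchant)

store = WatchStore()
store.record_purchase("CoffeeShop", 3.50)
store.record_purchase("CoffeeShop", 4.25)
store.total_spent("CoffeeShop")  # 7.75
```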
The methods and systems described herein, and especially embodiments of the inventive augmented reality eyepiece, may be adapted to communicate and receive communications through and/or over any electronic communications system or network. Examples of such electronic communications systems and network types, with their associated protocols, topologies, network elements, and the like, include the following: (1) wired networks, such as: (a) wide area networks (WANs) using leased lines and digital subscriber lines with protocols such as Point-to-Point Protocol (PPP), High-Level Data Link Control (HDLC), and Synchronous Data Link Control (SDLC); circuit switching using protocols such as PPP and ISDN; packet switching using protocols such as Frame Relay, X.25 (pre-OSI stack), Packet over Synchronous Optical Networking/Synchronous Digital Hierarchy (SONET/SDH), Multiprotocol Label Switching (MPLS), Switched Multi-megabit Data Service (SMDS), and switched Ethernet (e.g., 10GB, 100GB); cell relay using protocols such as Asynchronous Transfer Mode (ATM); and network elements such as routers, switches, hubs, and firewalls; (b) metropolitan area networks (MANs) using: protocols such as ATM, Fiber Distributed Data Interface (FDDI), SMDS, Metro Ethernet, and Distributed-Queue Dual-Bus (DQDB); topologies such as star, bus, mesh, ring, and tree; and network elements such as routers, switches, hubs, and firewalls; (c) local area networks (LANs) using, for example: high-speed serial interface protocols such as Ethernet (e.g., Ethernet, Fast, 1GB, 10GB, and 100GB); topologies such as star and tree; and network elements such as routers, switches, hubs, and firewalls; and (d) personal area networks (PANs) using technologies such as USB and FireWire; (2) wireless networks, such as: (a) wide area networks (WANs) using: standards such as RTT (CDMA), EDGE (GSM), EV-DO (CDMA/TDMA), Flash-OFDM (Flash-OFDM), GPRS (GSM), HSPA D and U (UMTS/3GSM), LTE (3GPP), UMTS-TDD (UMTS/3GSM), WiMAX (802.16), satellite, and mobile Internet generally in 3G and 4G; network elements such as base station subsystems, network and switching subsystems, GPRS core networks, operations support systems, subscriber identity modules (SIM), the Universal Terrestrial Radio Access Network (UTRAN), and core networks; and interfaces such as W-CDMA (UTRA-FDD) for UMTS, UTRA-TDD HCR for UMTS, TD-SCDMA for UMTS, the user-equipment interface for UMTS, radio resource control (radio link control, medium access control), and the Um interface (the GSM air interface with layers such as a physical layer with GMSK or 8PSK modulation, a data link layer such as LAPDm, and a network layer with radio resource, mobility management, and call control); (b) metropolitan area networks (MANs) using protocols such as WiMAX (802.16); (c) local area networks (LANs) using technologies such as Wi-Fi with modes such as ad hoc and infrastructure, OSI-layer mechanisms such as CSMA/CA, sub-technologies such as OFDM and spread spectrum, network elements such as routers, switches, hubs, firewalls, access points, and base stations, and clients such as personal computers, laptop computers, IP phones, mobile phones, and smartphones; and (d) personal area networks (PANs) using topologies such as star, tree, and mesh, and technologies such as: (i) Bluetooth (e.g., using roles such as master, slave, and simultaneous master/slave; protocol stacks such as core protocols, RFCOMM, telephony control protocols, and adopted protocols; mandatory protocols such as the Link Manager Protocol (LMP), the Logical Link Control and Adaptation Protocol (L2CAP), and the Service Discovery Protocol (SDP); pairing methods such as legacy pairing and Secure Simple Pairing; and air interfaces such as the license-free ISM band (2.402-2.480GHz)); (ii) the Infrared Data Association (IrDA) (e.g., using mandatory protocol stack layers (e.g., the infrared physical layer specification (IrPHY), the Infrared Link Access Protocol (IrLAP), and the Infrared Link Management Protocol (IrLMP)) or optional protocol stack layers (e.g., Tiny Transport Protocol (Tiny TP), Infrared Communications Protocol (IrCOMM), Object Exchange (OBEX), Infrared Local Area Network (IrLAN), IrSimple, and IrSimpleShot)); (iii) Wireless USB; (iv) Z-Wave (e.g., with a source-routed mesh network topology, one or more master-controller routes, and safety features and GFSK modulation); (v) ZigBee (e.g., with physical and medium access control layers defined in 802.15.4, applications such as the network layer, the application layer, ZigBee device objects, and manufacturer-defined application objects, and components using CSMA/CA); (vi) body area networks; and (vii) Wi-Fi; and (3) near-field communications (NFC), such as those operating at 13.56 MHz and as peer-to-peer network types, operating in accordance with ISO/IEC 18000-3, with data rates of 106 kbit/s to 424 kbit/s, and with passive and/or active communications modes. The methods and systems described herein, and especially embodiments with the inventive augmented reality eyepiece, may be applied to any or all aspects of a mobile device network management system, such as policy management, user management, profile management, business intelligence, event management, performance management, enterprise class, multi-platform support, mobile device management (including sub-aspects such as software and SaaS), security management (including sub-aspects such as certificate control (e.g., in association with email, applications, Wi-Fi access, and VPN access), password enforcement, device wipe, remote lock, audit trails/logging, centralized device configuration verification, jailbreak/root detection, secure containers, and application wrapping), platform support (e.g., Android, iOS, BlackBerry, Symbian, Windows Mobile, and Windows Phone), compliance management, software management (including sub-aspects such as application downloaders, application verification, application update support, application patch support, and application store support (e.g., enterprise applications and third-party applications)), and hardware/device management (e.g., including device enrollment (e.g., ownership, classification, registration, user authentication, EULA development, and restriction development), external memory blocking, and configuration change history). The methods and systems described herein, and especially embodiments with the inventive augmented reality eyepiece, may be applied in conjunction with any type of private, community, or hybrid cloud computing network or cloud computing environment, including those that involve features of software as a service (SaaS), platform as a service (PaaS), and/or infrastructure as a service (IaaS).
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software, program code, and/or instructions on a processor. The processor may be part of a server, cloud server, client, network infrastructure, mobile computing platform, stationary computing platform, or other computing platform. The processor may be any kind of computational or processing device capable of executing program instructions, codes, binary instructions, and the like. The processor may be or may include a signal processor, digital processor, embedded processor, microprocessor, or any variant such as a co-processor (math co-processor, graphics co-processor, communications co-processor, and the like) that may directly or indirectly facilitate the execution of program code or program instructions stored thereon. In addition, the processor may enable the execution of multiple programs, threads, and codes. Threads may be executed simultaneously to enhance the performance of the processor and to facilitate simultaneous operations of the application. Implementations, methods, program codes, program instructions, and the like described herein may be realized in one or more threads. A thread may spawn other threads that may have assigned priorities associated with them; the processor may execute these threads based on priority, or on any other order, based on the instructions provided in the program code. The processor may include memory that stores methods, codes, instructions, and programs as described herein and elsewhere. The processor may access, through an interface, a storage medium that may store methods, codes, and instructions as described herein and elsewhere. The storage medium associated with the processor, for storing methods, programs, codes, program instructions, or other types of instructions capable of being executed by the computing or processing device, may include but may not be limited to one or more of a CD-ROM, a DVD, memory, a hard disk, a flash drive, RAM, ROM, a cache, and the like.
A processor may include one or more cores that may enhance the speed and performance of a multiprocessor. In embodiments, the processor may be a dual-core processor, a quad-core processor, or another chip-level multiprocessor combining two or more independent cores (called a die), and the like.
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software on a server, client, firewall, gateway, hub, router, or other such computer and/or networking hardware. The software program may be associated with a server, which may include a file server, print server, domain server, Internet server, intranet server, and other variants such as a secondary server, host server, distributed server, and the like. The server may include one or more of: memories, processors, computer-readable media, storage media, ports (physical and virtual), communications devices, and interfaces capable of accessing other servers, clients, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the server. In addition, other devices required for the execution of the methods described in this application may be considered a part of the infrastructure associated with the server.
The server may provide an interface to other devices, including, without limitation: clients, other servers, printers, database servers, print servers, file servers, communications servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate the remote execution of programs across the network. The networking of some or all of these devices may facilitate the parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the server through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
The software program may be associated with a client, which may include a file client, print client, domain client, Internet client, intranet client, and other variants such as a secondary client, host client, distributed client, and the like. The client may include one or more of: memories, processors, computer-readable media, storage media, ports (physical and virtual), communications devices, and interfaces capable of accessing other clients, servers, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the client. In addition, other devices required for the execution of the methods described in this application may be considered a part of the infrastructure associated with the client.
The client may provide an interface to other devices, including, without limitation: servers, cloud servers, other clients, printers, database servers, print servers, file servers, communications servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate the remote execution of programs across the network. The networking of some or all of these devices may facilitate the parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the client through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
The methods and systems described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, cloud servers, routers, hubs, firewalls, clients, personal computers, communications devices, routing devices, and other active and passive devices, modules, and/or components as known in the art. The computing and/or non-computing devices associated with the network infrastructure may include, apart from other components, storage media such as flash memory, buffers, stacks, RAM, ROM, and the like. The processes, methods, program codes, and instructions described herein and elsewhere may be executed by one or more of the network infrastructure elements.
The methods, program codes, and instructions described herein and elsewhere may be implemented on a cellular network having multiple cells. The cellular network may either be a frequency division multiple access (FDMA) network or a code division multiple access (CDMA) network. The cellular network may include mobile devices, cell sites, base stations, repeaters, antennas, towers, and the like. The cellular network may be a GSM, GPRS, 3G, EVDO, mesh, or other network type.
The methods, program codes, and instructions described herein and elsewhere may be implemented on or through mobile devices. The mobile devices may include navigation devices, cell phones, mobile phones, mobile personal digital assistants, laptops, palmtops, netbooks, pagers, electronic book readers, music players, and the like. These devices may include, apart from other components, storage media such as flash memory, buffers, RAM, ROM, and one or more computing devices. The computing devices associated with the mobile devices may be enabled to execute program codes, methods, and instructions stored thereon. Alternatively, the mobile devices may be configured to execute instructions in collaboration with other devices. The mobile devices may communicate with base stations interfaced with servers and configured to execute program codes. The mobile devices may communicate on a peer-to-peer network, a mesh network, or another communications network. The program code may be stored on the storage medium associated with the server and executed by a computing device embedded within the server. The base station may include a computing device and a storage medium. The storage device may store program codes and instructions executed by the computing devices associated with the base station.
Computer software, program code and/or instruction can be stored on machine readable media and/or on machine readable media and access, and machine readable media can comprise: for retaining the computer module, equipment and the recording medium that are used to the numerical data of calculating into certain period; Be called as the semiconductor storage of random-access memory (ram); Be generally used for a mass storage device of more lasting storage, such as the form of the magnetic storage of CD, similar hard disk, tape, drum, card and other type; Processor register, cache memory, volatile memory, nonvolatile memory; Such as the optical memory of CD, DVD; Removable medium, for example, such as flash memory (, USB rod or key), floppy disk, tape, paper tape, card punch, independently ram disc, zip disk drive, removable large capacity storage, off-line etc.; Other computer memory, such as dynamic storage, static memory, read/write store, variable storage, read-only, random access, sequential access, location addressing, file addressing, content addressed, network attached storage, storage area network, bar code, magnetic China ink etc.
The methods and systems described herein may transform physical and/or intangible items from one state to another. The methods and systems described herein may also transform data representing physical and/or intangible items from one state to another.
The elements described and depicted herein, including in flow charts and block diagrams throughout the figures, imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and the functions thereof may be implemented on machines through computer-executable media having a processor capable of executing program instructions stored thereon as a monolithic software structure, as standalone software modules, or as modules that employ external routines, code, services, and so forth, or any combination of these, and all such implementations may be within the scope of the present disclosure. Examples of such machines may include, but may not be limited to: personal digital assistants, laptops, personal computers, mobile phones, other handheld computing devices, medical equipment, wired or wireless communications devices, transducers, chips, calculators, satellites, tablet PCs, electronic books, gadgets, electronic devices, devices having artificial intelligence, computing devices, networking equipment, servers, routers, and the like. Furthermore, the elements depicted in the flow charts and block diagrams, or any other logical component, may be implemented on a machine capable of executing program instructions. Thus, while the foregoing drawings and descriptions set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context. Similarly, it will be appreciated that the various steps identified and described above may be varied, and that the order of the steps may be adapted to particular applications of the techniques disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. As such, the depiction and/or description of an order for various steps should not be understood to require a particular order of execution for those steps, unless required by a particular application, or explicitly stated or otherwise clear from the context.
The methods and/or processes described above, and the steps thereof, may be realized in hardware, software, or any combination of hardware and software suitable for a particular application. The hardware may include a general-purpose computer and/or a dedicated computing device, or a specific computing device or a particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable devices, along with internal and/or external memory. The processes may also, or instead, be embodied in an application-specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as computer-executable code capable of being executed on a machine-readable medium.
The computer-executable code may be created using a structured programming language such as C, an object-oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled, or interpreted to run on one of the above devices, as well as on heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or on any other machine capable of executing program instructions.
Thus, in one aspect, each method described above, and combinations thereof, may be embodied in computer-executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. Accordingly, the spirit and scope of the present invention are not to be limited by the foregoing examples, but are to be understood in the broadest sense allowable by law.
All documents referenced herein are hereby incorporated by reference.

Claims (10)

1. A system, comprising:
an interactive head-mounted eyepiece worn by a user, wherein the eyepiece comprises an optical assembly through which the user views a surrounding environment and displayed content;
an integrated processor for handling content for display to the user; and
an integrated image source for introducing the content to the optical assembly;
wherein the processor is adapted to modify the content, the modification being made in response to a sensor input.
2. The system of claim 1, wherein the content is a video image.
3. The system of claim 2, wherein the modification is at least one of: adjusting brightness, adjusting color saturation, adjusting color balance, adjusting hue, adjusting video resolution, adjusting transparency, adjusting a compression ratio, adjusting a frames-per-second rate, isolating a portion of the video, stopping the video, pausing the video, or restarting the video.
4. The system of claim 1, wherein the sensor input is obtained from at least one of: a charge-coupled device, black silicon sensor, IR sensor, acoustic sensor, induction sensor, motion sensor, optical sensor, opacity sensor, proximity sensor, inductance sensor, eddy-current sensor, passive infrared proximity sensor, radar, capacitive displacement sensor, Hall-effect sensor, magnetic sensor, GPS sensor, thermal imaging sensor, thermocouple, thermistor, photoelectric sensor, ultrasonic sensor, infrared laser sensor, inertial motion sensor, MEMS internal motion sensor, ultrasonic 3D motion sensor, accelerometer, inclinometer, force sensor, piezoelectric sensor, rotary encoder, linear encoder, chemical sensor, ozone sensor, smoke sensor, heat sensor, magnetometer, carbon dioxide detector, carbon monoxide detector, oxygen sensor, glucose sensor, smoke detector, metal detector, rain sensor, altimeter, activity detector, object detector, marker detector, laser rangefinder, sonar, capacitive sensor, heart rate sensor, or RF/micropower impulse radio (MIR) sensor.
5. The system of claim 3, wherein playing of the content is stopped in response to an indication of the user's head motion from an accelerometer input.
8. The system of claim 4, wherein an audio sensor input is generated by the speech of at least one participant in a video conference.
9. The system of claim 4, wherein a visual sensor input is a video image of at least one participant in a video conference.
10. The system of claim 4, wherein a visual sensor input is a video image of a visual presentation.
11. The system of claim 9, wherein the modification is making the video image at least one of more and less transparent in response to an indication of the user's motion from a sensor.
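As an illustration only (not part of the patent text), the behavior recited in claims 3 and 5 — an integrated processor selecting a video modification from a sensor input — could be sketched as follows. The sensor names and threshold values are assumptions:

```python
def choose_modification(sensor: str, value: float) -> str:
    """Map one sensor reading to a content modification from the claim 3 list.

    Sensor names and threshold values are illustrative assumptions,
    not values taken from the patent.
    """
    if sensor == "accelerometer" and value > 2.0:
        # Claim 5: an accelerometer indication of head motion stops playback.
        return "stop_video"
    if sensor == "light" and value < 50.0:
        # Dim surroundings: lower the display brightness to match.
        return "adjust_brightness"
    if sensor == "proximity" and value < 0.5:
        # An object close to the wearer: make the video more transparent.
        return "adjust_transparency"
    return "no_change"
```

A real eyepiece would poll many of the claim 4 sensors and could apply several modifications per frame; the single-sensor dispatch above only mirrors the claim structure.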
12. A system, comprising:
An interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content;
An integrated processor for handling content for display to the user;
An integrated image source for introducing the content to the optical assembly;
Wherein the processor is adapted to modify the content, the modification being made in response to a sensor input; and
Further comprising an integrated video image capture facility that records an aspect of the surrounding environment while the content is being displayed.
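Claim 12 adds an integrated video image capture facility that records an aspect of the surrounding environment while content is shown. A toy model of that simultaneous display-and-record cycle (class and attribute names are illustrative assumptions, not from the patent):

```python
class SeeThroughEyepiece:
    """Toy model of claim 12: present content through the optical assembly
    while an integrated capture facility records the surrounding environment."""

    def __init__(self) -> None:
        self.shown = []     # frames delivered to the optical assembly
        self.captured = []  # environment frames from the capture facility

    def display_cycle(self, content_frame: str, environment_frame: str) -> int:
        """One cycle: display a content frame and record the environment.

        Returns the number of environment frames recorded so far.
        """
        self.shown.append(content_frame)
        self.captured.append(environment_frame)
        return len(self.captured)
```

The point of the model is only that display and capture advance together each cycle, which is what distinguishes claim 12 from claim 1.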
CN201280046955.XA 2011-09-26 2012-09-26 Video display modification based on sensor input for a see-through near-to-eye display Active CN103946732B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161539269P 2011-09-26 2011-09-26
US61/539,269 2011-09-26
PCT/US2012/057387 WO2013049248A2 (en) 2011-09-26 2012-09-26 Video display modification based on sensor input for a see-through near-to-eye display

Publications (2)

Publication Number Publication Date
CN103946732A true CN103946732A (en) 2014-07-23
CN103946732B CN103946732B (en) 2019-06-14

Family

ID=47996727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280046955.XA Active CN103946732B (en) 2011-09-26 2012-09-26 Video display modification based on sensor input for a see-through near-to-eye display

Country Status (5)

Country Link
EP (1) EP2761362A2 (en)
JP (1) JP2015504616A (en)
KR (1) KR20140066258A (en)
CN (1) CN103946732B (en)
WO (1) WO2013049248A2 (en)

Cited By (159)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104394317A (en) * 2014-11-20 2015-03-04 段然 Method for processing recorded images from head-worn recording equipment
CN104576709A (en) * 2015-02-03 2015-04-29 京东方科技集团股份有限公司 OLED (organic light-emitting diode) display substrate, method for manufacturing same and wearable equipment
CN104657103A (en) * 2015-03-16 2015-05-27 哈尔滨工业大学 Handheld CAVE projection system based on depth camera
CN104702911A (en) * 2014-11-24 2015-06-10 段然 Wearable video device real-time wireless transmission method
CN104731338A (en) * 2015-03-31 2015-06-24 深圳市虚拟现实科技有限公司 Closed type augmented and virtual reality system and method
CN104765456A (en) * 2015-04-08 2015-07-08 成都爱瑞斯文化传播有限责任公司 Virtual space system and building method thereof
CN105022980A (en) * 2015-07-28 2015-11-04 福建新大陆电脑股份有限公司 Barcode image identifying and reading device
CN105070204A (en) * 2015-07-24 2015-11-18 江苏天晟永创电子科技有限公司 Miniature AMOLED optical display
CN105091948A (en) * 2015-09-02 2015-11-25 徐艺斌 Multifunctional sensor module for myopia prevention frame
CN105117111A (en) * 2015-09-23 2015-12-02 小米科技有限责任公司 Rendering method and device for virtual reality interaction frames
CN105259655A (en) * 2015-09-10 2016-01-20 上海理鑫光学科技有限公司 3D video system improving authenticity of virtual and actual superposition
CN105319714A (en) * 2014-07-31 2016-02-10 精工爱普生株式会社 Display apparatus, method for controlling display apparatus, and program
CN105455285A (en) * 2015-12-31 2016-04-06 北京小鸟看看科技有限公司 Virtual reality helmet adaptation method
CN105487229A (en) * 2015-12-18 2016-04-13 济南中景电子科技有限公司 Multichannel interaction virtual reality glasses
CN105608436A (en) * 2015-12-23 2016-05-25 联想(北京)有限公司 Power consumption control method and electronic device
WO2016086439A1 (en) * 2014-12-04 2016-06-09 上海交通大学 Auto-aligning light-transmitting head-worn display device
CN105704501A (en) * 2016-02-06 2016-06-22 普宙飞行器科技(深圳)有限公司 Virtual reality live broadcast system based on unmanned aerial vehicle panoramic video
CN105718167A (en) * 2016-01-21 2016-06-29 陈佩珊 Icon migration method and system based on touch on a smart glasses temple
CN105739851A (en) * 2016-01-21 2016-07-06 陈佩珊 Icon migration method and system based on voice recognition in smart glasses
CN105740743A (en) * 2014-12-30 2016-07-06 手持产品公司 Augmented reality vision barcode scanning system and method
CN105938391A (en) * 2015-03-06 2016-09-14 松下电器(美国)知识产权公司 Wearable terminal and method for controlling the same
CN105976424A (en) * 2015-12-04 2016-09-28 乐视致新电子科技(天津)有限公司 Image rendering processing method and device
CN106094203A (en) * 2016-06-16 2016-11-09 捷开通讯(深圳)有限公司 VR system, wearable device for controlling VR equipment, and method thereof
CN106162206A (en) * 2016-08-03 2016-11-23 北京疯景科技有限公司 Panoramic recording and playback method and device
CN106200892A (en) * 2014-10-30 2016-12-07 联发科技股份有限公司 Virtual reality system, mobile device, wearable device, and method for processing an entry event
CN106203410A (en) * 2016-09-21 2016-12-07 上海星寰投资有限公司 Identity authentication method and system
CN106200972A (en) * 2016-07-14 2016-12-07 乐视控股(北京)有限公司 Method and device for adjusting virtual reality scene parameters
CN106251153A (en) * 2016-09-21 2016-12-21 上海星寰投资有限公司 Payment method and system
CN106326813A (en) * 2015-06-30 2017-01-11 深圳指芯智能科技有限公司 Intelligent frequency conversion 3D fingerprint sensor
CN106408303A (en) * 2016-09-21 2017-02-15 上海星寰投资有限公司 Payment method and system
CN106507121A (en) * 2016-10-31 2017-03-15 易瓦特科技股份公司 Live broadcast control method, VR device, and unmanned aerial vehicle
CN106507128A (en) * 2015-09-08 2017-03-15 科理特株式会社 Virtual reality image transmission method, playback method, and program using the same
CN106575358A (en) * 2014-08-05 2017-04-19 康蒂-特米克微电子有限公司 Driver assistance system
CN106597673A (en) * 2017-02-28 2017-04-26 京东方科技集团股份有限公司 Virtual reality display apparatus, and driving method and driving module thereof
CN106603107A (en) * 2016-12-21 2017-04-26 惠州Tcl移动通信有限公司 Head-mounted device and control method thereof
CN106651355A (en) * 2016-11-08 2017-05-10 北京小米移动软件有限公司 Payment method and device, and virtual reality helmet
CN106681955A (en) * 2017-01-04 2017-05-17 四川埃姆克伺服科技有限公司 Universal interface circuit for receiving signals from a servo motor position sensor
CN106790579A (en) * 2016-12-28 2017-05-31 苏州商信宝信息科技有限公司 Active information transfer method and system based on smart glasses
WO2017092396A1 (en) * 2015-12-01 2017-06-08 深圳市掌网科技股份有限公司 Virtual reality interaction system and method
CN106842576A (en) * 2017-03-23 2017-06-13 核桃智能科技(常州)有限公司 Head-worn smart display device with mobile communication function
CN106846383A (en) * 2017-01-23 2017-06-13 宁波诺丁汉大学 High dynamic range imaging method based on a 3D digital microscopic imaging system
CN106864362A (en) * 2017-01-18 2017-06-20 陈宗坤 Warning sign with air purification
CN106871973A (en) * 2017-04-21 2017-06-20 佛山市川东磁电股份有限公司 Temperature and humidity sensor
CN106932906A (en) * 2017-03-04 2017-07-07 国家电网公司 Mixed reality display device
CN106934361A (en) * 2017-03-06 2017-07-07 苏州佳世达光电有限公司 Identification method and electronic device
CN107003824A (en) * 2014-10-30 2017-08-01 语音处理解决方案有限公司 Control device for a dictation machine
CN107015655A (en) * 2017-04-11 2017-08-04 苏州和云观博数字科技有限公司 AR experience glasses device for museum virtual scenes and implementation method thereof
CN107071285A (en) * 2017-05-16 2017-08-18 广东交通职业技术学院 Follow-shooting method, memory, and unmanned aerial vehicle with follow-shooting device
CN107113071A (en) * 2014-11-11 2017-08-29 索尼公司 Dynamic user recommendations for BAN-enabled media experiences
CN107193381A (en) * 2017-05-31 2017-09-22 湖南工业大学 Smart glasses based on eye-tracking sensing technology and display method thereof
CN107210823A (en) * 2015-02-03 2017-09-26 索尼公司 Methods, devices and systems for collecting writing patterns using a BAN
CN107302845A (en) * 2014-09-22 2017-10-27 爱父爱斯吉尔有限公司 Low-latency simulation apparatus and method using direction prediction, and computer program therefor
CN107306332A (en) * 2016-04-19 2017-10-31 奥多比公司 Image compensation for an occluded direct-view augmented reality system
CN107402378A (en) * 2016-05-19 2017-11-28 财团法人金属工业研究发展中心 Frequency-modulated (FM) radar transceiver
CN107422480A (en) * 2017-08-03 2017-12-01 深圳市汇龙天成科技有限公司 Semi-transparent, semi-reflective toroidal lens display structure and display method
CN107430479A (en) * 2015-03-31 2017-12-01 索尼公司 Information processing apparatus, information processing method, and program
CN107544661A (en) * 2016-06-24 2018-01-05 联想(北京)有限公司 Information processing method and electronic device
CN107609492A (en) * 2017-08-25 2018-01-19 西安电子科技大学 Perceptual quality evaluation method for distorted images based on EEG signals
CN107660283A (en) * 2015-04-03 2018-02-02 甲骨文国际公司 Method and system for implementing a log parser in a log analytics system
CN107679380A (en) * 2017-06-22 2018-02-09 国网浙江平湖市供电公司 Intelligent inspection device and method based on identity recognition
CN107710009A (en) * 2015-02-27 2018-02-16 威尔乌集团 Controller visualization in virtual and augmented reality environment
CN107810646A (en) * 2015-06-24 2018-03-16 微软技术许可有限责任公司 Filtering sound for conference applications
CN107885311A (en) * 2016-09-29 2018-04-06 深圳纬目信息技术有限公司 Visual interaction confirmation method, system, and device
CN107923757A (en) * 2015-09-25 2018-04-17 苹果公司 Non-solid object monitoring
CN107924522A (en) * 2015-06-24 2018-04-17 奇跃公司 Augmented reality equipment, system and method for purchase
CN108028038A (en) * 2015-10-05 2018-05-11 三美电机株式会社 Display device
CN108089324A (en) * 2016-11-22 2018-05-29 霍尼韦尔国际公司 NTE display systems and method with optical tracker
CN108122248A (en) * 2018-01-15 2018-06-05 武汉大学 Dam natural frequency of vibration recognition methods based on video measuring
CN108136258A (en) * 2015-10-28 2018-06-08 微软技术许可有限责任公司 Adjusting image frames based on tracking motion of eyes
CN108169901A (en) * 2017-12-27 2018-06-15 北京传嘉科技有限公司 VR glasses
CN108255295A (en) * 2016-12-28 2018-07-06 意美森公司 Haptic effect generation for space-dependent content
CN108337573A (en) * 2018-03-26 2018-07-27 京东方科技集团股份有限公司 Implementation method and medium for real-time commentary of sports events
CN108398791A (en) * 2018-03-29 2018-08-14 陈超平 Near-eye display device based on polarized contact lenses
CN108427830A (en) * 2018-02-09 2018-08-21 中建五局第三建设有限公司 Method and device for spatial setting-out of construction objects guided by mixed reality technology
CN108433724A (en) * 2017-02-16 2018-08-24 三星电子株式会社 Method for providing a service based on biometric information, and wearable electronic device
CN108459812A (en) * 2018-01-22 2018-08-28 郑州升达经贸管理学院 Art trajectory display and tracking system and method
CN108479056A (en) * 2018-03-05 2018-09-04 成都看客网络技术有限公司 Online claw machine for blind users
US10088911B2 (en) 2016-12-30 2018-10-02 Manuel Saez Programmable electronic helmet
CN108762490A (en) * 2017-05-09 2018-11-06 苏州乐轩科技有限公司 Device for mixed reality
CN108798360A (en) * 2018-02-01 2018-11-13 李绍辉 Rapid smoke dispersal method based on communication technology
CN108803877A (en) * 2018-06-11 2018-11-13 联想(北京)有限公司 Switching method, device and electronic equipment
CN108885339A (en) * 2015-12-31 2018-11-23 汤姆逊许可公司 Configuration for rendering virtual reality using an adaptive focal plane
CN108885521A (en) * 2016-03-29 2018-11-23 微软技术许可有限责任公司 Cross-environment sharing
CN108958461A (en) * 2017-05-24 2018-12-07 宏碁股份有限公司 Virtual reality system with adaptive control and control method thereof
CN108939316A (en) * 2017-05-17 2018-12-07 维申Rt有限公司 Patient monitoring system
CN108983636A (en) * 2018-06-20 2018-12-11 浙江大学 Human-machine intelligent symbiosis platform system
CN108982062A (en) * 2018-06-14 2018-12-11 上海卫星工程研究所 Field-of-view alignment method for a linear-array optical imaging payload in satellite stray light testing
CN109074212A (en) * 2016-04-26 2018-12-21 索尼公司 Information processing apparatus, information processing method, and program
CN109215132A (en) * 2017-06-30 2019-01-15 华为技术有限公司 Method and device for implementing an augmented reality service
CN109407325A (en) * 2018-12-21 2019-03-01 周桂兵 Multipurpose VR smart glasses and display method thereof
CN109425989A (en) * 2017-08-21 2019-03-05 精工爱普生株式会社 Deflection device, display device, and method of manufacturing the deflection device
CN109559541A (en) * 2018-11-20 2019-04-02 华东交通大学 Unmanned vehicle route management system
WO2019061825A1 (en) * 2017-09-29 2019-04-04 歌尔股份有限公司 Vr/ar head-mounted device
CN109661594A (en) * 2016-08-22 2019-04-19 苹果公司 Intermediate range optical system for remote sensing receiver
CN109696747A (en) * 2019-01-16 2019-04-30 京东方科技集团股份有限公司 VR display device and control method thereof
CN109791391A (en) * 2016-07-24 2019-05-21 光场实验室公司 Calibration method for holographic energy guidance system
TWI660630B (en) * 2017-12-06 2019-05-21 瑞昱半導體股份有限公司 Method and system for detecting video scan type
CN109808711A (en) * 2018-12-25 2019-05-28 南京师范大学 Automatic driving vehicle control method and system, automatic driving vehicle and vision prosthesis
CN109886170A (en) * 2019-02-01 2019-06-14 长江水利委员会长江科学院 Intelligent detection, identification, and statistics system for Oncomelania snails
CN110036635A (en) * 2016-12-28 2019-07-19 微软技术许可有限责任公司 Systems, methods, and computer-readable media for mitigating motion sickness via an enhanced display for passengers using a video capture device
CN110110458A (en) * 2019-05-14 2019-08-09 西安电子科技大学 Deformation-based modeling method for conformal array antennas using high-order MoM
CN110175065A (en) * 2019-05-29 2019-08-27 广州视源电子科技股份有限公司 User interface display method, apparatus, device, and storage medium
CN110197142A (en) * 2019-05-16 2019-09-03 谷东科技有限公司 Object recognition method, apparatus, medium, and terminal device under low-light conditions
CN110197601A (en) * 2019-04-24 2019-09-03 薄涛 Mixed reality glasses, mobile terminal, and tutoring system, method, and medium
CN110210390A (en) * 2019-05-31 2019-09-06 维沃移动通信有限公司 Fingerprint acquisition module, fingerprint acquisition method, and terminal
US20190293746A1 (en) * 2018-03-26 2019-09-26 Electronics And Telecomunications Research Institute Electronic device for estimating position of sound source
CN110352370A (en) * 2017-03-07 2019-10-18 苹果公司 Head-mounted display system
CN110363205A (en) * 2019-06-25 2019-10-22 浙江大学 Image feature extraction system and method based on Talbot-effect optical convolution
CN110361707A (en) * 2019-08-09 2019-10-22 成都玖锦科技有限公司 Dynamic simulation method for the motion state of a radiation source
CN110794644A (en) * 2018-08-03 2020-02-14 扬明光学股份有限公司 Optical device and method for manufacturing the same
CN110869901A (en) * 2017-05-08 2020-03-06 Lg电子株式会社 User interface device for vehicle and vehicle
TWI687953B (en) * 2018-12-05 2020-03-11 宏碁股份有限公司 Key structure and mode switching method thereof
TWI687721B (en) * 2010-11-08 2020-03-11 盧森堡商喜瑞爾工業公司 Display device
CN111048215A (en) * 2019-12-13 2020-04-21 北京纵横无双科技有限公司 CRM-based medical video production method and system
CN111144921A (en) * 2018-11-06 2020-05-12 丰田自动车株式会社 Information processing apparatus, information processing method, and non-transitory storage medium
CN111160105A (en) * 2019-12-03 2020-05-15 北京文香信息技术有限公司 Video image monitoring method, device, equipment and storage medium
CN111179301A (en) * 2019-12-23 2020-05-19 北京中广上洋科技股份有限公司 Motion trend analysis method based on computer video
CN111307464A (en) * 2018-12-11 2020-06-19 劳斯莱斯有限公司 Inspection system
CN111317257A (en) * 2020-03-25 2020-06-23 黑龙江工业学院 Multimedia teacher desk for special children education
CN111426283A (en) * 2020-04-14 2020-07-17 昆山金智汇坤建筑科技有限公司 Laser scanning equipment for building site measurement
CN111433657A (en) * 2017-12-28 2020-07-17 深圳市柔宇科技有限公司 Diopter adjusting device and electronic equipment
CN111656334A (en) * 2018-01-29 2020-09-11 美光科技公司 Memory controller with programmable atomic operation
CN111665622A (en) * 2019-03-06 2020-09-15 株式会社理光 Optical device, retina projection display device, and head-mounted display device
CN111708170A (en) * 2020-07-10 2020-09-25 温州明镜智能科技有限公司 Novel VR glasses lens integrated configuration
CN111857328A (en) * 2019-04-30 2020-10-30 苹果公司 Head-mounted device
CN111897425A (en) * 2014-07-31 2020-11-06 三星电子株式会社 Wearable glasses and method for providing information using the same
TWI711005B (en) * 2019-03-14 2020-11-21 宏碁股份有限公司 Method for adjusting luminance of images and computer program product
CN112213856A (en) * 2014-07-31 2021-01-12 三星电子株式会社 Wearable glasses and method of displaying image via wearable glasses
US10891919B2 (en) 2017-06-26 2021-01-12 Boe Technology Group Co., Ltd. Display system and image display method
CN112327313A (en) * 2020-01-14 2021-02-05 必虎嘉骁光电技术(重庆)有限公司 Binocular range finder
CN112370240A (en) * 2020-12-01 2021-02-19 創啟社會科技有限公司 Auxiliary intelligent glasses and system for vision impairment and control method thereof
CN112433187A (en) * 2019-08-26 2021-03-02 通用电气精准医疗有限责任公司 MRI system comprising a patient motion sensor
CN112462932A (en) * 2019-09-06 2021-03-09 苹果公司 Gesture input system with wearable or handheld device based on self-mixing interferometry
CN112601993A (en) * 2018-08-26 2021-04-02 鲁姆斯有限公司 Reflection suppression in near-eye displays
CN112819590A (en) * 2021-02-25 2021-05-18 紫光云技术有限公司 Method for managing product configuration information in cloud product service delivery process
CN112807654A (en) * 2020-12-05 2021-05-18 泰州可以信息科技有限公司 Electronic judgment platform and method for heel-and-toe walking race
CN112868023A (en) * 2018-10-15 2021-05-28 艾玛迪斯简易股份公司 Augmented reality system and method
CN112882233A (en) * 2015-05-19 2021-06-01 奇跃公司 Double composite light field device
CN112904803A (en) * 2021-01-15 2021-06-04 西安电子科技大学 Multi-splicing-surface deformation and flatness fine adjustment system, method, equipment and application
CN113064280A (en) * 2021-04-08 2021-07-02 恒玄科技(上海)股份有限公司 Intelligent display device
CN113115008A (en) * 2021-05-17 2021-07-13 哈尔滨商业大学 Pipe gallery master-slave operation inspection system and method based on rapid tracking registration augmented reality technology
CN113240818A (en) * 2021-04-29 2021-08-10 广东元一科技实业有限公司 Method for simulating and displaying dummy model clothes
TWI740083B (en) * 2018-12-27 2021-09-21 雅得近顯股份有限公司 Low-light environment display structure
US20210312842A1 (en) * 2018-12-20 2021-10-07 Ns West Inc. Display light emission device, head-up display device, image display system, and helmet
CN113569645A (en) * 2021-06-28 2021-10-29 广东技术师范大学 Track generation method, device and system based on image detection
CN113759555A (en) * 2015-10-05 2021-12-07 迪吉伦斯公司 Waveguide display
TWI769815B (en) * 2021-02-03 2022-07-01 大立光電股份有限公司 Plastic light-folding element, imaging lens assembly module and electronic device
CN115049643A (en) * 2022-08-11 2022-09-13 武汉精立电子技术有限公司 Near-to-eye display module interlayer foreign matter detection method, device, equipment and storage medium
CN115242304A (en) * 2015-12-30 2022-10-25 艾伦神火公司 Optical narrowcast
US11556010B1 (en) * 2022-04-01 2023-01-17 Wen-Tsun Wu Mini display device
US11562711B2 (en) * 2020-01-31 2023-01-24 Microchip Technology Incorporated Heads-up display using electrochromic elements
US11631380B2 (en) 2018-03-14 2023-04-18 Sony Corporation Information processing apparatus, information processing method, and recording medium
CN116186418A (en) * 2023-04-27 2023-05-30 深圳市夜行人科技有限公司 Low-light imaging system recommendation method, system and medium
US20230168403A1 (en) * 2020-04-21 2023-06-01 Inova Ltd. Motion Aware Nodal Seismic Unit and Related Methods
CN116244238A (en) * 2023-05-12 2023-06-09 中国船舶集团有限公司第七〇七研究所 RS422 protocol and RS232 protocol compatible method and circuit for fiber optic gyroscope
TWI807066B (en) * 2019-07-08 2023-07-01 怡利電子工業股份有限公司 Glasses-free 3D reflective diffuser head-up display device
CN117195738A (en) * 2023-09-27 2023-12-08 广东翼景信息科技有限公司 Base station antenna setting and upper dip angle optimizing method for unmanned aerial vehicle corridor
US11899238B2 (en) 2019-08-29 2024-02-13 Digilens Inc. Evacuated gratings and methods of manufacturing
US11908356B2 (en) * 2021-12-15 2024-02-20 Motorola Mobility Llc Augmented reality display device having contextual adaptive brightness

Families Citing this family (354)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US20150277120A1 (en) 2014-01-21 2015-10-01 Osterhout Group, Inc. Optical configurations for head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
EP2539759A1 (en) 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
CN105122094B (en) * 2012-11-01 2018-06-12 依视路国际公司 Thermally influenced changeable tint device
GB2515460B (en) 2013-04-12 2016-01-06 Two Trees Photonics Ltd Near-eye device
US9417471B2 (en) * 2013-04-30 2016-08-16 Research Frontiers Incorporated Method and device for protecting objects from degradation by light with suspended particle device light valves
US9964844B2 (en) * 2013-05-09 2018-05-08 Imax Corporation Methods and systems of vibrating a screen
US9280972B2 (en) * 2013-05-10 2016-03-08 Microsoft Technology Licensing, Llc Speech to text conversion
US9740030B2 (en) * 2013-05-23 2017-08-22 Omnivision Technologies, Inc. Near-eye display systems, devices and methods
WO2014193326A1 (en) * 2013-05-29 2014-12-04 Baltaci Cetin Ozgur System for forming a virtual image
KR102249577B1 (en) * 2013-05-30 2021-05-07 찰스 안소니 스미스 Hud object design and method
CN103336435B (en) * 2013-06-19 2015-10-28 河海大学常州校区 Adaptive fuzzy sliding-mode control method for a gyroscope based on attitude rate estimation
JP2015009630A (en) * 2013-06-27 2015-01-19 庸 菊池 Vehicle inspection recording unit
JP6205189B2 (en) * 2013-06-28 2017-09-27 オリンパス株式会社 Information presentation system and method for controlling information presentation system
US9563331B2 (en) * 2013-06-28 2017-02-07 Microsoft Technology Licensing, Llc Web-like hierarchical menu display configuration for a near-eye display
JP6252002B2 (en) * 2013-07-11 2017-12-27 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
TW201502581A (en) 2013-07-11 2015-01-16 Seiko Epson Corp Head mounted display device and control method for head mounted display device
ES2576489T3 (en) * 2013-08-02 2016-07-07 Essilor International (Compagnie Générale d'Optique) A method to control a programmable ophthalmic lens device
KR20150018264A (en) * 2013-08-09 2015-02-23 엘지전자 주식회사 Wearable glass-type device and control method thereof
JP6111932B2 (en) * 2013-08-26 2017-04-12 ソニー株式会社 Action support device, action support method, program, and storage medium
JP6337433B2 (en) * 2013-09-13 2018-06-06 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
US9311545B2 (en) 2013-09-18 2016-04-12 Blackberry Limited Multicolor biometric scanning user interface
US9418273B2 (en) 2013-09-18 2016-08-16 Blackberry Limited Structure for multicolor biometric scanning user interface
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
US9763071B2 (en) * 2013-09-22 2017-09-12 Ricoh Company, Ltd. Mobile information gateway for use in emergency situations or with special equipment
US20150088547A1 (en) * 2013-09-22 2015-03-26 Ricoh Company, Ltd. Mobile Information Gateway for Home Healthcare
KR102088020B1 (en) 2013-09-26 2020-03-11 엘지전자 주식회사 A head mounted display ant the method of controlling thereof
EP4321915A2 (en) * 2013-10-16 2024-02-14 Magic Leap, Inc. Virtual or augmented reality headsets having adjustable interpupillary distance
CN103536279A (en) * 2013-10-22 2014-01-29 德赛电子(惠州)有限公司 Intelligent wristband and adaptive method thereof
US10258256B2 (en) 2014-12-09 2019-04-16 TechMah Medical Bone reconstruction and orthopedic implants
US9420178B2 (en) 2013-12-20 2016-08-16 Qualcomm Incorporated Thermal and power management
US9448621B2 (en) * 2013-12-20 2016-09-20 Nokia Technologies Oy Causation of display of information on a see through display
WO2015104446A1 (en) 2014-01-10 2015-07-16 Nokia Technologies Oy Display of a visual representation of a view
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
JP6264542B2 (en) * 2014-01-30 2018-01-24 任天堂株式会社 Information processing apparatus, information processing program, information processing system, and information processing method
EP3100240B1 (en) 2014-01-31 2018-10-31 Empire Technology Development LLC Evaluation of augmented reality skins
WO2015116183A2 (en) * 2014-01-31 2015-08-06 Empire Technology Development, Llc Subject selected augmented reality skin
WO2015116182A1 (en) 2014-01-31 2015-08-06 Empire Technology Development, Llc Augmented reality skin evaluation
WO2015117043A1 (en) 2014-01-31 2015-08-06 Magic Leap, Inc. Multi-focal display system and method
EP3100226A4 (en) 2014-01-31 2017-10-25 Empire Technology Development LLC Augmented reality skin manager
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
CN106233328B (en) 2014-02-19 2020-05-12 埃弗加泽公司 Apparatus and method for improving, enhancing or augmenting vision
JP2015166816A (en) * 2014-03-04 2015-09-24 富士通株式会社 Display device, display control program, and display control method
WO2015134738A1 (en) * 2014-03-05 2015-09-11 Arizona Board Of Regents On Behalf Of The University Of Arizona Wearable 3d augmented reality display
US11408699B2 (en) 2014-03-21 2022-08-09 Armaments Research Company Inc. Firearm usage monitoring system
EP3125073B1 (en) * 2014-03-26 2020-11-18 Sony Corporation Sensory feedback introducing device, sensory feedback introducing system, and sensory feedback introduction method
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
CN106471860B (en) * 2014-04-09 2020-03-31 Lg电子株式会社 Mobile terminal and method for controlling the same
EP3132379B1 (en) * 2014-04-15 2018-11-07 Huntington Ingalls Incorporated System and method for augmented reality display of dynamic environment information
DE102014207490B3 (en) * 2014-04-17 2015-07-02 Carl Zeiss Ag Spectacle lens for a display device to be placed on the head of a user and an image-generating display device and display device with such a spectacle lens
US9734403B2 (en) 2014-04-25 2017-08-15 Huntington Ingalls Incorporated Augmented reality display of dynamic target object information
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9864909B2 (en) 2014-04-25 2018-01-09 Huntington Ingalls Incorporated System and method for using augmented reality display in surface treatment procedures
CN103941953B (en) * 2014-04-28 2017-10-31 北京智谷睿拓技术服务有限公司 Information processing method and device
JPWO2015170555A1 (en) * 2014-05-09 2017-04-20 アルプス電気株式会社 Eyeglass-type electronic equipment
US9635257B2 (en) * 2014-05-12 2017-04-25 Gopro, Inc. Dual-microphone camera
CN103984413B (en) * 2014-05-19 2017-12-08 北京智谷睿拓技术服务有限公司 Information interacting method and information interactive device
US9710151B2 (en) 2014-05-21 2017-07-18 International Business Machines Corporation Evaluation of digital content using non-intentional user feedback obtained through haptic interface
US9323331B2 (en) 2014-05-21 2016-04-26 International Business Machines Corporation Evaluation of digital content using intentional user feedback obtained through haptic interface
US9600073B2 (en) 2014-05-21 2017-03-21 International Business Machines Corporation Automated adjustment of content composition rules based on evaluation of user feedback obtained through haptic interface
AU2015266670B2 (en) 2014-05-30 2019-05-09 Magic Leap, Inc. Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality
EP4235252A1 (en) 2014-05-30 2023-08-30 Magic Leap, Inc. Methods and system for creating focal planes in virtual augmented reality
WO2015191346A1 (en) 2014-06-09 2015-12-17 Huntington Ingalls Incorporated System and method for augmented reality display of electrical system information
US10504294B2 (en) 2014-06-09 2019-12-10 Huntington Ingalls Incorporated System and method for augmented reality discrepancy determination and reporting
US10915754B2 (en) 2014-06-09 2021-02-09 Huntington Ingalls Incorporated System and method for use of augmented reality in outfitting a dynamic structural space
CN103976715B (en) * 2014-06-09 2016-01-20 江苏启润科技有限公司 The healthy self-checking system of multifunctional human
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
EP4206870A1 (en) * 2014-06-14 2023-07-05 Magic Leap, Inc. Method for updating a virtual world
JP6292478B2 (en) * 2014-06-17 2018-03-14 コニカミノルタ株式会社 Information display system having transmissive HMD and display control program
EP2957983A1 (en) * 2014-06-18 2015-12-23 Alcatel Lucent User-wearable electronic device and system for personal computing
KR102209512B1 (en) * 2014-06-30 2021-01-29 엘지전자 주식회사 Glasses type mobile terminal
WO2016004385A1 (en) 2014-07-02 2016-01-07 IDx, LLC Systems and methods for alignment of the eye for ocular imaging
KR102506455B1 (en) * 2014-07-10 2023-03-07 모하메드 라쉬완 마푸즈 Bone reconstruction and orthopedic implants
KR101629758B1 (en) * 2014-07-11 2016-06-24 넥시스 주식회사 Method and program with the unlock system of wearable glass device
WO2016006949A1 (en) * 2014-07-11 2016-01-14 넥시스 주식회사 System and method for processing data using wearable device
US9898867B2 (en) 2014-07-16 2018-02-20 Huntington Ingalls Incorporated System and method for augmented reality display of hoisting and rigging information
US10217258B2 (en) 2014-07-22 2019-02-26 Lg Electronics Inc. Head mounted display and control method thereof
EP2977855B1 (en) * 2014-07-23 2019-08-28 Wincor Nixdorf International GmbH Virtual keyboard and input method for a virtual keyboard
CN104090385B (en) * 2014-07-25 2015-11-18 金陵科技学院 A kind of anti-cheating intelligent glasses
KR102433291B1 (en) * 2014-07-31 2022-08-17 삼성전자주식회사 Method and wearable glasses for providing a content
US10379338B2 (en) 2014-09-11 2019-08-13 Huawei Technologies Co., Ltd. Mobile terminal with a periscope optical zoom lens
JP6346537B2 (en) * 2014-09-29 2018-06-20 株式会社Nttドコモ Travel plan output system
WO2016060293A1 (en) * 2014-10-15 2016-04-21 엘지전자 주식회사 Image information display device and control method therefor
US9283138B1 (en) 2014-10-24 2016-03-15 Keith Rosenblum Communication techniques and devices for massage therapy
BR212016005938U2 (en) * 2014-11-03 2018-02-06 Jose Evangelista Terrabuio Junior Immersive augmented virtual reality glasses for use with smartphones, tablets, phablets and/or mobile-screen CPUs
US10286308B2 (en) 2014-11-10 2019-05-14 Valve Corporation Controller visualization in virtual and augmented reality environments
US9366883B2 (en) 2014-11-13 2016-06-14 International Business Machines Corporation Using google glass to project a red overlay that enhances night vision
US9811954B2 (en) 2014-12-02 2017-11-07 Honeywell International, Inc. Near-to-eye display systems and methods for verifying aircraft components
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
KR20170081272A (en) * 2014-12-18 2017-07-11 페이스북, 인크. Method, system and device for navigating in a virtual reality environment
KR102362727B1 (en) 2014-12-18 2022-02-15 엘지이노텍 주식회사 Apparatus for measuring user's pulse, and computing apparatus using the apparatus
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
JP6451322B2 (en) * 2015-01-06 2019-01-16 セイコーエプソン株式会社 Image display device
CA2972856A1 (en) 2015-01-08 2016-07-14 Ashkelon Eyewear Technologies Ltd An apparatus and method for displaying content
KR102320737B1 (en) 2015-01-14 2021-11-03 삼성디스플레이 주식회사 Head mounted electronic device
JP6746590B2 (en) 2015-01-26 2020-08-26 マジック リープ, インコーポレイテッドMagic Leap,Inc. Virtual and augmented reality system and method with improved grating structure
KR20160093529A (en) 2015-01-29 2016-08-08 유퍼스트(주) A wearable device for hearing impairment person
EP3054371A1 (en) * 2015-02-06 2016-08-10 Nokia Technologies OY Apparatus, method and computer program for displaying augmented information
CN105988562A (en) * 2015-02-06 2016-10-05 刘小洋 Intelligent wearing equipment and method for realizing gesture entry based on same
KR102309451B1 (en) * 2015-02-13 2021-10-07 주식회사 엘지유플러스 Wearable Device and Control Method of Displaying on the Device Thereof
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
NZ773847A (en) * 2015-03-16 2022-07-01 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
EP3264203A4 (en) * 2015-03-20 2018-07-18 Huawei Technologies Co. Ltd. Intelligent interaction method, equipment and system
JP6683367B2 (en) 2015-03-30 2020-04-22 国立大学法人東北大学 Biological information measuring device, biological information measuring method, and biological information measuring program
WO2016158624A1 (en) * 2015-03-30 2016-10-06 国立大学法人東北大学 Biological information measurement device, biological information measurement method, biological information display device and biological information display method
JP6642568B2 (en) * 2015-04-20 2020-02-05 日本電気株式会社 Target identification system, target identification method and program
JP6426525B2 (en) * 2015-04-20 2018-11-21 ファナック株式会社 Display system
KR102365492B1 (en) * 2015-04-22 2022-02-18 삼성전자주식회사 Wearable device
JP6646361B2 (en) * 2015-04-27 2020-02-14 ソニーセミコンダクタソリューションズ株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
IL244255A (en) 2016-02-23 2017-04-30 Vertical Optics Llc Wearable vision redirecting devices
US9690119B2 (en) 2015-05-15 2017-06-27 Vertical Optics, LLC Wearable vision redirecting devices
CN107615214B (en) * 2015-05-21 2021-07-13 日本电气株式会社 Interface control system, interface control device, interface control method, and program
IL239191A0 (en) * 2015-06-03 2015-11-30 Amir B Geva Image classification system
CN104883543A (en) * 2015-06-04 2015-09-02 段然 Data acquisition system for uncompressed image transmission
CN104967887B (en) * 2015-06-06 2018-03-30 深圳市虚拟现实科技有限公司 Information interacting method and virtual reality glasses based on NFC
KR102196507B1 (en) 2015-06-09 2020-12-30 한국전자통신연구원 Apparatus for visible light communication using electrically switchable glass and method using same
KR102586069B1 (en) * 2015-07-03 2023-10-05 에씰로 앙터나시오날 Methods and systems for augmented reality
CN105093555B (en) * 2015-07-13 2018-08-14 深圳多新哆技术有限责任公司 Short distance optical amplifier module and the nearly eye display optics module for using it
KR20170014028A (en) 2015-07-28 2017-02-08 현대자동차주식회사 Hands-free inspection apparatus and method for controlling the same
CA2995978A1 (en) 2015-08-18 2017-02-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
KR102260483B1 (en) * 2015-08-25 2021-06-04 한국전자기술연구원 Smart glasses for display a fly guide information
CN105100745B (en) * 2015-08-31 2018-03-23 国网浙江省电力公司湖州供电公司 A kind of transformer station's monitoring operation device
JP2017049762A (en) 2015-09-01 2017-03-09 株式会社東芝 System and method
EP3138478B1 (en) * 2015-09-01 2023-11-01 Essilor International A heart rate sensing wearable device
EP3145168A1 (en) * 2015-09-17 2017-03-22 Thomson Licensing An apparatus and a method for generating data representing a pixel beam
CN108027656B (en) 2015-09-28 2021-07-06 日本电气株式会社 Input device, input method, and program
CN108027654B (en) 2015-09-28 2021-01-12 日本电气株式会社 Input device, input method, and program
US20190121515A1 (en) * 2015-10-15 2019-04-25 Sony Corporation Information processing device and information processing method
CN105455792B (en) * 2015-12-18 2018-09-25 济南中景电子科技有限公司 Headband for virtual reality glasses
CA3006274A1 (en) * 2015-12-22 2017-06-29 E-Vision Smart Optics, Inc. Dynamic focusing head mounted display
AU2017206021B2 (en) * 2016-01-07 2021-10-21 Magic Leap, Inc. Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes
KR102610120B1 (en) * 2016-01-20 2023-12-06 삼성전자주식회사 Head mounted display and control method thereof
WO2017138545A1 (en) 2016-02-08 2017-08-17 日本電気株式会社 Information processing system, information processing device, control method, and program
JP6350772B2 (en) 2016-02-25 2018-07-04 日本電気株式会社 Information processing system, information processing apparatus, control method, and program
CN205582205U (en) * 2016-03-02 2016-09-14 福州领头虎软件有限公司 Human situation and action monitoring alarm system
CN105720347A (en) * 2016-03-16 2016-06-29 昆山联滔电子有限公司 Watch chain-type window antenna
JP6493264B2 (en) * 2016-03-23 2019-04-03 横河電機株式会社 Maintenance information sharing apparatus, maintenance information sharing method, maintenance information sharing program, and recording medium
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
NZ747005A (en) * 2016-04-08 2020-04-24 Magic Leap Inc Augmented reality systems and methods with variable focus lens elements
CN105975060A (en) * 2016-04-26 2016-09-28 乐视控股(北京)有限公司 Virtual reality terminal as well as control method and apparatus therefor
IL310060A (en) * 2016-05-09 2024-03-01 Magic Leap Inc Augmented reality systems and methods for user health analysis
WO2017197334A1 (en) * 2016-05-12 2017-11-16 Cirque Corporation Controller premonition using capacitive sensing
US10482668B2 (en) * 2016-06-02 2019-11-19 Thales Visionix, Inc. Miniature vision-inertial navigation system with extended dynamic range
KR101859909B1 (en) 2016-06-07 2018-05-21 에스아이에스 주식회사 System and Method for Forecasting and Tracking Red Tide Using Drone
JP6843530B2 (en) 2016-06-15 2021-03-17 任天堂株式会社 Game systems, methods, and game programs
EP4105921A1 (en) 2016-06-20 2022-12-21 Magic Leap, Inc. Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions
KR101817952B1 (en) * 2016-06-23 2018-01-12 주식회사 맥스트 See-through type head mounted display apparatus and method of controlling display depth thereof
CN109416573B (en) * 2016-07-12 2022-04-12 三菱电机株式会社 Equipment control system
CN106267552B (en) 2016-07-25 2020-03-10 京东方科技集团股份有限公司 Wearable device, virtual reality method and terminal system
WO2018020853A1 (en) 2016-07-29 2018-02-01 Necソリューションイノベータ株式会社 Mobile body control system, control signal transmission system, mobile body control method, program, and recording medium
US11467572B2 (en) 2016-07-29 2022-10-11 NEC Solution Innovators, Ltd. Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium
CH712799A1 (en) * 2016-08-10 2018-02-15 Derungs Louis Virtual reality method and system implementing such method.
CN114253400A (en) * 2016-08-22 2022-03-29 奇跃公司 Augmented reality display device with deep learning sensor
CN106239513A (en) * 2016-08-29 2016-12-21 合肥凌翔信息科技有限公司 A kind of remote controlled robot system
WO2018047433A1 (en) * 2016-09-08 2018-03-15 ソニー株式会社 Information processing device
AU2016424054B2 (en) 2016-09-24 2020-11-26 Huawei Technologies Co., Ltd. Method for managing application program use time offline, and terminal device
EP4333428A2 (en) * 2016-10-21 2024-03-06 Magic Leap, Inc. System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views
KR102650572B1 (en) * 2016-11-16 2024-03-26 삼성전자주식회사 Electronic apparatus and method for controlling thereof
EP3470976A1 (en) 2017-10-12 2019-04-17 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and apparatus for efficient delivery and usage of audio messages for high quality of experience
US20180144554A1 (en) 2016-11-18 2018-05-24 Eyedaptic, LLC Systems for augmented reality visual aids and tools
US10055028B2 (en) * 2016-12-05 2018-08-21 Google Llc End of session detection in an augmented and/or virtual reality environment
US9906290B2 (en) * 2016-12-06 2018-02-27 Mediatek Singapore Pte. Ltd. Method for network merging and configuration sharing and associated apparatus
KR20180065515A (en) 2016-12-08 2018-06-18 박순구 Multifunctional wearable display apparatus
CN108205416B (en) * 2016-12-20 2021-06-18 法法汽车(中国)有限公司 Method for activating terminal screen by using vehicle machine, vehicle machine and intelligent vehicle
JP6382928B2 (en) * 2016-12-27 2018-08-29 株式会社コロプラ Method executed by computer to control display of image in virtual space, program for causing computer to realize the method, and computer apparatus
JP6255470B1 (en) * 2016-12-27 2017-12-27 株式会社Qdレーザ Retina scanning optometry apparatus, retinal scanning optometry system, retinal scanning optometry method, retinal scanning eyewear providing system, retinal scanning eyewear providing method, and retinal scanning eyewear
WO2018122859A1 (en) * 2016-12-31 2018-07-05 Lumus Ltd. Eye tracker based on retinal imaging via light-guide optical element
WO2018139020A1 (en) * 2017-01-24 2018-08-02 ソニー株式会社 Hinge mechanism and head-mounted display comprising said hinge mechanism
US20180212314A1 (en) * 2017-01-24 2018-07-26 Intel Corporation Wearable device sar reduction and antenna improvement
US11566860B2 (en) 2017-01-27 2023-01-31 Armaments Research Company Inc. Weapon usage monitoring system with multi-echelon threat analysis
US11215416B2 (en) 2017-01-27 2022-01-04 Armaments Research Company, Inc. Weapon monitoring system with a map-based dashboard interface
US11125521B2 (en) * 2017-01-27 2021-09-21 Armaments Research Company, Inc. Weapon usage monitoring system for initiating notifications and commands based on dashboard actions
JP7158395B2 (en) 2017-02-23 2022-10-21 マジック リープ, インコーポレイテッド Variable focus imaging device based on polarization conversion
EP3376279B1 (en) * 2017-03-13 2022-08-31 Essilor International Optical device for a head-mounted display, and head-mounted device incorporating it for augmented reality
US11030980B2 (en) 2017-03-14 2021-06-08 Nec Corporation Information processing apparatus, information processing system, control method, and program
US10564533B2 (en) * 2017-03-21 2020-02-18 Magic Leap, Inc. Low-profile beam splitter
KR102579249B1 (en) 2017-03-21 2023-09-15 매직 립, 인코포레이티드 Methods, devices, and systems for illuminating spatial light modulators
JP2018170656A (en) * 2017-03-30 2018-11-01 ソニーセミコンダクタソリューションズ株式会社 Image capturing device, image capturing module, image capturing system, and control method of image capturing device
CN106897576B (en) * 2017-04-17 2023-10-31 安徽咏鹅家纺股份有限公司 Intelligent sleep monitoring and sleep-aiding cloud service system
KR20230108352A (en) 2017-05-01 2023-07-18 매직 립, 인코포레이티드 Matching content to a spatial 3d environment
CN108955396A (en) * 2017-05-17 2018-12-07 广东建元和安科技发展有限公司 A kind of hand-held anti-sniper active probe device
JP6947661B2 (en) * 2017-05-26 2021-10-13 株式会社コロプラ A program executed by a computer capable of communicating with the head mount device, an information processing device for executing the program, and a method executed by a computer capable of communicating with the head mount device.
JP6613267B2 (en) 2017-06-02 2019-11-27 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
JP6837921B2 (en) 2017-06-02 2021-03-03 任天堂株式会社 Game programs, information processing devices, information processing systems, and information processing methods
JP6653293B2 (en) 2017-06-05 2020-02-26 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
KR102482756B1 (en) * 2017-06-14 2022-12-30 삼성전자주식회사 Head-mounted display apparatus
KR102347128B1 (en) * 2017-06-29 2022-01-05 한국전자기술연구원 High visibility microdisplay device and HMD comprising the same
CA3068046C (en) * 2017-07-06 2022-12-13 Magic Leap, Inc. Speckle-reduction in virtual and augmented reality systems and methods
US20190012841A1 (en) 2017-07-09 2019-01-10 Eyedaptic, Inc. Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
KR102026526B1 (en) * 2017-08-03 2019-09-30 주식회사 에스지엠 Authentication system using bio-information and screen golf system using the same
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
KR102485447B1 (en) 2017-08-09 2023-01-05 삼성전자주식회사 Optical window system and see-through type display apparatus including the same
US10585286B2 (en) 2017-08-15 2020-03-10 Samsung Electronics Co., Ltd. System and method for displaying real or virtual scene
WO2019040736A1 (en) 2017-08-24 2019-02-28 Vuzix Corporation Swim ar goggles
US11211030B2 (en) 2017-08-29 2021-12-28 Apple Inc. Electronic device with adaptive display
JP6987737B2 (en) * 2017-09-13 2022-01-05 株式会社コロプラ A method performed on a computer to provide content in a means of transportation, a program that causes the computer to execute the method, a content providing device, and a content providing system.
JP6458106B1 (en) * 2017-09-13 2019-01-23 株式会社コロプラ Method executed by computer to provide content in moving means, program for causing computer to execute the method, content providing apparatus, and content providing system
WO2019059044A1 (en) 2017-09-20 2019-03-28 日本電気株式会社 Information processing device, control method, and program
EP3695270A4 (en) 2017-10-11 2021-06-23 Magic Leap, Inc. Augmented reality display comprising eyepiece having a transparent emissive display
FR3072468B1 (en) * 2017-10-13 2020-02-14 Alessandro Manneschi DEVICE AND METHOD FOR DETECTING UNAUTHORIZED OBJECTS OR MATERIALS CARRIED BY AN INDIVIDUAL IN A PROTECTED ACCESS AREA
US10984508B2 (en) 2017-10-31 2021-04-20 Eyedaptic, Inc. Demonstration devices and methods for enhancement for low vision users and systems improvements
KR102063780B1 (en) * 2017-11-21 2020-01-08 고려대학교산학협력단 Virtual reality device for preventing myopic progression
IL255955B (en) 2017-11-27 2019-06-30 Elbit Systems Ltd System and method for providing synthetic infromation on a see-through device
KR102028997B1 (en) * 2017-11-29 2019-10-07 엘지디스플레이 주식회사 Head mount display device
CN109918975B (en) * 2017-12-13 2022-10-21 腾讯科技(深圳)有限公司 Augmented reality processing method, object identification method and terminal
EP3729243A4 (en) 2017-12-19 2021-09-15 Datalogic IP Tech S.r.l. User-wearable systems and methods to collect data and provide information
CN111512211B (en) 2017-12-20 2022-06-21 伊奎蒂公司 Augmented reality display system
US10360454B1 (en) * 2017-12-28 2019-07-23 Rovi Guides, Inc. Systems and methods for presenting supplemental content in augmented reality
KR20190085368A (en) 2018-01-10 2019-07-18 삼성전자주식회사 Folding-type wearable electronic device with optical transfering member transfer to transparent member from projector
CN111684323B (en) * 2018-01-26 2022-09-13 林巴克4Pi有限公司 Compact optics for cross-configuration of virtual reality and mixed reality
WO2019156839A1 (en) * 2018-02-09 2019-08-15 Vuzix Corporation Image light guide with circular polarizer
WO2019160698A2 (en) 2018-02-16 2019-08-22 Valve Corporation Using detected pupil location to align optical components of a head-mounted display
US10735649B2 (en) 2018-02-22 2020-08-04 Magic Leap, Inc. Virtual and augmented reality systems and methods using display system control information embedded in image data
KR20200121357A (en) * 2018-02-22 2020-10-23 매직 립, 인코포레이티드 Object creation using physical manipulation
KR102546994B1 (en) * 2018-02-26 2023-06-22 엘지전자 주식회사 Wearable glass device
US10695667B2 (en) * 2018-03-14 2020-06-30 Sony Interactive Entertainment LLC Pro gaming AR visor and method for parsing context specific HUD content from a video stream
EP3777184A4 (en) 2018-03-28 2021-12-15 Nokia Technologies Oy A method, an apparatus and a computer program product for virtual reality
AT521130A1 (en) * 2018-04-04 2019-10-15 Peterseil Thomas Method for displaying a virtual object
JP6368881B1 (en) * 2018-04-09 2018-08-01 チームラボ株式会社 Display control system, terminal device, computer program, and display control method
KR102063395B1 (en) 2018-04-10 2020-01-07 (주)세이프인 Virtual fire training simulator
CN112601509B (en) * 2018-05-29 2024-01-23 爱达扩视眼镜公司 Hybrid perspective augmented reality system and method for low vision users
US11353951B2 (en) 2018-06-08 2022-06-07 Hewlett-Packard Development Company, L.P. Computing input devices with sensors concealed in articles of clothing
FR3081639B1 (en) * 2018-06-11 2020-07-31 Orange OPTICAL DATA TRANSMISSION METHOD AND SYSTEM FOR VIRTUAL OR AUGMENTED REALITY APPLICATIONS
JP7175664B2 (en) * 2018-07-11 2022-11-21 克行 廣中 Voice conversation radio with light emitting function
TWI797142B (en) * 2018-07-12 2023-04-01 揚明光學股份有限公司 Optical device and fabrication method thereof
US10834986B2 (en) * 2018-07-12 2020-11-17 Sarah Nicole Ciccaglione Smart safety helmet with heads-up display
DE102018121258A1 (en) 2018-08-30 2020-03-05 Vr Coaster Gmbh & Co. Kg Head-mounted display and amusement facility with such a head-mounted display
US11174022B2 (en) * 2018-09-17 2021-11-16 International Business Machines Corporation Smart device for personalized temperature control
CN112969436B (en) 2018-09-24 2024-02-09 爱达扩视眼镜公司 Autonomous enhanced hands-free control in electronic vision assistance devices
AU2019358194A1 (en) 2018-10-12 2021-05-20 Armaments Research Company Inc. Firearm monitoring and remote support system
KR101942770B1 (en) 2018-11-29 2019-01-28 네이버시스템(주) Image processing system to synthesis revising photo image with location information
EP3663904A1 (en) * 2018-12-07 2020-06-10 Iristick nv Portable mobile mode for headset
CN111310530B (en) * 2018-12-12 2023-06-30 百度在线网络技术(北京)有限公司 Sign language and voice conversion method and device, storage medium and terminal equipment
KR102185519B1 (en) * 2019-02-13 2020-12-02 주식회사 싸이큐어 Method of garbling real-world image for direct encoding type see-through head mount display and direct encoding type see-through head mount display with real-world image garbling function
KR20200099047A (en) * 2019-02-13 2020-08-21 주식회사 싸이큐어 Method of garbling real-world image for see-through head mount display and see-through head mount display with real-world image garbling function
EP3951752A4 (en) * 2019-03-27 2022-05-18 Panasonic Intellectual Property Management Co., Ltd. Head-mounted display
JP6641055B2 (en) * 2019-05-29 2020-02-05 株式会社東芝 Wearable terminal, system and display method
TW202109134A (en) * 2019-06-04 2021-03-01 以色列商魯姆斯有限公司 Binocular type head mounted display system with adjustable interpupillary distance mechanism
US20220121283A1 (en) * 2019-06-12 2022-04-21 Hewlett-Packard Development Company, L.P. Finger clip biometric virtual reality controllers
CN110276578A (en) * 2019-06-14 2019-09-24 武汉合创源科技有限公司 A kind of merchandise warehouse safety monitoring system and its method
CN113366373B (en) * 2019-08-06 2024-03-19 松下知识产权经营株式会社 Display device
US20220335108A1 (en) * 2019-09-05 2022-10-20 Open Lens Project Ltd. System and method for management of digital media content
US11617504B2 (en) 2019-09-18 2023-04-04 Verily Life Sciences Llc Retinal camera with dynamic illuminator for expanding eyebox
KR102401854B1 (en) * 2019-11-29 2022-06-08 주식회사 카이비전 Augmented reality glass for multi-function
JP7170277B2 (en) * 2019-12-09 2022-11-14 株式会社辰巳菱機 Reporting device
US20210200845A1 (en) * 2019-12-31 2021-07-01 Atlassian Pty Ltd. Illumination-based user authentication
US11157086B2 (en) * 2020-01-28 2021-10-26 Pison Technology, Inc. Determining a geographical location based on human gestures
US11598967B2 (en) 2020-03-27 2023-03-07 ResMed Pty Ltd Positioning and stabilising structure and system incorporating same
US11686948B2 (en) 2020-03-27 2023-06-27 ResMed Pty Ltd Positioning, stabilising, and interfacing structures and system incorporating same
KR20220166378A (en) * 2020-03-27 2022-12-16 레스메드 피티와이 엘티디 Positioning, stabilising, and interfacing structures and system incorporating same
JP2021163287A (en) * 2020-03-31 2021-10-11 エイベックス・テクノロジーズ株式会社 Augmenting reality system
CN114895464A (en) * 2020-03-31 2022-08-12 优奈柯恩(北京)科技有限公司 Display device
US11915276B2 (en) * 2020-04-28 2024-02-27 Cisco Technology, Inc. System, method, and computer readable storage media for millimeter wave radar detection of physical actions coupled with an access point off-load control center
US20230194881A1 (en) * 2020-05-29 2023-06-22 Vrmedia S.R.L. System for augmented reality
KR102498191B1 (en) * 2020-06-02 2023-02-10 주식회사 피앤씨솔루션 Optical system for augmented reality with a reflective surface and a head mounted display apparatus using thereof
US11513360B2 (en) 2020-07-17 2022-11-29 Toyota Research Institute, Inc. Enhanced contrast augmented reality (AR) tags for visual fiducial system
JP7442140B2 (en) * 2020-09-10 2024-03-04 公益財団法人鉄道総合技術研究所 Computer system and control method
CN112565720A (en) * 2020-09-17 2021-03-26 苏州恒创文化传播有限公司 3D projection system based on holographic technology
WO2022072261A1 (en) * 2020-09-30 2022-04-07 Snap Inc. Low power camera pipeline for computer vision mode in augmented reality eyewear
JP2022082490A (en) * 2020-11-22 2022-06-02 斉 永岡 Projection function, smart glass with display, and output terminal
IT202000028787A1 (en) * 2020-12-01 2022-06-01 Virtual Job SYSTEM AND METHOD OF USING COMPUTER SOFTWARE AND HARDWARE COMPONENTS FOR LEARNING SAFETY PROCEDURES IN THE WORKPLACE CHARACTERIZED BY THE USE OF VIRTUAL REALITY
DE102020215285A1 (en) 2020-12-03 2022-06-09 Robert Bosch Gesellschaft mit beschränkter Haftung Method for parallax control and binocular data glasses with a computing unit for carrying out the method
JP2024502255A (en) * 2020-12-21 2024-01-18 ディジレンズ インコーポレイテッド Eye glow suppression in waveguide-based displays
JP2022113031A (en) * 2021-01-22 2022-08-03 ソフトバンク株式会社 Control device, program, system, and control method
WO2022212072A1 (en) * 2021-03-31 2022-10-06 Snap Inc. Eyewear projector brightness control
WO2022207145A1 (en) * 2021-03-31 2022-10-06 Arm Limited Systems, devices, and/or processes for dynamic surface marking
US11892624B2 (en) * 2021-04-27 2024-02-06 Microsoft Technology Licensing, Llc Indicating an off-screen target
CN113112183B (en) * 2021-05-06 2024-03-19 国家市场监督管理总局信息中心 Method, system and readable storage medium for risk assessment of entry and exit dangerous goods
KR20220151420A (en) * 2021-05-06 2022-11-15 삼성전자주식회사 Wearable electronic device and method for outputting 3d image
KR102337907B1 (en) * 2021-05-20 2021-12-09 주식회사 아진엑스텍 Augmented reality smart glass device
GB2608186A (en) 2021-06-25 2022-12-28 Thermoteknix Systems Ltd Augmented reality system
WO2023277840A1 (en) * 2021-06-28 2023-01-05 Kaitek Yazilim Elektronik Bilgisayar Sanayi Ve Ticaret Limited Şirketi System that performs mass production process analysis with mixed reality glasses with eye tracking and accelerometer
KR102321470B1 (en) * 2021-07-28 2021-11-03 주식회사 셀리코 Electrochromic layer based vision aid device and vision aid glasses comprising thereof
HUP2100311A1 (en) * 2021-08-31 2023-03-28 Pazmany Peter Katolikus Egyetem System and procedure based on augmented reality
US20230077780A1 (en) * 2021-09-16 2023-03-16 International Business Machines Corporation Audio command corroboration and approval
CN114115453B (en) * 2021-10-21 2024-02-09 维沃移动通信有限公司 Electronic equipment
WO2023076841A1 (en) * 2021-10-25 2023-05-04 Atieva, Inc. Contextual vehicle control with visual representation
CN114030355A (en) * 2021-11-15 2022-02-11 智己汽车科技有限公司 Vehicle control method and device, vehicle and medium
KR102631231B1 (en) * 2021-11-17 2024-01-31 주식회사 피앤씨솔루션 Ar glasses apparatus with protective cover and protective cover for ar glass apparatus
GB202116754D0 (en) * 2021-11-19 2022-01-05 Sensivision Ltd Handheld guidance device for the visually-impaired
WO2023107251A1 (en) * 2021-12-06 2023-06-15 Lumileds Llc Optical filters compensating for changes in performance of next generation leds compared to legacy devices
US11914093B2 (en) * 2021-12-07 2024-02-27 Microsoft Technology Licensing, Llc RF antenna scanning for human movement classification
US20230217007A1 (en) * 2021-12-30 2023-07-06 Ilteris Canberk Hyper-connected and synchronized ar glasses
US20230236417A1 (en) * 2022-01-24 2023-07-27 Microsoft Technology Licensing, Llc Illuminating spatial light modulator with led array
WO2023149963A1 (en) 2022-02-01 2023-08-10 Landscan Llc Systems and methods for multispectral landscape mapping
WO2023175919A1 (en) * 2022-03-18 2023-09-21 日本電気株式会社 Firefighting effort assistance device, firefighting effort assistance method, and recording medium in which firefighting effort assistance program is stored
WO2023179869A1 (en) * 2022-03-25 2023-09-28 Huawei Technologies Co., Ltd. Electronic communication device with image projection
US20230306499A1 (en) * 2022-03-28 2023-09-28 Google Llc Vision-powered auto-scroll for lists
CN116389674A (en) * 2023-03-28 2023-07-04 射阳港海会议服务有限公司 Remote conference video device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101243392A (en) * 2005-08-15 2008-08-13 皇家飞利浦电子股份有限公司 System, apparatus, and method for augmented reality glasses for end-user programming
JP2009222774A (en) * 2008-03-13 2009-10-01 Fujifilm Corp Digital content reproduction device and reproduction control method for digital content
WO2011106798A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09139927A (en) * 1995-11-15 1997-05-27 Matsushita Electric Ind Co Ltd Multi-spot image transmitter
JP3921915B2 (en) * 2000-03-22 2007-05-30 松下電器産業株式会社 Display device
CN1922651A (en) * 2004-06-10 2007-02-28 松下电器产业株式会社 Wearable type information presentation device
JP4635572B2 (en) * 2004-11-09 2011-02-23 コニカミノルタホールディングス株式会社 Video display device
JP2008176681A (en) * 2007-01-22 2008-07-31 Fujifilm Corp Glasses-type communication support device
JP5309448B2 (en) * 2007-01-26 2013-10-09 ソニー株式会社 Display device and display method
KR101576567B1 (en) * 2009-12-04 2015-12-10 한국전자통신연구원 Gesture input apparatus and gesture recognition method and apparatus using the same
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
JP6211144B1 (en) * 2016-07-04 2017-10-11 株式会社コロプラ Display control method and program for causing a computer to execute the display control method

Cited By (230)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI687721B (en) * 2010-11-08 2020-03-11 盧森堡商喜瑞爾工業公司 Display device
CN105319714A (en) * 2014-07-31 2016-02-10 精工爱普生株式会社 Display apparatus, method for controlling display apparatus, and program
CN111897425B (en) * 2014-07-31 2024-03-12 三星电子株式会社 Wearable glasses and method for providing information using the same
CN112213856A (en) * 2014-07-31 2021-01-12 三星电子株式会社 Wearable glasses and method of displaying image via wearable glasses
CN111897425A (en) * 2014-07-31 2020-11-06 三星电子株式会社 Wearable glasses and method for providing information using the same
CN105319714B (en) * 2014-07-31 2019-09-06 精工爱普生株式会社 Display device, control method of display device, and computer storage medium
CN106575358A (en) * 2014-08-05 2017-04-19 康蒂-特米克微电子有限公司 Driver assistance system
CN106575358B (en) * 2014-08-05 2020-08-04 康蒂-特米克微电子有限公司 Driver assistance system
CN107302845B (en) * 2014-09-22 2020-11-10 爱父爱斯吉尔有限公司 Low delay simulation apparatus, method and computer readable medium using direction prediction
CN107302845A (en) * 2014-09-22 2017-10-27 爱父爱斯吉尔有限公司 Low-delay simulation apparatus and method using direction prediction, and computer program for the method
CN107003824A (en) * 2014-10-30 2017-08-01 语音处理解决方案有限公司 Control device for dictating machine
CN106200892A (en) * 2014-10-30 2016-12-07 联发科技股份有限公司 Virtual reality system, mobile device, wearable device, and method for processing incoming events
US10108256B2 (en) 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
CN107113071A (en) * 2014-11-11 2017-08-29 索尼公司 The dynamic subscriber of media experience for enabling BAN recommends
CN104394317A (en) * 2014-11-20 2015-03-04 段然 Method for processing recorded images of head-wearing recording equipment
CN104702911A (en) * 2014-11-24 2015-06-10 段然 Wearable video device real-time wireless transmission method
WO2016086439A1 (en) * 2014-12-04 2016-06-09 上海交通大学 Auto-aligning light-transmitting head-worn display device
CN105740743A (en) * 2014-12-30 2016-07-06 手持产品公司 Augmented reality vision barcode scanning system and method
CN105740743B (en) * 2014-12-30 2020-07-21 手持产品公司 Augmented reality visual barcode scanning system and method
CN107210823A (en) * 2015-02-03 2017-09-26 索尼公司 Methods, devices and systems for collecting writing pattern using BAN
CN104576709A (en) * 2015-02-03 2015-04-29 京东方科技集团股份有限公司 OLED (organic light-emitting diode) display substrate, method for manufacturing same and wearable equipment
CN107710009B (en) * 2015-02-27 2021-06-29 威尔乌集团 Controller visualization in virtual and augmented reality environments
CN107710009A (en) * 2015-02-27 2018-02-16 威尔乌集团 Controller visualization in virtual and augmented reality environment
CN105938391A (en) * 2015-03-06 2016-09-14 松下电器(美国)知识产权公司 Wearable terminal and method for controlling the same
CN104657103A (en) * 2015-03-16 2015-05-27 哈尔滨工业大学 Handheld CAVE projection system based on depth camera
CN104657103B (en) * 2015-03-16 2017-06-16 哈尔滨工业大学 Handheld CAVE projection system based on depth camera
CN107430479A (en) * 2015-03-31 2017-12-01 索尼公司 Information processor, information processing method and program
CN104731338B (en) * 2015-03-31 2017-11-14 深圳市虚拟现实科技有限公司 Closed type augmented and virtual reality system and method
CN104731338A (en) * 2015-03-31 2015-06-24 深圳市虚拟现实科技有限公司 Closed type augmented and virtual reality system and method
CN107660283A (en) * 2015-04-03 2018-02-02 甲骨文国际公司 For realizing the method and system of daily record resolver in Log Analysis System
CN104765456A (en) * 2015-04-08 2015-07-08 成都爱瑞斯文化传播有限责任公司 Virtual space system and building method thereof
CN112882233B (en) * 2015-05-19 2023-08-01 奇跃公司 Double composite light field device
CN112882233A (en) * 2015-05-19 2021-06-01 奇跃公司 Double composite light field device
CN107924522A (en) * 2015-06-24 2018-04-17 奇跃公司 Augmented reality equipment, system and method for purchase
CN107810646A (en) * 2015-06-24 2018-03-16 微软技术许可有限责任公司 Filtering sound for conference applications
CN107924522B (en) * 2015-06-24 2022-06-03 奇跃公司 Augmented reality device, system and method for purchasing
CN107810646B (en) * 2015-06-24 2020-04-03 微软技术许可有限责任公司 Filtered sound for conferencing applications
CN106326813B (en) * 2015-06-30 2023-04-07 深圳指芯智能科技有限公司 Intelligent variable-frequency 3D fingerprint sensor
CN106326813A (en) * 2015-06-30 2017-01-11 深圳指芯智能科技有限公司 Intelligent frequency conversion 3D fingerprint sensor
CN105070204A (en) * 2015-07-24 2015-11-18 江苏天晟永创电子科技有限公司 Miniature AMOLED optical display
CN105022980A (en) * 2015-07-28 2015-11-04 福建新大陆电脑股份有限公司 Barcode image identifying and reading device
CN105091948A (en) * 2015-09-02 2015-11-25 徐艺斌 Multifunctional sensor module for myopia prevention frame
CN106507128A (en) * 2015-09-08 2017-03-15 科理特株式会社 Virtual reality image transmission method, playback method, and program using the same
CN105259655A (en) * 2015-09-10 2016-01-20 上海理鑫光学科技有限公司 3D video system improving realism of virtual-real superposition
CN105117111A (en) * 2015-09-23 2015-12-02 小米科技有限责任公司 Rendering method and device for virtual reality interaction frames
US11262762B2 (en) 2015-09-25 2022-03-01 Apple Inc. Non-solid object monitoring
US11693414B2 (en) 2015-09-25 2023-07-04 Apple Inc. Non-solid object monitoring
CN107923757A (en) * 2015-09-25 2018-04-17 苹果公司 Non-solid object monitoring
CN108028038A (en) * 2015-10-05 2018-05-11 三美电机株式会社 Display device
CN108028038B (en) * 2015-10-05 2021-03-19 三美电机株式会社 Display device
CN113759555A (en) * 2015-10-05 2021-12-07 迪吉伦斯公司 Waveguide display
CN108136258B (en) * 2015-10-28 2020-11-24 微软技术许可有限责任公司 Method and system for adjusting image frame based on tracking eye movement and head-mounted device
CN108136258A (en) * 2015-10-28 2018-06-08 微软技术许可有限责任公司 Picture frame is adjusted based on tracking eye motion
WO2017092396A1 (en) * 2015-12-01 2017-06-08 深圳市掌网科技股份有限公司 Virtual reality interaction system and method
CN106814844A (en) * 2015-12-01 2017-06-09 深圳市掌网科技股份有限公司 Virtual reality interaction system and method
CN105976424A (en) * 2015-12-04 2016-09-28 乐视致新电子科技(天津)有限公司 Image rendering processing method and device
CN105487229A (en) * 2015-12-18 2016-04-13 济南中景电子科技有限公司 Multichannel interaction virtual reality glasses
CN105608436A (en) * 2015-12-23 2016-05-25 联想(北京)有限公司 Power consumption control method and electronic device
CN115242304A (en) * 2015-12-30 2022-10-25 艾伦神火公司 Optical narrowcast
CN105455285A (en) * 2015-12-31 2016-04-06 北京小鸟看看科技有限公司 Virtual reality helmet adaptation method
CN108885339A (en) * 2015-12-31 2018-11-23 汤姆逊许可公司 For using the configuration of adaptive focal plane rendering virtual reality
CN105455285B (en) * 2015-12-31 2019-02-12 北京小鸟看看科技有限公司 Virtual reality helmet adaptation method
US10067349B2 (en) 2015-12-31 2018-09-04 Beijing Pico Technology Co., Ltd. Method of adapting a virtual reality helmet
CN105718167A (en) * 2016-01-21 2016-06-29 陈佩珊 Icon migration method and system based on smart glasses temple touch
CN105739851A (en) * 2016-01-21 2016-07-06 陈佩珊 Icon migration method and system based on voice recognition of smart glasses
CN105704501A (en) * 2016-02-06 2016-06-22 普宙飞行器科技(深圳)有限公司 Virtual reality live broadcast system based on unmanned aerial vehicle panoramic video
US11550399B2 (en) 2016-03-29 2023-01-10 Microsoft Technology Licensing, Llc Sharing across environments
CN108885521B (en) * 2016-03-29 2021-12-07 微软技术许可有限责任公司 Cross-environment sharing
CN108885521A (en) * 2016-03-29 2018-11-23 微软技术许可有限责任公司 Cross-environment sharing
US10891804B2 (en) 2016-04-19 2021-01-12 Adobe Inc. Image compensation for an occluding direct-view augmented reality system
CN107306332A (en) * 2016-04-19 2017-10-31 奥多比公司 Image compensation for an occluding direct-view augmented reality system
US11514657B2 (en) 2016-04-19 2022-11-29 Adobe Inc. Replica graphic causing reduced visibility of an image artifact in a direct-view of a real-world scene
CN109074212A (en) * 2016-04-26 2018-12-21 索尼公司 Information processing unit, information processing method and program
CN107402378A (en) * 2016-05-19 2017-11-28 财团法人金属工业研究发展中心 Frequency-modulated (FM) radar transceiver
CN106094203A (en) * 2016-06-16 2016-11-09 捷开通讯(深圳)有限公司 VR system, wearable device for controlling VR device, and method thereof
WO2017215223A1 (en) * 2016-06-16 2017-12-21 捷开通讯(深圳)有限公司 Vr system, wearable device for controlling vr device and method thereof
US10664011B2 (en) 2016-06-16 2020-05-26 JRD Communication (Shenzhen) Ltd. Wearable apparatus and method for controlling VR apparatus
CN107544661A (en) * 2016-06-24 2018-01-05 联想(北京)有限公司 Information processing method and electronic device
CN106200972A (en) * 2016-07-14 2016-12-07 乐视控股(北京)有限公司 Method and device for adjusting virtual reality scene parameters
CN109791391A (en) * 2016-07-24 2019-05-21 光场实验室公司 Calibration method for holographic energy guidance system
CN109791391B (en) * 2016-07-24 2021-02-02 光场实验室公司 Calibration method for holographic energy-guided systems
CN106162206A (en) * 2016-08-03 2016-11-23 北京疯景科技有限公司 Panoramic recording and playback method and device
CN109661594B (en) * 2016-08-22 2022-12-06 苹果公司 Intermediate range optical system for remote sensing receiver
CN109661594A (en) * 2016-08-22 2019-04-19 苹果公司 Intermediate range optical system for remote sensing receiver
CN106408303A (en) * 2016-09-21 2017-02-15 上海星寰投资有限公司 Payment method and system
CN106251153A (en) * 2016-09-21 2016-12-21 上海星寰投资有限公司 Payment method and system
CN106203410B (en) * 2016-09-21 2023-10-17 上海星寰投资有限公司 Identity verification method and system
CN106203410A (en) * 2016-09-21 2016-12-07 上海星寰投资有限公司 Identity verification method and system
CN107885311A (en) * 2016-09-29 2018-04-06 深圳纬目信息技术有限公司 Visual interaction confirmation method, system, and device
CN106507121A (en) * 2016-10-31 2017-03-15 易瓦特科技股份公司 Live broadcast control method, VR device, and unmanned aerial vehicle
CN106651355A (en) * 2016-11-08 2017-05-10 北京小米移动软件有限公司 Payment method and device, and virtual reality helmet
CN108089324B (en) * 2016-11-22 2022-01-18 霍尼韦尔国际公司 NTE display system and method with optical tracker
CN108089324A (en) * 2016-11-22 2018-05-29 霍尼韦尔国际公司 NTE display systems and method with optical tracker
CN106603107B (en) * 2016-12-21 2019-10-29 Tcl移动通信科技(宁波)有限公司 Head-mounted device and control method thereof
CN106603107A (en) * 2016-12-21 2017-04-26 惠州Tcl移动通信有限公司 Head-mounted device and control method thereof
US11057574B2 (en) 2016-12-28 2021-07-06 Microsoft Technology Licensing, Llc Systems, methods, and computer-readable media for using a video capture device to alleviate motion sickness via an augmented display for a passenger
CN108255295A (en) * 2016-12-28 2018-07-06 意美森公司 It is generated for the haptic effect of spatial dependence content
CN110036635B (en) * 2016-12-28 2021-01-01 微软技术许可有限责任公司 Systems, methods, and computer-readable media for mitigating motion sickness via enhanced display for passengers using a video capture device
CN106790579A (en) * 2016-12-28 2017-05-31 苏州商信宝信息科技有限公司 Active information transfer method and system based on smart glasses
CN110036635A (en) * 2016-12-28 2019-07-19 微软技术许可有限责任公司 Alleviate the system, method and computer-readable medium of motion sickness via the display of the enhancing for passenger for using video capture device
CN108255295B (en) * 2016-12-28 2022-06-03 意美森公司 Haptic effect generation for spatially dependent content
US10088911B2 (en) 2016-12-30 2018-10-02 Manuel Saez Programmable electronic helmet
CN106681955A (en) * 2017-01-04 2017-05-17 四川埃姆克伺服科技有限公司 Universal interface circuit for receiving signal from servo motor position sensor
CN106681955B (en) * 2017-01-04 2023-05-09 四川埃姆克伺服科技有限公司 Universal interface circuit for receiving signals from a servo motor position sensor
CN106864362A (en) * 2017-01-18 2017-06-20 陈宗坤 Warning sign with air purification function
CN106846383B (en) * 2017-01-23 2020-04-17 宁波诺丁汉大学 High dynamic range image imaging method based on 3D digital microscopic imaging system
CN106846383A (en) * 2017-01-23 2017-06-13 宁波诺丁汉大学 High dynamic range image imaging method based on 3D digital microscopic imaging system
CN108433724A (en) * 2017-02-16 2018-08-24 三星电子株式会社 The method and wearable electronic of service are provided based on biometric information
CN106597673B (en) * 2017-02-28 2020-04-03 京东方科技集团股份有限公司 Virtual reality display device and driving method and driving module thereof
CN106597673A (en) * 2017-02-28 2017-04-26 京东方科技集团股份有限公司 Virtual reality display apparatus, and driving method and driving module thereof
CN106932906A (en) * 2017-03-04 2017-07-07 国家电网公司 Mixed reality display device
CN106934361A (en) * 2017-03-06 2017-07-07 苏州佳世达光电有限公司 Identification method and electronic device
CN110352370B (en) * 2017-03-07 2021-04-20 苹果公司 Head-mounted display system
CN110352370A (en) * 2017-03-07 2019-10-18 苹果公司 Head-mounted display system
US11822078B2 (en) 2017-03-07 2023-11-21 Apple Inc. Head-mounted display system
CN106842576A (en) * 2017-03-23 2017-06-13 核桃智能科技(常州)有限公司 Head-worn intelligent display device with mobile communication function
CN107015655A (en) * 2017-04-11 2017-08-04 苏州和云观博数字科技有限公司 Museum virtual scene AR experience eyeglass device and implementation method thereof
CN106871973A (en) * 2017-04-21 2017-06-20 佛山市川东磁电股份有限公司 Temperature and humidity sensor
CN110869901A (en) * 2017-05-08 2020-03-06 Lg电子株式会社 User interface device for vehicle and vehicle
CN110869901B (en) * 2017-05-08 2024-01-09 Lg电子株式会社 User interface device for vehicle and vehicle
CN108762490B (en) * 2017-05-09 2021-06-22 苏州乐轩科技有限公司 Device for mixed reality
CN108762490A (en) * 2017-05-09 2018-11-06 苏州乐轩科技有限公司 Device for mixed reality
CN107071285A (en) * 2017-05-16 2017-08-18 广东交通职业技术学院 Follow-shooting method and device, memory, and unmanned aerial vehicle
CN108939316A (en) * 2017-05-17 2018-12-07 维申Rt有限公司 Patient monitoring system
CN108958461A (en) * 2017-05-24 2018-12-07 宏碁股份有限公司 Virtual reality system with adaptive control and control method thereof
CN107193381A (en) * 2017-05-31 2017-09-22 湖南工业大学 Smart glasses based on eye-tracking sensing technology and display method thereof
CN107679380A (en) * 2017-06-22 2018-02-09 国网浙江平湖市供电公司 Intelligent inspection device and method based on identity recognition
CN107679380B (en) * 2017-06-22 2020-08-11 国网浙江平湖市供电公司 Intelligent inspection device and method based on identity recognition
US10891919B2 (en) 2017-06-26 2021-01-12 Boe Technology Group Co., Ltd. Display system and image display method
CN109215132A (en) * 2017-06-30 2019-01-15 华为技术有限公司 Implementation method and device for augmented reality service
CN107422480A (en) * 2017-08-03 2017-12-01 深圳市汇龙天成科技有限公司 Semi-transparent semi-reflective toroidal lens display structure and display method
CN109425989A (en) * 2017-08-21 2019-03-05 精工爱普生株式会社 Deflection device, display device, and method for manufacturing deflection device
CN109425989B (en) * 2017-08-21 2022-03-22 精工爱普生株式会社 Deflection device, display device, and method for manufacturing deflection device
CN107609492A (en) * 2017-08-25 2018-01-19 西安电子科技大学 Perceptual evaluation method of distorted image quality based on EEG signals
CN107609492B (en) * 2017-08-25 2019-06-21 西安电子科技大学 Perceptual evaluation method of distorted image quality based on EEG signals
WO2019061825A1 (en) * 2017-09-29 2019-04-04 歌尔股份有限公司 Vr/ar head-mounted device
TWI660630B (en) * 2017-12-06 2019-05-21 瑞昱半導體股份有限公司 Method and system for detecting video scan type
US10748511B2 (en) 2017-12-06 2020-08-18 Realtek Semiconductor Corp. Method and system for detecting video scan type
CN108169901A (en) * 2017-12-27 2018-06-15 北京传嘉科技有限公司 VR glasses
CN111433657A (en) * 2017-12-28 2020-07-17 深圳市柔宇科技有限公司 Diopter adjusting device and electronic equipment
CN108122248B (en) * 2018-01-15 2020-04-24 武汉大学 Dam natural vibration frequency identification method based on video measurement
CN108122248A (en) * 2018-01-15 2018-06-05 武汉大学 Dam natural vibration frequency identification method based on video measurement
CN108459812A (en) * 2018-01-22 2018-08-28 郑州升达经贸管理学院 Art trajectory display and tracking system and method
CN111656334A (en) * 2018-01-29 2020-09-11 美光科技公司 Memory controller with programmable atomic operation
CN108798360A (en) * 2018-02-01 2018-11-13 李绍辉 Rapid smoke dispersion method based on communication technology
CN108427830A (en) * 2018-02-09 2018-08-21 中建五局第三建设有限公司 Method and device for construction object spatial setting-out guided by mixed reality technology
CN108479056A (en) * 2018-03-05 2018-09-04 成都看客网络技术有限公司 Online doll grabbing machine for blind people
CN108479056B (en) * 2018-03-05 2021-12-31 江苏嘉尚环保科技有限公司 Online doll grabbing machine for blind people
US11631380B2 (en) 2018-03-14 2023-04-18 Sony Corporation Information processing apparatus, information processing method, and recording medium
US20190293746A1 (en) * 2018-03-26 2019-09-26 Electronics And Telecomunications Research Institute Electronic device for estimating position of sound source
CN108337573A (en) * 2018-03-26 2018-07-27 京东方科技集团股份有限公司 Implementation method and medium for real-time race commentary
CN108398791A (en) * 2018-03-29 2018-08-14 陈超平 Near-eye display device based on polarized contact lenses
US10757247B2 (en) 2018-06-11 2020-08-25 Lenovo (Beijing) Co., Ltd. Switching method, apparatus and electronic device thereof
CN108803877A (en) * 2018-06-11 2018-11-13 联想(北京)有限公司 Switching method, device and electronic equipment
CN108982062A (en) * 2018-06-14 2018-12-11 上海卫星工程研究所 Visual field alignment method for linear array imaging optical load in satellite stray light test
CN108982062B (en) * 2018-06-14 2020-04-21 上海卫星工程研究所 Visual field alignment method for linear array imaging optical load in satellite stray light test
CN108983636A (en) * 2018-06-20 2018-12-11 浙江大学 Human-machine intelligence symbiosis platform system
CN110794644B (en) * 2018-08-03 2023-02-24 扬明光学股份有限公司 Optical device and method for manufacturing the same
CN110794644A (en) * 2018-08-03 2020-02-14 扬明光学股份有限公司 Optical device and method for manufacturing the same
CN112601993A (en) * 2018-08-26 2021-04-02 鲁姆斯有限公司 Reflection suppression in near-eye displays
CN112868023A (en) * 2018-10-15 2021-05-28 艾玛迪斯简易股份公司 Augmented reality system and method
CN111144921B (en) * 2018-11-06 2024-03-08 丰田自动车株式会社 Information processing device, information processing method, and non-transitory storage medium
CN111144921A (en) * 2018-11-06 2020-05-12 丰田自动车株式会社 Information processing apparatus, information processing method, and non-transitory storage medium
CN109559541B (en) * 2018-11-20 2021-06-22 华东交通大学 Unmanned vehicle route management system
CN109559541A (en) * 2018-11-20 2019-04-02 华东交通大学 Unmanned vehicle route management system
TWI687953B (en) * 2018-12-05 2020-03-11 宏碁股份有限公司 Key structure and mode switching method thereof
US10714281B2 (en) 2018-12-05 2020-07-14 Acer Incorporated Key structure convertible between digital and analog switch modes and switching method thereof
CN111307464A (en) * 2018-12-11 2020-06-19 劳斯莱斯有限公司 Inspection system
US20210312842A1 (en) * 2018-12-20 2021-10-07 Ns West Inc. Display light emission device, head-up display device, image display system, and helmet
US11651714B2 (en) * 2018-12-20 2023-05-16 Ns West Inc. Display light emission device, head-up display device, image display system, and helmet
CN109407325A (en) * 2018-12-21 2019-03-01 周桂兵 Multipurpose VR smart glasses and display method thereof
CN109808711A (en) * 2018-12-25 2019-05-28 南京师范大学 Automatic driving vehicle control method and system, automatic driving vehicle and vision prosthesis
TWI740083B (en) * 2018-12-27 2021-09-21 雅得近顯股份有限公司 Low-light environment display structure
CN109696747A (en) * 2019-01-16 2019-04-30 京东方科技集团股份有限公司 A kind of VR display device and its control method
CN109696747B (en) * 2019-01-16 2022-04-12 京东方科技集团股份有限公司 VR display device and control method thereof
CN109886170A (en) * 2019-02-01 2019-06-14 长江水利委员会长江科学院 Intelligent oncomelania detection, identification and statistics system
CN111665622B (en) * 2019-03-06 2022-07-08 株式会社理光 Optical device, retina projection display device, and head-mounted display device
CN111665622A (en) * 2019-03-06 2020-09-15 株式会社理光 Optical device, retina projection display device, and head-mounted display device
US11803057B2 (en) 2019-03-06 2023-10-31 Ricoh Company, Ltd. Optical device, retinal projection display, head-mounted display, and optometric apparatus
TWI711005B (en) * 2019-03-14 2020-11-21 宏碁股份有限公司 Method for adjusting luminance of images and computer program product
CN110197601A (en) * 2019-04-24 2019-09-03 薄涛 Mixed reality glasses, mobile terminal and tutoring system, method and medium
CN111857328A (en) * 2019-04-30 2020-10-30 苹果公司 Head-mounted device
CN110110458A (en) * 2019-05-14 2019-08-09 西安电子科技大学 The conformal array antenna modeling method of deformation based on high order MoM
CN110110458B (en) * 2019-05-14 2023-03-14 西安电子科技大学 Deformation conformal array antenna modeling method based on high-order moment method
CN110197142A (en) * 2019-05-16 2019-09-03 谷东科技有限公司 Object identification method, device, medium and terminal device under faint light condition
CN110175065A (en) * 2019-05-29 2019-08-27 广州视源电子科技股份有限公司 A kind of display methods of user interface, device, equipment and storage medium
CN110210390A (en) * 2019-05-31 2019-09-06 维沃移动通信有限公司 Fingerprint collecting mould group, fingerprint collecting method and terminal
CN110363205A (en) * 2019-06-25 2019-10-22 浙江大学 A kind of image characteristic extraction system and method based on Talbot effect optical convolution
TWI807066B (en) * 2019-07-08 2023-07-01 怡利電子工業股份有限公司 Glasses-free 3D reflective diffuser head-up display device
CN110361707B (en) * 2019-08-09 2023-03-14 成都玖锦科技有限公司 Dynamic simulation method for motion state of radiation source
CN110361707A (en) * 2019-08-09 2019-10-22 成都玖锦科技有限公司 Dynamic simulation method for motion state of radiation source
CN112433187A (en) * 2019-08-26 2021-03-02 通用电气精准医疗有限责任公司 MRI system comprising a patient motion sensor
US11899238B2 (en) 2019-08-29 2024-02-13 Digilens Inc. Evacuated gratings and methods of manufacturing
CN112462932A (en) * 2019-09-06 2021-03-09 苹果公司 Gesture input system with wearable or handheld device based on self-mixing interferometry
CN111160105A (en) * 2019-12-03 2020-05-15 北京文香信息技术有限公司 Video image monitoring method, device, equipment and storage medium
CN111048215B (en) * 2019-12-13 2023-08-18 北京纵横无双科技有限公司 Medical video production method and system based on CRM
CN111048215A (en) * 2019-12-13 2020-04-21 北京纵横无双科技有限公司 CRM-based medical video production method and system
CN111179301B (en) * 2019-12-23 2023-06-30 北京中广上洋科技股份有限公司 Motion trend analysis method based on computer video
CN111179301A (en) * 2019-12-23 2020-05-19 北京中广上洋科技股份有限公司 Motion trend analysis method based on computer video
CN112327313A (en) * 2020-01-14 2021-02-05 必虎嘉骁光电技术(重庆)有限公司 Binocular range finder
CN112327313B (en) * 2020-01-14 2024-03-29 必虎嘉骁光电技术(重庆)有限公司 Binocular range finder
US11562711B2 (en) * 2020-01-31 2023-01-24 Microchip Technology Incorporated Heads-up display using electrochromic elements
CN111317257A (en) * 2020-03-25 2020-06-23 黑龙江工业学院 Multimedia teacher desk for special children education
CN111426283B (en) * 2020-04-14 2022-12-06 昆山金智汇坤建筑科技有限公司 Laser scanning equipment for building site measurement
CN111426283A (en) * 2020-04-14 2020-07-17 昆山金智汇坤建筑科技有限公司 Laser scanning equipment for building site measurement
US20230168403A1 (en) * 2020-04-21 2023-06-01 Inova Ltd. Motion Aware Nodal Seismic Unit and Related Methods
CN111708170A (en) * 2020-07-10 2020-09-25 温州明镜智能科技有限公司 Novel VR glasses lens assembly structure
WO2022116812A1 (en) * 2020-12-01 2022-06-09 创启社会科技有限公司 Visual impaired assisting smart glasses, and system and control method thereof
CN112370240A (en) * 2020-12-01 2021-02-19 創啟社會科技有限公司 Auxiliary intelligent glasses and system for vision impairment and control method thereof
GB2611481A (en) * 2020-12-01 2023-04-05 Innospire Tech Limited Visual impaired assisting smart glasses, and system and control method thereof
CN112807654A (en) * 2020-12-05 2021-05-18 泰州可以信息科技有限公司 Electronic judgment platform and method for heel-and-toe walking race
CN112904803A (en) * 2021-01-15 2021-06-04 西安电子科技大学 Multi-splicing-surface deformation and flatness fine adjustment system, method, equipment and application
CN112904803B (en) * 2021-01-15 2022-05-03 西安电子科技大学 Multi-splicing-surface deformation and flatness fine adjustment system, method, equipment and application
TWI769815B (en) * 2021-02-03 2022-07-01 大立光電股份有限公司 Plastic light-folding element, imaging lens assembly module and electronic device
CN112819590A (en) * 2021-02-25 2021-05-18 紫光云技术有限公司 Method for managing product configuration information in cloud product service delivery process
CN113064280A (en) * 2021-04-08 2021-07-02 恒玄科技(上海)股份有限公司 Intelligent display device
CN113240818A (en) * 2021-04-29 2021-08-10 广东元一科技实业有限公司 Method for simulating and displaying dummy model clothes
CN113115008B (en) * 2021-05-17 2023-05-19 哈尔滨商业大学 Pipe gallery master-slave operation inspection system and method
CN113115008A (en) * 2021-05-17 2021-07-13 哈尔滨商业大学 Pipe gallery master-slave operation inspection system and method based on rapid tracking registration augmented reality technology
CN113569645A (en) * 2021-06-28 2021-10-29 广东技术师范大学 Track generation method, device and system based on image detection
CN113569645B (en) * 2021-06-28 2024-03-22 广东技术师范大学 Track generation method, device and system based on image detection
US11908356B2 (en) * 2021-12-15 2024-02-20 Motorola Mobility Llc Augmented reality display device having contextual adaptive brightness
US11556010B1 (en) * 2022-04-01 2023-01-17 Wen-Tsun Wu Mini display device
CN115049643A (en) * 2022-08-11 2022-09-13 武汉精立电子技术有限公司 Near-to-eye display module interlayer foreign matter detection method, device, equipment and storage medium
CN116186418B (en) * 2023-04-27 2023-07-04 深圳市夜行人科技有限公司 Low-light imaging system recommendation method, system and medium
CN116186418A (en) * 2023-04-27 2023-05-30 深圳市夜行人科技有限公司 Low-light imaging system recommendation method, system and medium
CN116244238A (en) * 2023-05-12 2023-06-09 中国船舶集团有限公司第七〇七研究所 RS422 protocol and RS232 protocol compatible method and circuit for fiber optic gyroscope
CN116244238B (en) * 2023-05-12 2023-07-18 中国船舶集团有限公司第七〇七研究所 RS422 protocol and RS232 protocol compatible method and circuit for fiber optic gyroscope
CN117195738A (en) * 2023-09-27 2023-12-08 广东翼景信息科技有限公司 Base station antenna setting and upper dip angle optimizing method for unmanned aerial vehicle corridor
CN117195738B (en) * 2023-09-27 2024-03-12 广东翼景信息科技有限公司 Base station antenna setting and upper dip angle optimizing method for unmanned aerial vehicle corridor

Also Published As

Publication number Publication date
WO2013049248A3 (en) 2013-07-04
KR20140066258A (en) 2014-05-30
JP2015504616A (en) 2015-02-12
WO2013049248A2 (en) 2013-04-04
CN103946732B (en) 2019-06-14
EP2761362A4 (en) 2014-08-06
EP2761362A2 (en) 2014-08-06

Similar Documents

Publication Publication Date Title
CN103946732B (en) Video display modification based on sensor input for a see-through near-to-eye display
US11275482B2 (en) Ar glasses with predictive control of external device based on event input
US20200192089A1 (en) Head-worn adaptive display
US8964298B2 (en) Video display modification based on sensor input for a see-through near-to-eye display
US20170344114A1 (en) Ar glasses with predictive control of external device based on event input
US8467133B2 (en) See-through display with an optical assembly including a wedge-shaped illumination system
US9223134B2 (en) Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9341843B2 (en) See-through near-eye display glasses with a small scale image source
US8482859B2 (en) See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9366862B2 (en) System and method for delivering content to a group of see-through near eye display eyepieces
US9229227B2 (en) See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9129295B2 (en) See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US8472120B2 (en) See-through near-eye display glasses with a small scale image source
US9182596B2 (en) See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9097891B2 (en) See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US8488246B2 (en) See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US8477425B2 (en) See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US9097890B2 (en) Grating in a light transmissive illumination system for see-through near-eye display glasses
US9134534B2 (en) See-through near-eye display glasses including a modular image source
US20130278631A1 (en) 3d positioning of augmented reality information
US20160187654A1 (en) See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US20120212499A1 (en) System and method for display content control during glasses movement
US20120212484A1 (en) System and method for display content placement using distance and location information
US20120242698A1 (en) See-through near-eye display glasses with a multi-segment processor-controlled optical layer
US20120235887A1 (en) See-through near-eye display glasses including a partially reflective, partially transmitting optical element and an optically flat film

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150727

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150727

Address after: Washington State

Applicant after: Microsoft Technology Licensing, LLC

Address before: Washington State

Applicant before: Microsoft Corp.

GR01 Patent grant