US20020067372A1 - Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts - Google Patents

Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts

Info

Publication number
US20020067372A1
US20020067372A1 (application US09/945,771)
Authority
US
United States
Prior art keywords: data, information data, user, location, information
Prior art date: 1999-03-02
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/945,771
Inventor
Wolfgang Friedrich
Wolfgang Wohlgemuth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 1999-03-02
Filing date: 2001-09-04
Publication date: 2002-06-06
Application filed by Siemens AG filed Critical Siemens AG
Assigned to SIEMENS AKTIENGESELLSCHAFT. Assignment of assignors interest (see document for details). Assignors: WOHLGEMUTH, WOLFGANG; FRIEDRICH, WOLFGANG
Publication of US20020067372A1 publication Critical patent/US20020067372A1/en

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41875 - Total factory control characterised by quality surveillance of production
    • G05B19/4183 - Total factory control characterised by data acquisition, e.g. workpiece identification
    • G05B19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 - Numerical control [NC] characterised by using manual input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details, by setting parameters
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/31 - From computer integrated manufacturing till monitoring
    • G05B2219/31027 - Computer assisted manual assembly CAA, display operation, tool, result
    • G05B2219/32 - Operator till task planning
    • G05B2219/32014 - Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35482 - Eyephone, head-mounted 2-D or 3-D display, also voice and other control
    • G05B2219/35494 - Online documentation, manual, procedures, operator, user guidance, assistance
    • G05B2219/35495 - Messages to operator in multimedia, voice and image and text
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]


Abstract

Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts.
The invention relates to a system for, and a method of, utilizing expert knowledge at a remote location, wherein information data, for example in the form of video images, are transmitted by means of augmented-reality means from a first location occupied by a skilled operator to a remote expert at a second location, and wherein the remote expert transmits additional information data in the form of augmented-reality information to the skilled operator at the first location.

Description

    BACKGROUND OF THE INVENTION
  • The invention relates to an augmented-reality system and method for transmitting first information data from a user, for example a skilled operator at a first location, to a remote expert at a second location. [0001]
  • Such a system and method are used, for example, in the field of automation technology, for production machinery and machine tools, in diagnostic/service support systems, and for complex components, equipment and systems such as, for example, vehicles and industrial machinery and plants. [0002]
  • The contribution by Daude, R. et al.: “Head-Mounted Display als facharbeiterorientierte Unterstützungskomponente an CNC-Werkzeugmaschinen” [“Head-Mounted Display as a component to assist skilled operators of CNC machine tools”], Werkstattstechnik, DE, Springer Verlag, Berlin, Vol. 86, No. 5, May 1, 1996, pp. 248-252, XP000585192, ISSN 0340-4544, describes the head-mounted display (HMD) as a component to assist the skilled operator with the steps of setting up, feeding and malfunction management in milling operations. The technical integration of the HMD and modern NC control is explained, and the results of a laboratory trial of the HMD are mentioned. [0003]
  • It is an object of the invention to specify a system and a method which, in concrete operational situations, permits rapid and reliable access to expert knowledge in a simple and cost-effective manner. [0004]
  • This object is achieved by a system and by a method having the features specified in claims 1 and 6, respectively. [0005]
  • The invention is based on the insight that present-day machine tools and/or production machinery are of increasingly complex design, which in many cases, for example in the event of service and repairs, requires the involvement of a specialist, often an expert from the manufacturer. This means that the expert has to travel to the site and that a number of experts must be trained to achieve greater availability. The technical solution for mitigating this problem consists in transmitting the first information data, i.e. all the information necessary for accomplishing a task that requires an expert, on-line with the aid of augmented-reality means from a user, e.g. a skilled operator, to an expert at a remote second location. As a result, the expert is virtually linked into the proceedings on site and is in turn able, with the aid of the augmented-reality system, to transmit his knowledge to the skilled operator in situ in the form of second information data. This results in the synchronous presence of service-relevant data at the location of the skilled operator and of the specialized expert, who works centrally, for example at the manufacturer's site. [0006]
  • The information and/or documentation data can, for example, be data compiled and collected while a plant, an automation technology-controlled system or a process was set up, and/or documentation data maintained and, when necessary, updated according to predefinable criteria during operation of a plant or an automation system. [0007]
  • Advantageous refinements consist in the documentation data being static and/or dynamic information data. Examples of such static information include technical data from manuals, exploded views, maintenance instructions, etc. Examples of dynamic information include process values such as temperature, pressure, signals, etc. [0008]
  • Rapid, situationally appropriate access to the documentation data is further assisted by the feature that the acquisition means include an image recording device, that the analyzing means are provided for analyzing the real information in such a way that an operational context, particularly an object of the documentation data, is determined from the real information, and that the system includes visualization means for visualizing the documentation data. [0009]
  • Rapid, situationally appropriate access to the documentation data is further assisted by the feature that the acquisition means are user-controlled and are designed, in particular, as speech-controlled acquisition means and/or acquisition means controlled by control data. [0010]
  • The deployment of augmented-reality techniques on the basis of the static and/or dynamic documentation data and/or process data can be optimized for many applications by the acquisition means and/or the visualizing means being designed as data goggles. [0011]
  • The invention is described and explained below in more detail with reference to the specific embodiments depicted in the figures, in which: [0012]
  • FIG. 1 shows a block diagram of a first embodiment of an augmented-reality system; [0013]
  • FIG. 2 shows a further block diagram of a first embodiment of an augmented-reality system; and [0014]
  • FIG. 3 shows a specific application for situationally appropriate access to expert knowledge and/or documentation data. [0015]
  • FIG. 1 shows a schematic depiction of an augmented-reality system for transmitting first information data from a first location O1 to a remote second location O2 of an expert for providing assistance to a user at the first location O1, for example in the event of servicing or a repair, by the remote expert at the second location. The user, not explicitly shown in FIG. 1, is equipped with mobile equipment 4, 6. The mobile equipment 4, 6 includes data goggles 4 fitted with a video camera 2 and a microphone 11. The data goggles are linked to a device for communication without the use of wires, for example a radio transceiver 6, which can communicate with the automation system A1 . . . An via a radio interface 15. The automation system A1 . . . An can be linked, via a data link 14, to an augmented-reality system 10, hereinafter also abbreviated as AR system. The AR system includes an information module 1b for storing or accessing information data, an AR base module 8 and an AR application module 9. The AR system 10 can be linked to the Internet 5 via a data link 13, with optional access to further storage data and documentation data 1a via an Internet link 12 shown by way of example. [0016]
  • The user who is equipped with the data goggles 4 and the mobile radio transceiver 6 is able to move freely within the plant A1 . . . An for maintenance and service purposes. For example, if maintenance of, or repair to, a particular subcomponent of plants A1 . . . An has to be carried out, appropriate access to the relevant documentation data 1a, 1b is established with the aid of the camera 2 of the data goggles 4, optionally controlled by speech commands detected by microphone 11. To do this, a data link with plant A1 . . . An or with an appropriate radio transceiver unit is set up via the radio interface 15, and the data are transmitted to the AR system 10. Within the AR system, the data obtained from the user are analyzed in accordance with the situation, and information data 1a, 1b are accessed automatically or in a manner controlled interactively by the user. The relevant documentation data 1a, 1b obtained are transmitted via the data links 14, 15 to the transceiver 6, with the overall result that an analysis is carried out on the basis of the operational situation detected, said analysis forming the basis for the selection of data from the available static information. This results in a situationally appropriate, object-oriented or component-oriented selection of relevant knowledge from the most up-to-date data sources 1a, 1b. Information is displayed with the aid of the visualization component used in each case, for example a handheld PC or data goggles, using AR-based technologies. The operator in situ is therefore provided only with the information he needs, and this information is always up to date. The service technician therefore does not suffer from information overload from a “100-page manual”. (A schematic sketch of this situation-driven selection step is given at the end of this description.) [0017]
  • FIG. 2 shows a further specific application of a documentation processing system for service and maintenance. The system consists of an augmented-reality system 10 which comprises an information module 1b for storing information data, an AR base system 8 and an AR application module 9. The AR system 10 can be linked to the Internet 5 via connecting lines 13, 18. Thence a link is possible, via an exemplary data link 12, to a remote PC 16 with a remote expert 22. Linkage between the individual modules of the AR system 10 is effected via links 19, 20, 21. The user communication between a user 7 and the AR system is effected via interfaces 8, 23. To this end, the AR system can be linked to a transceiver which enables bidirectional data communication between the AR system 10 and the user 7 via data goggles 4, either directly via the interface 8 or via a transceiver 17, located in the vicinity of the user 7, via an interface 23. The link 23 can be implemented via a separate data link or via the mains as a “power-line” modem. As well as a display device disposed in the vicinity of the eyepieces, the data goggles 4 comprise an image acquisition device 2 in the form of a camera and a microphone 11. With the aid of the data goggles 4, the user 7 can move round the plants A1 . . . An and carry out service or maintenance activities. [0018]
  • With the aid of the data goggles 4 and the corresponding radio transceivers, e.g. the radio transceiver 17 worn by personnel directly on the body, it is possible to achieve preventive functionality: the initial step is the detection of the respective operational situation, for example by the camera 2 or by locating the personnel 7. On the basis of the operational situation detected, a selection of data from the plant A1 . . . An undergoing maintenance is made in the AR system. The fundamental advantage of the system depicted in FIG. 3 is that this system assists the cooperation of the individual single functionalities in an application-relevant manner: i.e. a concrete operational situation is detected automatically, and this operational situation is then analyzed, the aspects relevant at that point being determined automatically from the most up-to-date available static information in conjunction with the dynamic data acquired instantaneously. As a result, for example, assembly suggestions are correlated with current process data (see the overlay sketch at the end of this description). As a result, personnel 7 are provided with a situationally appropriate display of the relevant information, for example by a superposed visualization of the respective data in such a way that the real operational situation in the field of view of the personnel is expanded by the information acquired. As a result, personnel 7 are very rapidly put in the position of being able to act, thereby ensuring the requisite machine operating times. Assistance to the maintenance technician 7 in situ can also be provided via the remote expert 22 and the knowledge 16 available at the location of the remote expert 22. [0019]
  • FIG. 3 shows a specific application of situationally appropriate access to documentation data. FIG. 3 shows a first monitor region B1 which shows a plant component. Shown in the right-hand monitor region B2 is a user 7 who, for example, is looking at an individual plant component. The user 7 is equipped with data goggles 4 which comprise a camera 2 as an acquisition means. Additionally disposed on the data goggles 4 are a microphone 11 and a loudspeaker 16. The left-hand monitor region B1 shows a view of conduits which can be viewed with the data goggles shown in window B2. Marked in the left-hand monitor region B1 are two points P1, P2 which each represent two image details viewed with the aid of the data goggles 4. After the first point P1 has been viewed, i.e. after the conduit disposed at or near point P1 has been viewed, additional information is visualized for the user 7 in the data goggles 4. This additional information 11 consists of documentation data which, regarding the first point P1, include operational instructions for this pipe section and, regarding point P2, comprise the installation instruction to be implemented in a second step. The installation instruction in this case consists of the user 7 being informed of the torque and the sense of rotation of the screwed joint of point P2 via visualization of the additional data 112. The user 7 is therefore very quickly provided with situationally appropriate instructions for the object being viewed. If an intelligent tool is used which is able to detect the torque applied at any given moment, it is also possible for the user to be told, on the basis of the current torque, to increase or reduce the torque as required (see the torque-advice sketch at the end of this description). [0020]
  • Below, background information is provided on the field of application of the invention: this involves an application-oriented requirements analysis and the development of AR-based systems to support operational processes in the development, production and servicing of complex engineering products and plants in fabrication and process technology, and for service support systems, as with motor vehicles, or for maintaining any industrial equipment. [0021]
  • Augmented reality, AR in brief, is a novel type of man-machine interaction of major potential for supporting industrial operational processes. With this technology, the field of view of the observer is enriched with computer-generated virtual objects, which means that intuitive use can be made of product or process information. In addition to the extremely simple interaction, the deployment of portable computers opens up AR application fields involving high mobility requirements, for example if process, measured or simulation data are linked to the real object. [0022]
  • The situation of German industry is characterized by increasing customer requirements in terms of individuality and quality of products and by the development processes taking substantially less time. Especially in developing, producing and servicing complex industrial products and plants it is possible, by means of innovative solutions to man-machine interaction, both to achieve jumps in efficiency and productivity and to design the work so as to enhance competence and training, by the users' need for knowledge and information being supported in a situationally appropriate manner on the basis of data available in any case. [0023]
  • Augmented reality is a technology with numerous innovative fields of application: [0024]
  • In development, for example, a “mixed mock-up” approach based on a mixed-virtual environment can result in a distinct acceleration of the early phases of development. Compared with immersive “virtual reality” (VR) solutions, the user is at a substantial advantage in that the haptic properties can be depicted faithfully with the aid of a real model, whereas aspects of visual perception, e.g. for display variants, can be manipulated in a virtual manner. In addition, there is a major potential for user-oriented validation of computer-assisted models, e.g. for component verification or in crash tests. [0025]
  • In flexible production it is possible, inter alia, to considerably facilitate the process of setting up machinery for qualified skilled operators by displaying, e.g. via mobile AR components, mixed-virtual clamping situations directly in the field of view. Fabrication planning and fabrication control appropriate to the skilled worker in the workshop is facilitated if information regarding the respective order status is perceived directly in situ in connection with the corresponding products. This also applies to fitting, with the option of presenting the individual procedural steps to the fitter in a mixed-virtual manner even in the training phase. In this connection it is possible, e.g. by comparing real fitting procedures with results of simulations, to achieve comprehensive optimizations which both improve the quality of operation scheduling and simplify and accelerate the fitting process in the critical start-up phase. [0026]
  • Finally, regarding service, conventional technologies are by now barely adequate for supporting and documenting the complex diagnostic and repair procedures. Since, however, these processes in many fields are in any case planned on the basis of digital data, AR technologies provide the option of adopting the information sources for maintenance purposes and of explaining the dismantling process to an engineer, e.g. in the data goggles, via the superposition with real objects. Regarding cooperative operation, the AR-assisted “remote eye” permits a distributed problem solution by virtue of a remote expert communicating across global distances with the member of staff in situ. This case is particularly relevant for the predominantly medium-sized machine tool manufacturers. Because of globalization, they are forced to set up production sites for their customers worldwide. Neither, however, is the presence of subsidiaries in all the important markets achievable on economic grounds, nor is it possible to dispense with the profound knowledge of experienced service staff of the parent company with respect to the increasingly more complex plants. [0027]
  • The special feature of man-machine interaction in augmented reality is the very simple and intuitive communication with the computer, supplemented, for example, by multimode interaction techniques such as speech processing or gesture recognition. The use of portable computer units in addition enables entirely novel mobile utilization scenarios, with the option of requesting the specific data at any time via a wireless network. Novel visualization techniques permit direct annotation, e.g. of measured data or simulation data, to the real object or into the real environment. In conjunction with distributed applications, a number of users are able to operate in a real environment with the aid of a shared database (shared augmented environments) or to cooperate with AR support in different environments. [0028]
  • Augmented reality has been the subject of intense research only in the last few years. Consequently, only a few applications exist, either on the national or the international level, usually in the form of scientific prototypes in research establishments. [0029]
  • U.S.A.: As with many novel technologies, the potential uses of augmented reality were first tapped in North America. Examples include cockpit design or maintenance of mechatronic equipment. The aircraft manufacturer Boeing has already carried out initial field trials using AR technology in the assembly field. The upshot is that in this hi-tech area too the U.S.A. occupy a key position, potentially making them technological leaders. [0030]
  • Japan: Various AR developments are being pushed in Japan, e.g. for mixed-virtual building design, telepresence or “cyber-shopping”. The nucleus is formed by the Mixed Reality Systems Laboratory founded in 1997, which is supported jointly as a center of competence by science and by commerce and industry. Particular stimuli in the consumer goods field are likely in the future from the Japanese home electronics industry. [0031]
  • Europe: So far, only very few research groups have been active in Europe in the AR field. One group at the University of Vienna is working on approaches to mixed-real visualization. The IGD group, as part of the ACTS project CICC, which has now come to an end, has developed initial applications for the building industry and a scientific prototype for staff training in car manufacturing. [0032]
  • The invention in particular should be seen in the specific context of the fields of application “production machinery and machine tools” (NC-controlled, automation-technology processes) and “diagnostics/service support systems for complex engineering components/equipment/systems” (e.g. vehicles, but also industrial machinery and plants). [0033]
  • The technical problem posed in this context is AR-based interaction between remote experts, operators in situ and the system. The complexity of present-day machine tools and production machinery in many cases requires the additional deployment of a specialist, often the expert of the firm that supplied the machine. [0034]
  • The solutions up to now have required the expert to come on site. For increased availability, it is necessary for a number of experts to have been trained and to be made available. With the support of AR technologies, not only the operator in situ but also the expert at the supplier's headquarters is provided, for example via a built-in video camera (mounted on the data goggles), with all the real and virtual information. The remote expert is integrated “live” into problem solution support and is able to make additional suggestions to the personnel in situ, again via AR-based technologies (see the remote-expert sketch at the end of this description). [0035]
  • For the small and medium-sized companies it is hardly possible to have the necessary experts available at all manufacturing sites, e.g. of the car industry. AR makes it possible for these companies, which are so important for us in Germany, to hold their own among the “global players”. What constitutes the inventive step? The synchronous presence of service-relevant data at the location of the operator (in situ, at the client's plant . . . somewhere in the world) and of the specialized expert (centrally at the supplier's headquarters, for example in Germany). A particularly significant feature, for example, is that current process data are made available to the remote expert, or, conversely, the operator in situ is assisted by the transmission of characteristic data/curves which are based on extensive measurement results and therefore could not have been compiled on site. [0036]
  • To sum up, the invention therefore relates to a system and a method of utilizing expert knowledge at a remote site, which involves the transmission, by means of augmented-reality means, of information data, e.g. in the form of video images, from a first location occupied by a skilled operator to a remote expert at a second location, and further involves the transmission of additional information data in the form of augmented-reality information by the remote expert to the skilled operator at the first location. [0037]
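
The following sketches are illustrative only and are not part of the patent text. First, the situation-driven selection of documentation data described in paragraph [0017]: a minimal Python sketch, assuming hypothetical names such as ComponentContext and DocumentationStore for the context derived from camera or speech input and for the information sources 1a, 1b; the patent does not specify any concrete data structures.

```python
from dataclasses import dataclass, field

@dataclass
class ComponentContext:
    """Operational context derived from the camera image or a speech command."""
    component_id: str   # e.g. a recognized subcomponent of plant A1...An
    task: str           # e.g. "maintenance" or "repair"

@dataclass
class DocumentationStore:
    """Stands in for the information module 1b and the remote documentation sources 1a."""
    static_docs: dict = field(default_factory=dict)  # component_id -> list of instruction records

    def select(self, ctx: ComponentContext, process_values: dict) -> dict:
        """Return only the documentation relevant to the detected operational situation."""
        docs = self.static_docs.get(ctx.component_id, [])
        return {
            "component": ctx.component_id,
            "task": ctx.task,
            "instructions": [d["text"] for d in docs if ctx.task in d["tasks"]],
            "live_values": process_values,  # dynamic data such as temperature or pressure
        }

# Usage: the AR system derives a context from the acquired data and queries the store.
store = DocumentationStore(static_docs={
    "pump_17": [{"tasks": ["maintenance"], "text": "Close valve V3 before opening the housing."}],
})
ctx = ComponentContext(component_id="pump_17", task="maintenance")
print(store.select(ctx, {"temperature_C": 74.2, "pressure_bar": 5.1}))
```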
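
Next, a minimal sketch of how static assembly suggestions might be correlated with instantaneously acquired process data and turned into superposed overlay items, as described in paragraph [0019]. OverlayItem, anchor_px and the process-tag convention are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class OverlayItem:
    anchor_px: Tuple[int, int]  # where in the camera image the hint is superposed
    text: str

def build_overlay(assembly_steps: List[dict], process_data: Dict[str, float]) -> List[OverlayItem]:
    """Correlate static assembly suggestions with instantaneously acquired process values."""
    items = []
    for step in assembly_steps:
        hint = step["text"]
        tag = step.get("process_tag")
        if tag in process_data:  # enrich the static instruction with the matching live value
            hint += f" (current {tag}: {process_data[tag]})"
        items.append(OverlayItem(anchor_px=step["anchor_px"], text=hint))
    return items

overlay = build_overlay(
    [{"text": "Re-tension belt B2", "anchor_px": (320, 180), "process_tag": "belt_speed_rpm"}],
    {"belt_speed_rpm": 1430},
)
print(overlay[0].text)  # "Re-tension belt B2 (current belt_speed_rpm: 1430)"
```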
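
A small sketch of the torque advice mentioned for the screwed joint at point P2 in paragraph [0020], assuming an intelligent tool that reports the currently applied torque; the function name and the 0.5 Nm tolerance are made up for illustration.

```python
def torque_advice(measured_nm: float, target_nm: float, tolerance_nm: float = 0.5) -> str:
    """Tell the fitter whether to increase or reduce the torque reported by an intelligent tool."""
    if measured_nm < target_nm - tolerance_nm:
        return f"Increase torque: {measured_nm:.1f} Nm of {target_nm:.1f} Nm reached."
    if measured_nm > target_nm + tolerance_nm:
        return f"Reduce torque: {measured_nm:.1f} Nm exceeds the target of {target_nm:.1f} Nm."
    return "Torque within tolerance; screwed joint is done."

print(torque_advice(18.0, 25.0))  # instructs the user to keep tightening
```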
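
Finally, a sketch of the "remote eye" exchange of paragraph [0035]: the operator's equipment sends video frames and current process values, and the remote expert returns augmented-reality annotations. SiteUpdate, ExpertAnnotation and the vibration threshold are hypothetical placeholders for what would in practice be the expert's own judgment.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class SiteUpdate:
    """What the operator's AR equipment sends to the remote expert."""
    video_frame: bytes
    process_values: Dict[str, float]

@dataclass
class ExpertAnnotation:
    """Augmented-reality information returned by the remote expert."""
    anchor_px: Tuple[int, int]
    text: str

def expert_review(update: SiteUpdate) -> List[ExpertAnnotation]:
    """Placeholder for the expert's assessment at the supplier's headquarters."""
    notes: List[ExpertAnnotation] = []
    if update.process_values.get("vibration_mm_s", 0.0) > 7.0:
        notes.append(ExpertAnnotation((100, 200), "Bearing vibration high; inspect the coupling."))
    return notes

update = SiteUpdate(video_frame=b"<jpeg frame>", process_values={"vibration_mm_s": 9.3})
for note in expert_review(update):
    print(note.text)
```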

Claims (10)

We claim:
1. A system for transmitting first information data from a user at a first location to a remote expert at a second location wherein the information data are process values, video images and/or speech signals of the user and wherein additional information data in the form of augmented-reality information are transmitted from the remote expert at the second location to the user at the first location.
2. The system according to claim 1, wherein the information data are static and dynamic information data.
3. The system according to claim 1, wherein the system includes acquisition means comprising sensorics for acquiring the first information data and visualization means for visualizing the second information data.
4. The system according to claim 1, wherein the acquisition means are user-controlled.
5. The system according to claim 1, wherein the visualization means are designed as display devices disposed in the vicinity of eyepieces of data goggles, in that the acquisition means provided is an image acquisition device disposed on the data goggles, and in that a microphone disposed on the data goggles is provided to detect speech commands.
6. A method of transmitting first information data by means of an augmented-reality system from a user, for example a skilled operator, at a first location to a remote expert at a second location, wherein the information data in particular are process values, signal values, video images and/or speech signals of the user, and wherein additional information data in the form of augmented-reality information are transmitted from the remote expert at the second location to the user at the first location.
7. The method according to claim 6, wherein the documentation data are static and/or dynamic information data.
8. The method according to claim 6, wherein the first information data are acquired by means of acquisition means comprising sensorics, in particular by means of an image recording device, and in that the second information data are visualized to the user by means of visualizing means.
9. The method according to claim 6, wherein the acquisition means are user-controlled and are designed, in particular, as speech-controlled acquisition means and/or acquisition means controlled by control data.
10. The method according to claim 6, wherein the acquisition means and/or the visualization means are designed as data goggles.
US09/945,771 1999-03-02 2001-09-04 Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts Abandoned US20020067372A1 (en)

Applications Claiming Priority (19)

Application Number Priority Date Filing Date Title
DE19909013.0 1999-03-02
DE19909010.6 1999-03-02
DE19909009.2 1999-03-02
DE19909016 1999-03-02
DE19909011 1999-03-02
DE19909018.1 1999-03-02
DE19909023.8 1999-03-02
DE19909016.5 1999-03-02
DE19909018 1999-03-02
DE19909154.4 1999-03-02
DE19909012 1999-03-02
DE19909012.2 1999-03-02
DE19909154 1999-03-02
DE19909023 1999-03-02
DE19909010 1999-03-02
DE19909011.4 1999-03-02
DE19909013 1999-03-02
DE19909009 1999-03-02
PCT/DE2000/000659 WO2000052537A1 (en) 1999-03-02 2000-03-02 Use of augmented reality fundamental technology for the situation-specific assistance of a skilled worker via remote experts

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2000/000659 Continuation WO2000052537A1 (en) 1999-03-02 2000-03-02 Use of augmented reality fundamental technology for the situation-specific assistance of a skilled worker via remote experts

Publications (1)

Publication Number Publication Date
US20020067372A1 (en) 2002-06-06

Family

ID=27576004

Family Applications (5)

Application Number Title Priority Date Filing Date
US09/945,771 Abandoned US20020067372A1 (en) 1999-03-02 2001-09-04 Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts
US09/945,774 Abandoned US20020069072A1 (en) 1999-03-02 2001-09-04 Augmented-reality system with voice-based recording of information data, in particular of service reports
US09/945,777 Expired - Lifetime US6941248B2 (en) 1999-03-02 2001-09-04 System for operating and observing making use of mobile equipment
US09/945,776 Abandoned US20020046368A1 (en) 1999-03-02 2001-09-04 System for, and method of, situation-relevant asistance to interaction with the aid of augmented-reality technologies
US11/857,931 Expired - Lifetime US8373618B2 (en) 1999-03-02 2007-09-19 Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus

Family Applications After (4)

Application Number Title Priority Date Filing Date
US09/945,774 Abandoned US20020069072A1 (en) 1999-03-02 2001-09-04 Augmented-reality system with voice-based recording of information data, in particular of service reports
US09/945,777 Expired - Lifetime US6941248B2 (en) 1999-03-02 2001-09-04 System for operating and observing making use of mobile equipment
US09/945,776 Abandoned US20020046368A1 (en) 1999-03-02 2001-09-04 System for, and method of, situation-relevant asistance to interaction with the aid of augmented-reality technologies
US11/857,931 Expired - Lifetime US8373618B2 (en) 1999-03-02 2007-09-19 Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus

Country Status (5)

Country Link
US (5) US20020067372A1 (en)
EP (5) EP1183578B1 (en)
JP (5) JP2002538542A (en)
DE (5) DE50007901D1 (en)
WO (7) WO2000052540A1 (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020044162A1 (en) * 2000-07-05 2002-04-18 Ryusuke Sawatari Device for displaying link information and method for displaying the same
US20030043179A1 (en) * 2001-09-06 2003-03-06 International Business Machines Corporation Method and apparatus for providing user support based on contextual information
US20030046410A1 (en) * 2001-09-06 2003-03-06 International Business Machines Corporation Method and apparatus for providing entitlement information for interactive support
US20040183751A1 (en) * 2001-10-19 2004-09-23 Dempski Kelly L Industrial augmented reality
US6871322B2 (en) 2001-09-06 2005-03-22 International Business Machines Corporation Method and apparatus for providing user support through an intelligent help agent
EP1630708A1 (en) * 2004-08-31 2006-03-01 Sysmex Corporation Remote control method, remote control system, status informing device and control apparatus
US20070088526A1 (en) * 2003-11-10 2007-04-19 Wolfgang Friedrich System and method for carrying out and visually displaying simulations in an augmented reality
US8042050B2 (en) * 2001-07-31 2011-10-18 Hewlett-Packard Development Company, L.P. Method and apparatus for interactive broadcasting
CN102292707A (en) * 2011-05-11 2011-12-21 华为终端有限公司 Method and system for implementing augmented reality applications
US20110316845A1 (en) * 2010-06-25 2011-12-29 Palo Alto Research Center Incorporated Spatial association between virtual and augmented reality
US20120019547A1 (en) * 2010-07-23 2012-01-26 Pantech Co., Ltd. Apparatus and method for providing augmented reality using additional data
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
US8760471B2 (en) 2010-04-28 2014-06-24 Ns Solutions Corporation Information processing system, information processing method and program for synthesizing and displaying an image
WO2015125066A1 (en) * 2014-02-19 2015-08-27 Fieldbit Ltd. System and method for facilitating equipment maintenance using smartglasses
WO2015160515A1 (en) 2014-04-16 2015-10-22 Exxonmobil Upstream Research Company Methods and systems for providing procedures in real-time
US20170344958A1 (en) * 2014-12-19 2017-11-30 Robert Bosch Gmbh Identification and repair support device and method
US9955059B2 (en) 2014-10-29 2018-04-24 Kabushiki Kaisha Toshiba Electronic device, method, and computer program product
US20180338119A1 (en) * 2017-05-18 2018-11-22 Visual Mobility Inc. System and method for remote secure live video streaming
US10177547B2 (en) 2015-03-12 2019-01-08 Schleuniger Holding Ag Cable processing machine with improved precision mechanism for cable processing
CN109658519A (en) * 2018-12-28 2019-04-19 吉林大学 Vehicle multi-mode formula augmented reality system based on real traffic information image procossing
US10431108B2 (en) 2015-04-20 2019-10-01 NSF International Computer-implemented techniques for interactively training users to perform food quality, food safety, and workplace safety tasks
US10481594B2 (en) 2015-03-12 2019-11-19 Schleuniger Holding Ag Cable processing machine monitoring with improved precision mechanism for cable processing
WO2019174672A3 (en) * 2018-03-11 2019-11-21 Inovation Gmbh Method for operating a pair of smart glasses, method for assisting a person performing the activity, method for commissioning goods, smart glasses, device for actuating functions, system consisting of smart glasses and a computer communicating therewith, goods store and commissioning trolley
WO2021047823A1 (en) 2019-09-12 2021-03-18 Daimler Ag Method for operating a communication platform for troubleshooting for a motor vehicle, and communication platform
US11094220B2 (en) 2018-10-23 2021-08-17 International Business Machines Corporation Intelligent augmented reality for technical support engineers
US11270459B2 (en) * 2020-04-22 2022-03-08 Dell Products L.P. Enterprise system augmented reality detection
US11397972B2 (en) 2019-05-10 2022-07-26 Controlexpert Gmbh Method for assessing damage to a motor vehicle
US11507400B2 (en) 2020-02-28 2022-11-22 Wipro Limited Method and system for providing real-time remote assistance to a user
US11681970B2 (en) * 2018-04-30 2023-06-20 Telefonaktiebolaget Lm Ericsson (Publ) Automated augmented reality rendering platform for providing remote expert assistance
US11943227B2 (en) 2021-09-17 2024-03-26 Bank Of America Corporation Data access control for augmented reality devices

Families Citing this family (193)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10024412A1 (en) * 2000-05-19 2001-11-29 Westfalia Separator Ind Gmbh Processes for controlling machines and information systems
DE10027136C2 (en) * 2000-05-31 2002-11-21 Luigi Grasso Mobile system for creating a virtual display
US8482488B2 (en) 2004-12-22 2013-07-09 Oakley, Inc. Data input management system for wearable electronically enabled interface
US20120105740A1 (en) 2000-06-02 2012-05-03 Oakley, Inc. Eyewear with detachable adjustable electronics module
DE10127396A1 (en) * 2000-06-13 2001-12-20 Volkswagen Ag Method for utilization of old motor vehicles using a sorting plant for removal of operating fluids and dismantling of the vehicle into components parts for sorting uses augmented reality (AR) aids to speed and improve sorting
DE10048743C2 (en) * 2000-09-29 2002-11-28 Siemens Ag automation system
DE10048563B4 (en) * 2000-09-30 2010-11-25 Meissner, Werner Device for the remote maintenance of technical equipment
US20020044152A1 (en) * 2000-10-16 2002-04-18 Abbott Kenneth H. Dynamic integration of computer generated and real world images
ITBO20000608A1 (en) 2000-10-18 2002-04-18 Gd Spa METHOD AND AUTOMATIC MACHINE FOR THE PROCESSING OF A PRODUCT
US20120154438A1 (en) * 2000-11-06 2012-06-21 Nant Holdings Ip, Llc Interactivity Via Mobile Image Recognition
DE10063089C1 (en) * 2000-12-18 2002-07-25 Siemens Ag User-controlled linking of information within an augmented reality system
DE10103922A1 (en) * 2001-01-30 2002-08-01 Physoptics Opto Electronic Gmb Interactive data viewing and operating system
DE10108064A1 (en) * 2001-02-20 2002-09-05 Siemens Ag Linked eye tracking information within an augmented reality system
US7013009B2 (en) 2001-06-21 2006-03-14 Oakley, Inc. Eyeglasses with wireless communication features
FI20012231A (en) * 2001-06-21 2002-12-22 Ismo Rakkolainen System for creating a user interface
JP2003080482A (en) * 2001-09-07 2003-03-18 Yaskawa Electric Corp Robot teaching device
US7451126B2 (en) * 2001-10-18 2008-11-11 Omron Corporation State space navigation system, user system and business methods for machine to machine business
DE10159610B4 (en) * 2001-12-05 2004-02-26 Siemens Ag System and method for creating documentation of work processes, especially in the area of production, assembly, service or maintenance
EP1487616B1 (en) * 2002-03-20 2010-06-30 Volkswagen Aktiengesellschaft Automatic process control
DE10320268B4 (en) * 2002-05-31 2012-08-16 Heidelberger Druckmaschinen Ag Device and method for finding and displaying information
DE10255056A1 (en) * 2002-11-25 2004-06-03 Grob-Werke Burkhart Grob E.K. Station with operator panel, esp. in processing or manufacturing-line, has portable operator panel wirelessly connected with station or control device of station and/or central control unit
SE0203908D0 (en) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
DE10305384A1 (en) 2003-02-11 2004-08-26 Kuka Roboter Gmbh Method and device for visualizing computer-aided information
WO2004074949A1 (en) * 2003-02-24 2004-09-02 Bayerische Motoren Werke Aktiengesellschaft Method and device for visualising an automotive repair cycle
DE10325894B4 (en) 2003-06-06 2010-12-09 Siemens Ag Tool or production machine with display unit for the visualization of work processes
DE10325895A1 (en) 2003-06-06 2005-01-05 Siemens Ag Tool or production machine with head-up display
DE10326627A1 (en) * 2003-06-11 2005-01-05 Endress + Hauser Gmbh + Co. Kg Method for displaying the function of a field device of process automation technology
US20050022228A1 (en) * 2003-07-21 2005-01-27 Videotest Llc Digital recording-based computer testing and debugging system
DE102004016329A1 (en) * 2003-11-10 2005-05-25 Siemens Ag System and method for performing and visualizing simulations in an augmented reality
US8600550B2 (en) * 2003-12-12 2013-12-03 Kurzweil Technologies, Inc. Virtual encounters
US9841809B2 (en) * 2003-12-12 2017-12-12 Kurzweil Technologies, Inc. Virtual encounters
US9948885B2 (en) 2003-12-12 2018-04-17 Kurzweil Technologies, Inc. Virtual encounters
US9971398B2 (en) * 2003-12-12 2018-05-15 Beyond Imagination Inc. Virtual encounters
US20050130108A1 (en) * 2003-12-12 2005-06-16 Kurzweil Raymond C. Virtual encounters
DE102005011616B4 (en) * 2004-05-28 2014-12-04 Volkswagen Ag Mobile tracking unit
DE102004044718A1 (en) * 2004-09-10 2006-03-16 Volkswagen Ag Augmented reality help instruction generating system for e.g. aircraft, has control unit producing help instruction signal, representing help instruction in virtual space of three-dimensional object model, as function of interaction signal
DE102004053774A1 (en) * 2004-11-08 2006-05-11 Siemens Ag System for measuring and interpreting brain activity
DE102005061211B4 (en) 2004-12-22 2023-04-06 Abb Schweiz Ag Method for creating a human-machine user interface
US7715037B2 (en) 2005-03-01 2010-05-11 Xerox Corporation Bi-directional remote visualization for supporting collaborative machine troubleshooting
DE102005009437A1 (en) * 2005-03-02 2006-09-07 Kuka Roboter Gmbh Method and device for fading AR objects
US8150666B2 (en) * 2005-03-14 2012-04-03 Holomar, Inc. Methods and systems for combining models of goods and services
JP4933164B2 (en) 2005-07-01 2012-05-16 キヤノン株式会社 Information processing apparatus, information processing method, program, and storage medium
US7362738B2 (en) * 2005-08-09 2008-04-22 Deere & Company Method and system for delivering information to a user
US7920071B2 (en) * 2006-05-26 2011-04-05 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method providing status and control of unmanned vehicles
US9323055B2 (en) * 2006-05-26 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
EP2095178B1 (en) 2006-12-14 2015-08-12 Oakley, Inc. Wearable high resolution audio visual interface
US20080218331A1 (en) * 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
DE102007025796B4 (en) * 2007-06-02 2010-07-15 Koenig & Bauer Aktiengesellschaft Mobile control station of a rotary printing machine
US20090037378A1 (en) * 2007-08-02 2009-02-05 Rockwell Automation Technologies, Inc. Automatic generation of forms based on activity
WO2009036782A1 (en) * 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing apparatus and method for remote technical assistance
EP2206041A4 (en) * 2007-10-01 2011-02-16 Iconics Inc Visualization of process control data
KR100914848B1 (en) * 2007-12-15 2009-09-02 한국전자통신연구원 Method and architecture of mixed reality system
US8485038B2 (en) 2007-12-18 2013-07-16 General Electric Company System and method for augmented reality inspection and data visualization
WO2009094587A1 (en) * 2008-01-23 2009-07-30 Deering Michael F Eye mounted displays
US9812096B2 (en) 2008-01-23 2017-11-07 Spy Eye, Llc Eye mounted displays and systems using eye mounted displays
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays
DE102008009446A1 (en) * 2008-02-15 2009-08-20 Volkswagen Ag Method for examining complex system, particularly motor vehicle, on deviations from quality specifications and on defectiveness, involves entering method state information by data input device in state information storage by testing person
DE102008020772A1 (en) * 2008-04-21 2009-10-22 Carl Zeiss 3D Metrology Services Gmbh Presentation of results of a measurement of workpieces
DE102008020771A1 (en) * 2008-04-21 2009-07-09 Carl Zeiss 3D Metrology Services Gmbh Deviation determining method, involves locating viewers at viewing position of device and screen, such that viewers view each position of exemplars corresponding to measured coordinates of actual condition
US7980512B1 (en) * 2008-06-13 2011-07-19 The Boeing Company System and method for displaying aerial refueling symbology
US20100082118A1 (en) * 2008-09-30 2010-04-01 Rockwell Automation Technologies, Inc. User interface display object for logging user-implemented solutions to industrial field problems
DE102009021729A1 (en) * 2009-05-11 2010-11-18 Michael Weinig Ag Machine for machining workpieces made of wood, plastic and the like
TWI423112B (en) * 2009-12-09 2014-01-11 Ind Tech Res Inst Portable virtual human-machine interaction device and method therewith
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
EP2539759A1 (en) * 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US8351057B2 (en) 2010-03-18 2013-01-08 Xerox Corporation Self-powered user interface providing assembly instructions
SG182514A1 (en) * 2010-03-30 2012-08-30 Ns Solutions Corp Information processing apparatus, information processing method, and program
CN101833896B (en) * 2010-04-23 2011-10-19 西安电子科技大学 Geographic information guide method and system based on augmented reality
US8621362B2 (en) 2011-01-21 2013-12-31 Xerox Corporation Mobile screen methods and systems for collaborative troubleshooting of a device
JP2012043396A (en) * 2010-08-13 2012-03-01 Hyundai Motor Co Ltd System and method for managing vehicle consumables using augmented reality
KR101219933B1 (en) 2010-09-13 2013-01-08 현대자동차주식회사 System for controlling a device in a vehicle using augmented reality and method thereof
WO2012060039A1 (en) * 2010-11-02 2012-05-10 Necカシオモバイルコミュニケーションズ株式会社 Information processing system and information processing method
US8490877B2 (en) 2010-11-09 2013-07-23 Metrologic Instruments, Inc. Digital-imaging based code symbol reading system having finger-pointing triggered mode of operation
CN102116876B (en) * 2011-01-14 2013-04-17 中国科学院上海技术物理研究所 Method for space-based detection of spatial point targets on the basis of a track cataloguing model
JP2012155403A (en) * 2011-01-24 2012-08-16 Yokogawa Electric Corp Field apparatus monitoring system
US20120249588A1 (en) * 2011-03-22 2012-10-04 Panduit Corp. Augmented Reality Data Center Visualization
DE102011017305A1 (en) * 2011-04-15 2012-10-18 Abb Technology Ag Operating and monitoring system for technical installations
US20120304059A1 (en) * 2011-05-24 2012-11-29 Microsoft Corporation Interactive Build Instructions
US9329594B2 (en) * 2011-06-06 2016-05-03 Paramit Corporation Verification methods and systems for use in computer directed assembly and manufacture
US20120326948A1 (en) * 2011-06-22 2012-12-27 Microsoft Corporation Environmental-light filter for see-through head-mounted display device
US8872852B2 (en) 2011-06-30 2014-10-28 International Business Machines Corporation Positional context determination with multi marker confidence ranking
WO2013123264A1 (en) 2012-02-17 2013-08-22 Oakley, Inc. Systems and methods for removably coupling an electronic device to eyewear
DE102013010719A1 (en) 2012-07-30 2014-01-30 Heidelberger Druckmaschinen Ag Machine state-based display of documentation
US8965624B2 (en) 2012-08-14 2015-02-24 Ebay Inc. Method and system of vehicle tracking portal
US8933970B2 (en) 2012-09-11 2015-01-13 Longsand Limited Controlling an augmented reality object
DE102012217570A1 (en) * 2012-09-27 2014-03-27 Krones Ag Method for supporting operating and changeover processes
US9120226B2 (en) 2012-10-23 2015-09-01 Lincoln Global, Inc. System and method for remotely positioning an end effector
US9952438B1 (en) * 2012-10-29 2018-04-24 The Boeing Company Augmented reality maintenance system
US9959190B2 (en) 2013-03-12 2018-05-01 International Business Machines Corporation On-site visualization of component status
ITBO20130107A1 (en) * 2013-03-12 2014-09-13 Gd Spa Operator support system in the management of an automatic machine and corresponding method and automatic machine
EP2973533A4 (en) 2013-03-15 2016-11-30 Oakley Inc Electronic ornamentation for eyewear
JP6138566B2 (en) * 2013-04-24 2017-05-31 川崎重工業株式会社 Component mounting work support system and component mounting method
CN205691887U (en) 2013-06-12 2016-11-16 奥克利有限公司 Modular communication system and glasses communication system
ES2525104B1 (en) * 2013-06-17 2015-09-29 Proyectos, Ingeniería Y Gestión, Sociedad Anónima (P.R.O.I.N.G.E., S.A.) Supervision and support system for manual industrial assembly operations through augmented reality and use procedure
DE102013211502A1 (en) * 2013-06-19 2014-12-24 Robert Bosch Gmbh identification device
JP6355909B2 (en) * 2013-10-18 2018-07-11 三菱重工業株式会社 Inspection record apparatus and inspection record evaluation method
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US9993335B2 (en) 2014-01-08 2018-06-12 Spy Eye, Llc Variable resolution eye mounted displays
US10068173B2 (en) * 2014-05-22 2018-09-04 Invuity, Inc. Medical device featuring cladded waveguide
DE102014012710A1 (en) 2014-08-27 2016-03-03 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
US9746913B2 (en) 2014-10-31 2017-08-29 The United States Of America As Represented By The Secretary Of The Navy Secured mobile maintenance and operator system including wearable augmented reality interface, voice command interface, and visual recognition systems and related methods
US9697432B2 (en) 2014-12-09 2017-07-04 International Business Machines Corporation Generating support instructions by leveraging augmented reality
EP3241084B1 (en) 2014-12-29 2020-09-16 ABB Schweiz AG Method for identifying a sequence of events associated with a condition in a process plant
US9869996B2 (en) * 2015-01-08 2018-01-16 The Boeing Company System and method for using an internet of things network for managing factory production
DE102015201290A1 (en) * 2015-01-26 2016-07-28 Prüftechnik Dieter Busch AG Positioning two bodies by means of an alignment system with data glasses
WO2016153628A2 (en) 2015-02-25 2016-09-29 Brian Mullins Augmented reality content creation
US10142596B2 (en) 2015-02-27 2018-11-27 The United States Of America, As Represented By The Secretary Of The Navy Method and apparatus of secured interactive remote maintenance assist
EP3073452B1 (en) * 2015-03-26 2020-04-29 Skidata Ag Method for monitoring and controlling an access control system
DE102015207134A1 (en) * 2015-04-20 2016-10-20 Prüftechnik Dieter Busch AG Method for detecting vibrations of a device and vibration detection system
US9589390B2 (en) 2015-05-13 2017-03-07 The Boeing Company Wire harness assembly
JP6554948B2 (en) * 2015-07-07 2019-08-07 セイコーエプソン株式会社 Display device, display device control method, and program
DE102015214350A1 (en) * 2015-07-29 2017-02-02 Siemens Healthcare Gmbh Method for communication between a medical network and a medical operating staff by means of mobile data glasses, as well as mobile data glasses
US11172273B2 (en) 2015-08-10 2021-11-09 Delta Energy & Communications, Inc. Transformer monitor, communications and data collection device
US10055869B2 (en) 2015-08-11 2018-08-21 Delta Energy & Communications, Inc. Enhanced reality system for visualizing, evaluating, diagnosing, optimizing and servicing smart grids and incorporated components
WO2017041093A1 (en) 2015-09-03 2017-03-09 Delta Energy & Communications, Inc. System and method for determination and remediation of energy diversion in a smart grid network
US10984363B2 (en) 2015-09-04 2021-04-20 International Business Machines Corporation Summarization of a recording for quality control
US9838844B2 (en) 2015-09-25 2017-12-05 Ca, Inc. Using augmented reality to assist data center operators
DE102015116401A1 (en) 2015-09-28 2017-03-30 ESSERT Steuerungstechnik GmbH System, in particular augmented reality system, for operation and/or maintenance of a technical system
MX2018004053A (en) 2015-10-02 2018-12-17 Delta Energy & Communications Inc Supplemental and alternative digital data delivery and receipt mesh network realized through the placement of enhanced transformer mounted monitoring devices.
WO2017070646A1 (en) 2015-10-22 2017-04-27 Delta Energy & Communications, Inc. Data transfer facilitation across a distributed mesh network using light and optical based technology
US9961572B2 (en) 2015-10-22 2018-05-01 Delta Energy & Communications, Inc. Augmentation, expansion and self-healing of a geographically distributed mesh network using unmanned aerial vehicle (UAV) technology
CN106896732B (en) * 2015-12-18 2020-02-04 美的集团股份有限公司 Display method and device of household appliance
US10791020B2 (en) 2016-02-24 2020-09-29 Delta Energy & Communications, Inc. Distributed 802.11S mesh network using transformer module hardware for the capture and transmission of data
EP3214586A1 (en) 2016-03-04 2017-09-06 Thales Deutschland GmbH Method for maintenance support and maintenance support system
EP3223208A1 (en) * 2016-03-22 2017-09-27 Hexagon Technology Center GmbH Self control
US10187686B2 (en) 2016-03-24 2019-01-22 Daqri, Llc Recording remote expert sessions
EP3179450B1 (en) * 2016-04-12 2020-09-09 Siemens Healthcare GmbH Method and system for multi sensory representation of an object
CN105929948B (en) * 2016-04-14 2018-12-04 佛山市威格特电气设备有限公司 Augmented reality-based self-learning intelligent helmet and operation method thereof
US10142410B2 (en) 2016-04-29 2018-11-27 Raytheon Company Multi-mode remote collaboration
PT3260255T (en) * 2016-06-24 2019-11-29 Zuend Systemtechnik Ag System for cutting
US10652633B2 (en) 2016-08-15 2020-05-12 Delta Energy & Communications, Inc. Integrated solutions of Internet of Things and smart grid network pertaining to communication, data and asset serialization, and data modeling algorithms
US11348475B2 (en) * 2016-12-09 2022-05-31 The Boeing Company System and method for interactive cognitive task assistance
JP2018097160A (en) * 2016-12-14 2018-06-21 セイコーエプソン株式会社 Display system, display device, and control method for display device
US11042858B1 (en) 2016-12-23 2021-06-22 Wells Fargo Bank, N.A. Assessing validity of mail item
DE102017201827A1 (en) 2017-02-06 2018-08-09 Carl Zeiss Industrielle Messtechnik Gmbh Method for correcting deviations in a manufacturing process of an article
CN108418776B (en) * 2017-02-09 2021-08-20 上海诺基亚贝尔股份有限公司 Method and apparatus for providing secure services
US10872289B2 (en) 2017-04-08 2020-12-22 Geun Il Kim Method and system for facilitating context based information
US10223821B2 (en) 2017-04-25 2019-03-05 Beyond Imagination Inc. Multi-user and multi-surrogate virtual encounters
DE102017207992A1 (en) * 2017-05-11 2018-11-15 Homag Gmbh Process for monitoring a manufacturing process
US10748443B2 (en) 2017-06-08 2020-08-18 Honeywell International Inc. Apparatus and method for visual-assisted training, collaboration, and monitoring in augmented/virtual reality in industrial automation systems and other systems
CN107358657B (en) * 2017-06-30 2019-01-15 海南职业技术学院 Method and system for realizing interaction based on augmented reality
US10573081B2 (en) * 2017-08-03 2020-02-25 Taqtile, Inc. Authoring virtual and augmented reality environments via an XR collaboration application
DE102017215114A1 (en) * 2017-08-30 2019-02-28 Deutsches Zentrum für Luft- und Raumfahrt e.V. Manipulator system and method for controlling a robotic manipulator
KR102434402B1 (en) * 2017-09-19 2022-08-22 한국전자통신연구원 Apparatus and method for providing mixed reality content
US11080931B2 (en) * 2017-09-27 2021-08-03 Fisher-Rosemount Systems, Inc. Virtual x-ray vision in a process control environment
IT201700114872A1 (en) * 2017-10-12 2019-04-12 New Changer S R L Vision apparatus for help with fiber optic wiring
EP3701355A1 (en) * 2017-10-23 2020-09-02 Koninklijke Philips N.V. Self-expanding augmented reality-based service instructions library
EP3483104B1 (en) 2017-11-10 2021-09-01 Otis Elevator Company Systems and methods for providing information regarding elevator systems
GB201719274D0 (en) * 2017-11-21 2018-01-03 Agco Int Gmbh Implement tractor connection application
WO2019123187A1 (en) * 2017-12-20 2019-06-27 Nws Srl Virtual training method
US11074292B2 (en) * 2017-12-29 2021-07-27 Realwear, Inc. Voice tagging of video while recording
JP7017777B2 (en) * 2018-02-01 2022-02-09 国立研究開発法人産業技術総合研究所 Information processing device, information processing method, and program for information processing device
TWI659279B (en) * 2018-02-02 2019-05-11 國立清華大學 Process planning apparatus based on augmented reality
CN108388138A (en) * 2018-02-02 2018-08-10 宁夏玲杰科技有限公司 Apparatus control method, apparatus and system
US10796153B2 (en) * 2018-03-12 2020-10-06 International Business Machines Corporation System for maintenance and repair using augmented reality
US10839214B2 (en) 2018-03-13 2020-11-17 International Business Machines Corporation Automated intent to action mapping in augmented reality environments
EP3797389A1 (en) 2018-05-21 2021-03-31 I.M.A. Industria Macchine Automatiche S.p.A Method to assist an operator in performing interventions on an operating machine
US20210256771A1 (en) * 2018-06-12 2021-08-19 Current Lighting Solutions, Llc Integrated management of sensitive controlled environments and items contained therein
US11200811B2 (en) 2018-08-03 2021-12-14 International Business Machines Corporation Intelligent recommendation of guidance instructions
US11244509B2 (en) 2018-08-20 2022-02-08 Fisher-Rosemount Systems, Inc. Drift correction for industrial augmented reality applications
CN110488790A (en) * 2018-11-30 2019-11-22 国核自仪系统工程有限公司 Nuclear power instrument control operational system based on augmented reality
WO2020120180A1 (en) * 2018-12-10 2020-06-18 Koninklijke Philips N.V. Systems and methods for augmented reality-enhanced field services support
DE102019104822A1 (en) 2019-02-26 2020-08-27 Wago Verwaltungsgesellschaft Mbh Method and device for monitoring an industrial process step
US10481579B1 (en) 2019-02-28 2019-11-19 Nanotronics Imaging, Inc. Dynamic training for assembly lines
US11209795B2 (en) * 2019-02-28 2021-12-28 Nanotronics Imaging, Inc. Assembly error correction for assembly lines
DE102019002139A1 (en) * 2019-03-26 2020-10-01 Diehl Defence Gmbh & Co. Kg Procedure for process documentation
JP6993382B2 (en) * 2019-04-26 2022-02-04 ファナック株式会社 Robot teaching device
US11166050B2 (en) * 2019-12-11 2021-11-02 At&T Intellectual Property I, L.P. Methods, systems, and devices for identifying viewed action of a live event and adjusting a group of resources to augment presentation of the action of the live event
US11080938B1 (en) 2020-03-30 2021-08-03 International Business Machines Corporation Automatic summarization of remotely-guided augmented reality sessions
US11138802B1 (en) 2020-04-06 2021-10-05 Saudi Arabian Oil Company Geo-augmented field excursion for geological sites
US20230252050A1 (en) * 2020-07-06 2023-08-10 Siemens Aktiengesellschaft Method and Analysis System for Technical Devices
US11816887B2 (en) 2020-08-04 2023-11-14 Fisher-Rosemount Systems, Inc. Quick activation techniques for industrial augmented reality applications
WO2022037758A1 (en) 2020-08-18 2022-02-24 Siemens Aktiengesellschaft Remote collaboration using augmented and virtual reality
EP3971833A1 (en) 2020-09-22 2022-03-23 Koninklijke Philips N.V. Control system for an augmented reality device
KR102605552B1 (en) * 2020-12-29 2023-11-27 주식회사 딥파인 Augmented Reality System
DE102021118085A1 (en) 2021-07-13 2023-01-19 Koenig & Bauer Ag Method for providing information to a printing press and/or peripheral devices
WO2023075308A1 (en) * 2021-10-26 2023-05-04 (주)메가플랜 Electronic device for guiding maintenance of product and method for operating electronic device
EP4345554A1 (en) * 2022-09-30 2024-04-03 Murrelektronik GmbH Method for computer-assisted installation of electrical components of a machine arranged in a spatially decentralised manner

Family Cites Families (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE8305378L (en) * 1983-09-30 1985-03-31 Asea Ab Industrial robot
FR2594968B1 (en) * 1986-02-21 1988-09-16 Alsthom Assistance device for assembly operations of a self-controlled assembly
US4834473A (en) * 1986-03-26 1989-05-30 The Babcock & Wilcox Company Holographic operator display for control systems
US4796206A (en) * 1986-06-02 1989-01-03 International Business Machines Corporation Computer assisted vehicle service featuring signature analysis and artificial intelligence
US5003300A (en) * 1987-07-27 1991-03-26 Reflection Technology, Inc. Head mounted display for miniature video display system
US5136526A (en) * 1987-09-03 1992-08-04 Reinhold Baur Determination of the thickness of a magnetic tape
US5121319A (en) * 1989-10-23 1992-06-09 International Business Machines Corporation Method and apparatus for selective suspension and resumption of computer based manufacturing processes
JP2947840B2 (en) * 1989-12-22 1999-09-13 株式会社日立製作所 Plant operation monitoring device
US5717598A (en) * 1990-02-14 1998-02-10 Hitachi, Ltd. Automatic manufacturability evaluation method and system
EP0490994B1 (en) * 1990-07-09 1997-06-04 Bell Helicopter Textron Inc. Method and apparatus for semi-automated insertion of conductors into harness connectors
JP2865828B2 (en) * 1990-08-22 1999-03-08 株式会社日立製作所 Method and device for displaying operation procedure
DE4119803A1 (en) * 1991-06-15 1992-12-17 Bernd Dipl Ing Kelle Acoustic prompting method for machine tool operation - using speech synthesiser module with programmed instructions and warnings, coupled to position display via interface
US5781913A (en) * 1991-07-18 1998-07-14 Felsenstein; Lee Wearable hypermedium system
US5450596A (en) * 1991-07-18 1995-09-12 Redwear Interactive Inc. CD-ROM data retrieval system using a hands-free command controller and headwear monitor
US5644493A (en) * 1991-08-30 1997-07-01 Nsk Ltd. Production information processing system
DE69221987T2 (en) * 1991-11-01 1998-02-05 Sega Enterprises Kk Imaging device attached to the head
US6081750A (en) * 1991-12-23 2000-06-27 Hoffberg; Steven Mark Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
CA2086449C (en) * 1992-01-06 2000-03-07 Steven W. Rogers Computer interface board for electronic automotive vehicle service
WO1993014454A1 (en) * 1992-01-10 1993-07-22 Foster-Miller, Inc. A sensory integrated data interface
JPH05324039A (en) * 1992-05-26 1993-12-07 Fanuc Ltd Numerical controller
US5320538A (en) * 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method
US5815411A (en) * 1993-09-10 1998-09-29 Criticom Corporation Electro-optic vision system which exploits position and attitude
WO1994025909A1 (en) * 1993-04-23 1994-11-10 Mitsubishi Denki Kabushiki Kaisha Numeric controller of machine tool and method of generating numeric control program
US5590062A (en) * 1993-07-02 1996-12-31 Matsushita Electric Industrial Co., Ltd. Simulator for producing various living environments mainly for visual perception
EP1326122B1 (en) * 1993-08-12 2006-09-06 Seiko Epson Corporation Head-mounted image display device and data processing apparatus including the same
US6061064A (en) * 1993-08-31 2000-05-09 Sun Microsystems, Inc. System and method for providing and using a computer user interface with a view space having discrete portions
US6278461B1 (en) * 1993-09-10 2001-08-21 Geovector Corporation Augmented reality vision systems which derive image information from other vision systems
JP4001643B2 (en) * 1993-10-05 2007-10-31 スナップ−オン・テクノロジイズ・インク Two-hand open type car maintenance equipment
US5475797A (en) 1993-10-22 1995-12-12 Xerox Corporation Menu driven system for controlling automated assembly of palletized elements
US6424321B1 (en) * 1993-10-22 2002-07-23 Kopin Corporation Head-mounted matrix display
US5815126A (en) * 1993-10-22 1998-09-29 Kopin Corporation Monocular portable communication and display system
JPH086708A (en) 1994-04-22 1996-01-12 Canon Inc Display device
JPH07311857A (en) * 1994-05-16 1995-11-28 Fujitsu Ltd Picture compositing and display device and simulation system
JPH085954A (en) * 1994-06-21 1996-01-12 Matsushita Electric Ind Co Ltd Spectacles type picture display device
AUPM701394A0 (en) * 1994-07-22 1994-08-18 Monash University A graphical display system
JP3069014B2 (en) * 1994-10-21 2000-07-24 株式会社東京精密 Coordinate measurement system with operation guidance
JPH08161028A (en) 1994-12-06 1996-06-21 Mitsubishi Electric Corp Operation support system
US7133845B1 (en) * 1995-02-13 2006-11-07 Intertrust Technologies Corp. System and methods for secure transaction management and electronic rights protection
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US6181371B1 (en) * 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US5745387A (en) * 1995-09-28 1998-04-28 General Electric Company Augmented reality maintenance system employing manipulator arm with archive and comparison device
JPH09114543A (en) * 1995-10-02 1997-05-02 Xybernaut Corp Hands-free computer system
US6064749A (en) * 1996-08-02 2000-05-16 Hirota; Gentaro Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
JP3338618B2 (en) * 1996-10-07 2002-10-28 ミノルタ株式会社 Display method and display device for real space image and virtual space image
US5912650A (en) * 1996-10-16 1999-06-15 Kaiser Electro-Optics, Inc. Dichoptic display utilizing a single display device
JP3106107B2 (en) * 1996-11-20 2000-11-06 株式会社東芝 Information communication system in plant
JP3697816B2 (en) * 1997-01-29 2005-09-21 株式会社島津製作所 Patrol inspection support system
JPH10214035A (en) 1997-01-30 1998-08-11 Canon Inc Back light device and liquid crystal display device using the same
JPH10222543A (en) * 1997-02-07 1998-08-21 Hitachi Ltd Checking, maintaining and supporting portable terminal and checking and maintaining method using it
US5912720A (en) * 1997-02-13 1999-06-15 The Trustees Of The University Of Pennsylvania Technique for creating an ophthalmic augmented reality environment
JPH10293790A (en) * 1997-04-17 1998-11-04 Toshiba Corp Power equipment work management device
DE19716327A1 (en) * 1997-04-18 1998-10-29 Branscheid Industrieelektronik Display device for manufacturing information
US6512968B1 (en) * 1997-05-16 2003-01-28 Snap-On Technologies, Inc. Computerized automotive service system
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
JPH1141166A (en) * 1997-07-18 1999-02-12 Omron Corp Radio communication system and terminal equipment therefor
US6064335A (en) * 1997-07-21 2000-05-16 Trimble Navigation Limited GPS based augmented reality collision avoidance system
US6037914A (en) 1997-08-25 2000-03-14 Hewlett-Packard Company Method and apparatus for augmented reality using a see-through head-mounted display
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
JPH11102438A (en) * 1997-09-26 1999-04-13 Minolta Co Ltd Distance image generation device and image display device
US6037882A (en) * 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
US5980084A (en) * 1997-11-24 1999-11-09 Sandia Corporation Method and apparatus for automated assembly
US6625299B1 (en) * 1998-04-08 2003-09-23 Jeffrey Meisner Augmented reality technology
US6255961B1 (en) 1998-05-08 2001-07-03 Sony Corporation Two-way communications between a remote control unit and one or more devices in an audio/visual environment
US6629065B1 (en) * 1998-09-30 2003-09-30 Wisconsin Alumni Research Foundation Methods and apparata for rapid computer-aided design of objects in virtual reality and other environments
US6195618B1 (en) * 1998-10-15 2001-02-27 Microscribe, Llc Component position verification using a probe apparatus
US6356437B1 (en) * 1999-03-29 2002-03-12 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support instruction system
US6697894B1 (en) * 1999-03-29 2004-02-24 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing maintenance instructions to a user at a remote location
US6574672B1 (en) * 1999-03-29 2003-06-03 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support computer communications system
US7165041B1 (en) * 1999-05-27 2007-01-16 Accenture, Llp Web-based architecture sales tool
US6725184B1 (en) * 1999-06-30 2004-04-20 Wisconsin Alumni Research Foundation Assembly and disassembly sequences of components in computerized multicomponent assembly models
US6408257B1 (en) * 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
US7124101B1 (en) * 1999-11-22 2006-10-17 Accenture Llp Asset tracking in a network-based supply chain environment
US7130807B1 (en) * 1999-11-22 2006-10-31 Accenture Llp Technology sharing during demand and supply planning in a network-based supply chain environment
JP3363861B2 (en) * 2000-01-13 2003-01-08 キヤノン株式会社 Mixed reality presentation device, mixed reality presentation method, and storage medium
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
US6556971B1 (en) * 2000-09-01 2003-04-29 Snap-On Technologies, Inc. Computer-implemented speech recognition system training
US6442460B1 (en) * 2000-09-05 2002-08-27 Hunter Engineering Company Method and apparatus for networked wheel alignment communications and services
US6587783B2 (en) * 2000-10-05 2003-07-01 Siemens Corporate Research, Inc. Method and system for computer assisted localization, site navigation, and data navigation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742263A (en) * 1995-12-18 1998-04-21 Telxon Corporation Head tracking system for a head mounted display system
US6172657B1 (en) * 1996-02-26 2001-01-09 Seiko Epson Corporation Body mount-type information display apparatus and display method using the same
US6345207B1 (en) * 1997-07-15 2002-02-05 Honda Giken Kogyo Kabushiki Kaisha Job aiding apparatus
US6091546A (en) * 1997-10-30 2000-07-18 The Microoptical Corporation Eyeglass interface system
US6349001B1 (en) * 1997-10-30 2002-02-19 The Microoptical Corporation Eyeglass interface system

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7171627B2 (en) * 2000-07-05 2007-01-30 Sony Corporation Device for displaying link information and method for displaying the same
US20020044162A1 (en) * 2000-07-05 2002-04-18 Ryusuke Sawatari Device for displaying link information and method for displaying the same
US8042050B2 (en) * 2001-07-31 2011-10-18 Hewlett-Packard Development Company, L.P. Method and apparatus for interactive broadcasting
US20030043179A1 (en) * 2001-09-06 2003-03-06 International Business Machines Corporation Method and apparatus for providing user support based on contextual information
US20030046410A1 (en) * 2001-09-06 2003-03-06 International Business Machines Corporation Method and apparatus for providing entitlement information for interactive support
US6871322B2 (en) 2001-09-06 2005-03-22 International Business Machines Corporation Method and apparatus for providing user support through an intelligent help agent
US6973620B2 (en) * 2001-09-06 2005-12-06 International Business Machines Corporation Method and apparatus for providing user support based on contextual information
US6976067B2 (en) 2001-09-06 2005-12-13 International Business Machines Corporation Method and apparatus for providing entitlement information for interactive support
US20040183751A1 (en) * 2001-10-19 2004-09-23 Dempski Kelly L Industrial augmented reality
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
US20060244677A1 (en) * 2001-10-19 2006-11-02 Dempski Kelly L Industrial augmented reality
US7372451B2 (en) 2001-10-19 2008-05-13 Accenture Global Services Gmbh Industrial augmented reality
US7852355B2 (en) * 2003-11-10 2010-12-14 Siemens Aktiengesellschaft System and method for carrying out and visually displaying simulations in an augmented reality
US20070088526A1 (en) * 2003-11-10 2007-04-19 Wolfgang Friedrich System and method for carrying out and visually displaying simulations in an augmented reality
US8158431B2 (en) 2004-08-31 2012-04-17 Sysmex Corporation Remote control system
EP1630708A1 (en) * 2004-08-31 2006-03-01 Sysmex Corporation Remote control method, remote control system, status informing device and control apparatus
US7998741B2 (en) 2004-08-31 2011-08-16 Sysmex Corporation Remote control system
US20060046299A1 (en) * 2004-08-31 2006-03-02 Sysmex Corporation Remote control method, remote control system, status informing device and control apparatus
US20110191409A1 (en) * 2004-08-31 2011-08-04 Sysmex Corporation Remote control method, remote control system, status informing device and control apparatus
US8394636B2 (en) 2004-08-31 2013-03-12 Sysmex Corporation Remote control method, remote control system, status informing device and control apparatus
US8760471B2 (en) 2010-04-28 2014-06-24 Ns Solutions Corporation Information processing system, information processing method and program for synthesizing and displaying an image
US20110316845A1 (en) * 2010-06-25 2011-12-29 Palo Alto Research Center Incorporated Spatial association between virtual and augmented reality
CN102402778A (en) * 2010-07-23 2012-04-04 株式会社泛泰 Apparatus and method for providing augmented reality using additional data
US20120019547A1 (en) * 2010-07-23 2012-01-26 Pantech Co., Ltd. Apparatus and method for providing augmented reality using additional data
CN102292707B (en) * 2011-05-11 2014-01-08 华为终端有限公司 Method and system for implementing augmented reality applications
US8743146B2 (en) 2011-05-11 2014-06-03 Huawei Device Co., Ltd. Method and system for implementing augmented reality application
CN102292707A (en) * 2011-05-11 2011-12-21 华为终端有限公司 Method and system for implementing augmented reality applications
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
US9128520B2 (en) * 2011-09-30 2015-09-08 Microsoft Technology Licensing, Llc Service provision using personal audio/visual system
WO2015125066A1 (en) * 2014-02-19 2015-08-27 Fieldbit Ltd. System and method for facilitating equipment maintenance using smartglasses
WO2015160515A1 (en) 2014-04-16 2015-10-22 Exxonmobil Upstream Research Company Methods and systems for providing procedures in real-time
US9955059B2 (en) 2014-10-29 2018-04-24 Kabushiki Kaisha Toshiba Electronic device, method, and computer program product
US20170344958A1 (en) * 2014-12-19 2017-11-30 Robert Bosch Gmbh Identification and repair support device and method
US10481594B2 (en) 2015-03-12 2019-11-19 Schleuniger Holding Ag Cable processing machine monitoring with improved precision mechanism for cable processing
US10177547B2 (en) 2015-03-12 2019-01-08 Schleuniger Holding Ag Cable processing machine with improved precision mechanism for cable processing
US10581228B2 (en) 2015-03-12 2020-03-03 Schleuniger Holding Ag Cable processing machine with improved precision mechanism for cable processing
US10431108B2 (en) 2015-04-20 2019-10-01 NSF International Computer-implemented techniques for interactively training users to perform food quality, food safety, and workplace safety tasks
US20180338119A1 (en) * 2017-05-18 2018-11-22 Visual Mobility Inc. System and method for remote secure live video streaming
WO2019174672A3 (en) * 2018-03-11 2019-11-21 Inovation Gmbh Method for operating a pair of smart glasses, method for assisting a person performing the activity, method for commissioning goods, smart glasses, device for actuating functions, system consisting of smart glasses and a computer communicating therewith, goods store and commissioning trolley
US11681970B2 (en) * 2018-04-30 2023-06-20 Telefonaktiebolaget Lm Ericsson (Publ) Automated augmented reality rendering platform for providing remote expert assistance
US11094220B2 (en) 2018-10-23 2021-08-17 International Business Machines Corporation Intelligent augmented reality for technical support engineers
CN109658519A (en) * 2018-12-28 2019-04-19 吉林大学 Vehicle multi-mode augmented reality system based on real traffic information image processing
US11397972B2 (en) 2019-05-10 2022-07-26 Controlexpert Gmbh Method for assessing damage to a motor vehicle
WO2021047823A1 (en) 2019-09-12 2021-03-18 Daimler Ag Method for operating a communication platform for troubleshooting for a motor vehicle, and communication platform
US11507400B2 (en) 2020-02-28 2022-11-22 Wipro Limited Method and system for providing real-time remote assistance to a user
US11270459B2 (en) * 2020-04-22 2022-03-08 Dell Products L.P. Enterprise system augmented reality detection
US11943227B2 (en) 2021-09-17 2024-03-26 Bank Of America Corporation Data access control for augmented reality devices

Also Published As

Publication number Publication date
EP1183578A1 (en) 2002-03-06
JP2003524814A (en) 2003-08-19
WO2000052542A1 (en) 2000-09-08
DE50003377D1 (en) 2003-09-25
EP1159657B1 (en) 2003-08-20
EP1183578B1 (en) 2003-08-20
EP1157315B1 (en) 2004-09-22
EP1157316A1 (en) 2001-11-28
WO2000052537A1 (en) 2000-09-08
EP1157316B1 (en) 2003-09-03
JP2002538543A (en) 2002-11-12
EP1159657A1 (en) 2001-12-05
JP2002538700A (en) 2002-11-12
US20020069072A1 (en) 2002-06-06
WO2000052540A1 (en) 2000-09-08
US6941248B2 (en) 2005-09-06
WO2000052539A1 (en) 2000-09-08
DE50007901D1 (en) 2004-10-28
WO2000052538A1 (en) 2000-09-08
WO2000052541A1 (en) 2000-09-08
US20020046368A1 (en) 2002-04-18
WO2000052536A1 (en) 2000-09-08
US20020049566A1 (en) 2002-04-25
EP1157314A1 (en) 2001-11-28
JP2002538542A (en) 2002-11-12
US20080100570A1 (en) 2008-05-01
DE50003357D1 (en) 2003-09-25
DE50007902D1 (en) 2004-10-28
EP1157314B1 (en) 2004-09-22
US8373618B2 (en) 2013-02-12
JP2002538541A (en) 2002-11-12
DE50003531D1 (en) 2003-10-09
EP1157315A1 (en) 2001-11-28

Similar Documents

Publication Publication Date Title
US20020067372A1 (en) Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts
US7324081B2 (en) Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus
US7814122B2 (en) System and method for documentation processing with multi-layered structuring of information
US7103506B2 (en) System and method for object-oriented marking and associating information with selected technological components
CN108089696B (en) Virtual reality and augmented reality for industrial automation
Friedrich et al. ARVIKA-Augmented Reality for Development, Production and Service.
US7010362B2 (en) Tool for configuring and managing a process control network including the use of spatial information
US7474929B2 (en) Enhanced tool for managing a process control network
US6738040B2 (en) Different display types in a system-controlled, context-dependent information display
CN100592230C (en) Wireless handheld communicator in a process control environment
CN113703569A (en) System and method for virtual reality and augmented reality for industrial automation
Wucherer HMI, the Window to the Manufacturing and Process Industry
Kunnen et al. System based Component Identification using Coordinate Determination in Mixed Reality
Carlberg Development of control system for modularized heating, ventilation and air conditioning test rig - Introducing LabVIEW as a graphical user interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRIEDRICH, WOLFGANG;WOHLGEMUTH, WOLFGANG;REEL/FRAME:012446/0830;SIGNING DATES FROM 20010809 TO 20010911

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION