US20080026838A1 - Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games - Google Patents


Info

Publication number
US20080026838A1
US20080026838A1
Authority
US
United States
Prior art keywords
virtual world
game system
participant
world
player
Prior art date
Legal status
Abandoned
Application number
US11/465,918
Inventor
James E. Dunstan
Steven Bress
Daniel Bress
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US11/465,918
Publication of US20080026838A1
Legal status: Abandoned


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/70: Game security or game management aspects
    • A63F 13/79: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features specially adapted for executing a specific type of game
    • A63F 2300/8082: Virtual reality
    • A63F 2300/8094: Unusual game types, e.g. virtual cooking

Definitions

  • the present invention relates to multi-player Virtual World Games and, more specifically, to systems and methods for allowing two-way interaction in multi-player virtual world games, suitable for broadcast “reality” entertainment and/or location-based entertainment.
  • Multi-Player Virtual World games are well known in the entertainment art and include such online computer games as Everquest, Dark Age of Camelot and World of Warcraft.
  • a player uses a home computer as an interface to these games.
  • Our current invention, in general, teaches new systems and methods to interface with a MVWG, and new MVWGs based on these new systems and methods.
  • Our current invention is concerned with the following broad areas: Role-Playing, Number of Participants, Interaction, Motion Capture, Real-Time, Expression Mapping, Objects, Number of Players, Story Telling and Physical Level of Effort.
  • First-Person Shooters such as Quake and Halo, although not traditionally thought of as RPGs, are considered RPGs for the purposes of our discussion.
  • the ability to aim and fire a weapon in the real world requires a skill set including holding a weapon steady, aiming, breathing techniques during shooting, and reaction to recoil, among others.
  • The skill set for firing a weapon in Quake and Halo includes merely moving a mouse and clicking. Since Avatars in Quake and Halo interact with the virtual world through play mechanics rather than a player's real-world abilities, these are role-playing games.
  • In VR Soccer, a player makes contact with a real soccer ball.
  • the game gathers information about the movement of the ball, and this movement is used to move the ball in the virtual world.
  • VR Soccer would be considered a non-role-playing game.
  • The interactive action in VR Soccer is limited to only kicking the ball; the player neither runs nor otherwise controls his/her position within the videogame.
  • Physical Level of Effort refers to activity in the Real World required to produce a result in the Virtual World. For example, to move forward in World of Warcraft (WoW) a player just needs to press the up arrow, and his character in game will move forward indefinitely. Therefore the physical Level of Effort to play WoW is near zero. To play soccer in real life requires running up and down the field for 90 minutes; the physical Level of Effort to play is therefore "high". To play VR Soccer a player takes 3-4 steps and kicks a soccer ball into the screen. This kicking is repeated a few times during the few minutes of play of the game. Therefore the physical Level of Effort to play VR Soccer is "low."
  • Objects, for this discussion, refer to objects a player may manipulate that exist simultaneously in both the real and virtual world. For example, in VR Soccer a player kicks a real ball into a screen; after the game detects the ball's motion, a virtual ball moves in the virtual world.
  • U.S. Pat. No. 6,162,123 teaches the use of an object, such as a sword as an input device for an interactive game.
  • Current art uses objects in arcade games/simulations. Current art uses objects solely as alternative game/simulation input devices.
  • Motion Capture is well known in the art. It is the process of translating a live performance into a digital performance.
  • U.S. Pat. No. 6,831,603 has an in-depth discussion of the current state of the art of motion capture.
  • Expression Mapping is well known in the art. It is the process of adding facial animation to virtual beings.
  • U.S. Pat. No. 6,735,566 is one of many methods for expression mapping.
  • World of Warcraft has short emotive animations a player may activate for his virtual character. Examples include: cry, cheer, sleep, and yawn.
  • The virtual character's facial expression remains unchanged; the emotion is conveyed by virtual body language and sound. Additionally, a player may misrepresent his current expression with this system.
  • An action in the real world such as clicking a mouse in WoW or hitting a soccer ball in VR Soccer may have an effect in the virtual world.
  • the length of time it takes for the real world action to have an effect in the virtual world is referred to as lag.
  • the less lag the closer to real time a game is.
  • Games such as VR Soccer are not real time, as a player may take 3-4 steps forward before kicking a ball, and then the ball has a certain amount of travel time before being recognized by the game and a virtual ball moved. A few seconds may elapse from when a player initiates movement and when a change is made in the virtual world.
  • Interaction refers to the ability to make changes to the Virtual World.
  • In WoW a player may take the action of collecting a virtual herb. This virtual herb is subsequently not available to another player.
  • In games such as VR Soccer the Virtual World is unchanging, the only change being a player's score.
  • Our USPTO application "An Improved Massively-Multiplayer On-Line Game" teaches the use of unique arcade-style input devices for Massively-Multiplayer On-Line Games.
  • a player of WoW creates a story for his virtual character, of virtual victories and defeats, good deeds and bad.
  • a player of VR Soccer does not create a story, he earns a score, which at most can be compared with prior static scores stored in a particular machine.
  • FIG. 1 presents an overview of current art in the broad areas our current invention is concerned with.
  • the present invention is directed toward a multi-player virtual world game system (the “Game System”) and methods that comprises a new and unique entertainment experience.
  • This system and method comprise a novel combination of elements, which are presented in FIG. 2 .
  • This novel combination of new and previously known elements together comprises a unique entertainment experience, suitable for “reality TV” and/or location-based entertainment.
  • the present novel Game System is directed toward inserting a player into a virtual world, where the player's real world actions, articulated movements, activities and skills determine what actions and activities his Avatar in the virtual world does. Conversely, experiences of the Avatar in the Virtual World are translated back into the real world as feedback.
  • the critical observation is: the contestants undergo the physical challenge of an epic adventure with none of the dangers.
  • the present Game System is able to simulate the physical challenge of a jungle adventure without biting insects, poisonous snakes, leeches, anacondas and piranhas, not to mention the enormous expense a real jungle adventure would entail.
  • the present Game System is a unique combination of elements consisting of:
  • FIG. 1 is an overview of the current art.
  • FIG. 2 is an overview of the current art and our current invention.
  • Motion Capture is well known in the art. It is also a fast evolving field. Our present Game System is not concerned with any specific motion capture system.
  • the motion capture systems allow a computer to determine the real world position of motion capture sensors.
  • A mapping is required between a motion capture system in the real world and an Avatar in the virtual world.
  • the simplest case would be a single sensor on a participant. The participant may be in a 20′ by 20′ room. If there were a similarly square room in the virtual world, the single sensor could be used to tell the computer where in the room the Avatar should be placed in the virtual world, and at what height. If the participant paced around the room, the Avatar could be made to pace around the room at the same rate. For this case, there would not be much more information available, but it does illustrate a simple mapping between the real world and a virtual world.
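The single-sensor mapping just described can be sketched as a simple scale transform. The room size matches the passage; the virtual-room width and coordinate conventions are illustrative assumptions, not the patent's specification:

```python
# Map a single motion-capture sensor's real-world position to an Avatar's
# position in a similarly proportioned virtual room (a sketch; the virtual
# room size and coordinate conventions are assumptions).

REAL_ROOM_FT = 20.0        # 20' x 20' real room, as in the example above
VIRTUAL_ROOM_UNITS = 40.0  # hypothetical virtual-room width in world units

def map_sensor_to_avatar(x_ft, y_ft, height_ft):
    """Scale a sensor reading (in feet) into virtual-world units."""
    scale = VIRTUAL_ROOM_UNITS / REAL_ROOM_FT
    return (x_ft * scale, y_ft * scale, height_ft * scale)

# A participant standing in the center of the room, sensor at 5.5' height:
print(map_sensor_to_avatar(10.0, 10.0, 5.5))  # -> (20.0, 20.0, 11.0)
```

Updating the Avatar from this mapping on every capture frame makes the Avatar pace around the virtual room at the same rate as the participant.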
  • In a more complex system, one that is closer to our preferred embodiment, multiple sensors would be used on a participant.
  • In a single-sensor method, while the position of a participant in the room was known, there was no information available about the participant's body position, or the way that he moved his limbs as he moved around the room.
  • sensors may be placed at either side of every major joint on a participant's arms and legs. This allows even a participant's head position and head tilt angle to be measured accurately.
  • the motion capture system then has very detailed information about the exact manner in which a participant's musculoskeletal system is moving at any given time.
  • This data may be fed into the virtual world for a fairly direct mapping from the real world to the virtual world. That is, a participant's right leg bone, which is connected to the thighbone, just moved to an angle of 7.5 degrees, and the thighbone, which is connected to the hipbone, just moved to 2.4 degrees.
  • This information is used by the virtual world to move the participant Avatar's bones in a similar manner.
  • the motion of an Avatar may take on a very realistic lifelike look and feel.
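The bone-angle forwarding described above can be sketched as copying a frame of captured joint angles onto the Avatar's skeleton. The joint names and the skeleton API here are hypothetical illustrations, not any specific product's interface:

```python
# Forward measured joint angles from a multi-sensor motion-capture rig to
# an Avatar's skeleton (a sketch; joint names and the skeleton API are
# hypothetical).

class AvatarSkeleton:
    def __init__(self):
        self.joint_angles = {}  # joint name -> angle in degrees

    def set_joint(self, joint, angle_deg):
        self.joint_angles[joint] = angle_deg

def apply_capture_frame(avatar, frame):
    """Copy one frame of captured joint angles onto the Avatar."""
    for joint, angle in frame.items():
        avatar.set_joint(joint, angle)

avatar = AvatarSkeleton()
# The leg/thigh angles from the example in the text: 7.5 and 2.4 degrees.
apply_capture_frame(avatar, {"right_knee": 7.5, "right_hip": 2.4})
print(avatar.joint_angles["right_knee"])  # -> 7.5
```

Repeating this for every capture frame is what gives the Avatar's motion its lifelike look.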
  • Interacting with a virtual world is well known in the art, especially in multi-player virtual world games (MVWG) such as Everquest and Dark Age of Camelot.
  • a human interacts with a computer, sending commands for the player's Avatar to do things (e.g. pick up an item).
  • the virtual environment then responds and reacts to these actions (e.g., the item appears in the “inventory” of the Avatar).
  • This interaction is all within the virtual world, and not within the real world of the player.
  • While devices have been used to provide feedback from the virtual world to the player (e.g., force-feedback joysticks, such as the Microsoft Sidewinder Force-Feedback Pro), this tactile feedback cannot be characterized as true two-way interaction.
  • In the Game System, if a human player begins to run on her treadmill, her Avatar will begin running in the virtual world. If the path of the Avatar's run is uphill in the virtual world, this virtual world environment change is fed back through the Game System to the treadmill, whereupon either the angle of the treadmill or the tension of the belt is changed to require the real world human to work harder to "run" uphill in the virtual world. If the exertion is too much for the player, she will stop, turn, or otherwise avoid continuing that path in the virtual world.
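The uphill feedback loop can be sketched as follows. The terrain profile, incline limits, and update function are illustrative assumptions; a real implementation would drive the treadmill's control interface:

```python
# Two-way interaction sketch: the virtual terrain's slope at the Avatar's
# position is fed back to the real-world treadmill as an incline change.
# The terrain function and incline limits are illustrative assumptions.

def terrain_slope_percent(distance_m):
    """Hypothetical virtual-terrain profile: flat, then a 10% hill."""
    return 10.0 if distance_m > 100.0 else 0.0

def update_treadmill(distance_m, min_incline=0.0, max_incline=15.0):
    """Return the incline (%) the treadmill should be set to."""
    slope = terrain_slope_percent(distance_m)
    return max(min_incline, min(max_incline, slope))

print(update_treadmill(50.0))   # flat section -> 0.0
print(update_treadmill(150.0))  # uphill section -> 10.0
```

Calling this each frame closes the loop: the harder virtual terrain makes the real-world run harder, which in turn shapes the player's choices in the virtual world.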
  • Our present Game System is designed for dozens of players and to be reset/reused after a relatively short period of time, compared to existing MVWGs. Additionally, it is a goal of our present Game System to offer different experiences to each contestant. Thus our present Game System allows participants to have permanent interaction with the virtual world, limited by good game design. This permanent interaction takes two main forms, taking/consuming virtual objects, and changing the virtual terrain.
  • Our present Game System does not respawn objects as MVWGs do. That is, if there are ten apples on a virtual tree and Player A takes one, when Player B sees the tree there will be nine apples on it.
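The non-respawning apples example can be sketched as shared, persistent world state with no respawn timer. This is a minimal illustration, not the patent's implementation:

```python
# Persistent, non-respawning world objects: once Player A takes an apple,
# every later participant sees one fewer. A minimal shared-state sketch.

class VirtualTree:
    def __init__(self, apples=10):
        self.apples = apples

    def take_apple(self):
        """Remove one apple permanently; no respawn timer exists."""
        if self.apples <= 0:
            return False
        self.apples -= 1
        return True

tree = VirtualTree(apples=10)
tree.take_apple()       # Player A takes one apple
print(tree.apples)      # Player B now sees 9 apples on the tree
```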
  • Written hints are another class of objects that could be taken. Additionally, a contestant may write hints/letters in the real world that are transferred to the virtual world by the actions of his Avatar and left for other contestants to find.
  • Any object a contestant can pick up, she can also put down in a different location. That is, contestants can move objects. Permanently moving objects is not possible in most MVWGs; if moving objects is implemented, the moved object may disappear (despawn) after a pre-selected period.
  • A participant can change the state of objects. For example, if a participant unlocks a door, that door will be in an unlocked state when the next participant encounters it. Similarly, an object such as a chest may be left in an open condition. In contrast to existing MVWGs, our present Game System does not reset the state of objects after a pre-determined amount of time.
  • Our present Game System allows participants to change the virtual terrain.
  • One form of terrain change is terrain marking, such as spray painting.
  • When contestants equip themselves with a spray paint can in the virtual world, they are given a replica can to carry in the real world, as discussed above.
  • This real world spray paint can replica may have buttons on it indicating right arrow, left arrow, up arrow, skull and crossbones, etc. So a participant would “aim” the can as he would a weapon and press a button to indicate what he wants to spray paint in the virtual world.
  • a participant could mark a path out of a “dungeon” or direct a lagging party member which direction to go at a crossroads by actions in the real world that are translated into the virtual world.
  • the use of system resources for terrain marking can be controlled by limiting the means of terrain marking, for example by limiting how many spray paint cans exist in the virtual world and how many “marks” can be made by a particular can and also by the number of different “marks” that can be made.
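The resource limits described above can be sketched as a spray can with a fixed symbol set and a fixed number of charges. The symbol names and charge counts are illustrative assumptions:

```python
# Resource-limited terrain marking: each virtual spray can carries a fixed
# number of "marks" drawn from a fixed symbol set. The symbol set and
# charge counts are illustrative assumptions.

ALLOWED_MARKS = {"left_arrow", "right_arrow", "up_arrow", "skull"}

class SprayCan:
    def __init__(self, charges=20):
        self.charges = charges

    def spray(self, mark, position):
        """Place one mark in the virtual world if charges remain."""
        if mark not in ALLOWED_MARKS or self.charges <= 0:
            return None
        self.charges -= 1
        return (mark, position)

can = SprayCan(charges=2)
print(can.spray("left_arrow", (12, 34)))  # mark placed
can.spray("skull", (12, 40))              # second mark uses the last charge
print(can.spray("up_arrow", (13, 40)))    # out of charges -> None
```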
  • Participants may permanently change the virtual terrain by destructive methods as well, such as by blowing bridges, cutting down trees, breaking windows, etc., with the only constraint being good game play.
  • A virtual small bridge may span a virtual ravine. If there were other ways to bypass the ravine, our present Game System may allow the small bridge to be blown up. However, if crossing that bridge were key to all contestants, then our present Game System would not allow it to be blown up. This can be accomplished by not allowing sufficient virtual explosives into the virtual world, or by denying access to portions of the bridge, such as support piers. As discussed above in terrain marking, all participants will be able to see the results of terrain destruction.
  • blowing open a door may not only help a contestant's team, it may also help his opponent's team.
  • A participant may need to decide between blowing a door open quickly and taking more time to unlock and then re-lock it to thwart his opponents.
  • a participant may blow down a door as a false lead, in an effort to mislead other participants.
  • The use of system resources for terrain destruction can be controlled by limiting the amount of virtual destructive material in the game, such as limiting the amount of virtual explosives and/or the amount of gas for a gas-powered saw.
  • Real Time is defined as meaning that actions participants input for their Avatars occur with as little delay as possible consistent with current technology (less than a few seconds).
  • MVWGs such as Everquest and Dark Age of Camelot.
  • contestants are able to use the same control devices for their gaming session, so any breaks in a gaming session are solely at their discretion.
  • Our present Game System may require participants to physically move between different simulators during a single gaming session, thus there is a period where participants are not able to control their Avatars. Having Avatars enter an “idle” stage is discussed below.
  • Avatars exist in the virtual world, but their participants are not able to control them.
  • A contestant bikes to a virtual river. He then wants to cross the virtual river by swimming.
  • When a participant exits a bicycle simulator, a member of the staff/crew puts the contestant's Avatar into idle mode, and the participant must walk to the swimming simulator, where the staff/crew toggles the participant's Avatar off idle mode.
  • An Artificial Intelligence (AI) could be written for directing a participant's Avatar while he is between simulators, for example if he was attacked by a virtual wild animal. That is not an ideal situation. It would really not be fair to have a participant's Avatar damaged when the contestant was not in control. A more ideal situation would be to design safe zones in the virtual world. That is, an area in the virtual world without non-player characters (NPCs) and hidden from view from other participant's Avatars. For example, the only way to enter a virtual river may be a small depression, with high reeds around it.
  • In existing MVWGs, the objects an Avatar can carry are determined by the number of "spaces" in his bags, the virtual weight of the objects, or both.
  • a key element of our present Game System is a physical challenge. Therefore, the objects an Avatar can carry in the virtual world are determined by what a participant can carry in the real world. The only objects a participant can use in the real world (while playing our present Game System) are objects that his Avatar acquires in the virtual world.
  • Other objects are utility objects, such as radios.
  • a radio may be required to call for a supply drop, or simply to ask questions, download maps, etc. So not only will participants have to carry a replica radio, they would also be required to carry a means of powering a radio and any other virtual electronic gear they intend on using in the virtual world.
  • Good game design is giving participants choices. In this case participants may decide to depend upon batteries or they may opt for hand-powered devices to create power.
  • the goal of our present Game System is to simulate the physical challenge of an epic adventure with none of the dangers. For example, walking and running can be simulated by putting a contestant on a treadmill and varying the pace and elevation. The future is certain to bring new technology to accomplish the same task, such as “Shifty Tiles” invented by University of Tsukuba and ATR Media Information Research Labs http://www.trnmag.com/Stories/2004/081104/Shifty tiles bring walking to VR Brief 081104.html0.
  • Our present Game System is described using current cost-efficient technology, but is not limited by this description.
  • Stair Simulators: The most common type of Stair Simulator is characterized by two independent pedals, such as the StairMaster® Free Climber series. This type of Stair Simulator is less than ideal, however. Most stair risers are eight inches in height, so to simulate natural movement, a participant's throw on a pedal should be consistently eight inches. However, since these types of machines have a free range on the pedals, the throw may vary widely. This motion would look unrealistic when motion captured and mapped to the virtual world.
  • A more desirable alternative is the StairMaster® StepMill 7000.
  • This device is a revolving staircase; therefore it forces a participant into the proper motion for motion capture. It is capable of speeds from 24 to 162 steps per minute, controllable by a user or by an external program through its C.S.A.F.E. interface. No special training is required: just step onto it and climb stairs. Its form factor quality is OK as is, and can be enhanced with some modification. Finally, at a price of $5,000 it is cost efficient.
  • the StepMill 7000 is adequate direct from the factory. However it can be improved for use with our present Game System with some modifications.
  • Our present Game System is designed for reality TV and/or location-based entertainment. For reality TV, part of the broadcast show will be participants on the simulators. In addition, location-based entertainment most likely would be structured so that an audience could watch a game in progress. Therefore the side railings and the Console (Head) are not necessary and obstruct the view of a participant.
  • a participant needs to adjust the speed of the device, as a participant may want to rest, walk or run up the steps in the virtual world.
  • A participant-controllable interface to the unit is required if the console is removed.
  • The current art allows a very small wireless interface device, with a pause button, an up arrow and a down arrow, to be built, which could be strapped to a participant's arm and/or worn on a belt.
  • Real World motion may be mapped on a one to one basis with the Virtual World. That is, one step on the StepMill is one step in the virtual world. Sensors may be placed on top of the revolving steps. These sensors could send information to our current Game System when a participant's foot hits the top of a step. This would aid mapping the participant's Avatar's feet to the virtual stairs he is climbing.
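The one-to-one step mapping with step-top sensors can be sketched as a small event handler: each real foot strike advances the Avatar one virtual stair. The sensor event format is a hypothetical illustration:

```python
# One-to-one step mapping: a sensor on top of each revolving step reports
# a foot strike, and the Avatar advances one virtual stair. The event
# format ("left"/"right") is a hypothetical illustration.

class StairAvatar:
    def __init__(self):
        self.stair_index = 0   # which virtual stair the Avatar is on
        self.last_foot = None  # used to place the matching Avatar foot

    def on_step_sensor(self, foot):
        """Advance one virtual stair per real foot strike."""
        self.stair_index += 1
        self.last_foot = foot

avatar = StairAvatar()
for foot in ["left", "right", "left"]:
    avatar.on_step_sensor(foot)
print(avatar.stair_index)  # -> 3 stairs climbed
```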
  • Treadmills are well known in the art.
  • A typical treadmill is the Life Fitness® 9500. Its main components are a 62 × 18 inch revolving belt, powered by a 4.0 hp motor, with user-adjustable speed from 0.5-12 mph in 0.1 increments and user-adjustable incline from 0-15%. This treadmill also features safety bars and grips. These are undesirable for our present Game System: if a participant were to hold on to the equipment, his body motion would not be realistic for motion capture. Therefore any potential handholds would need to be removed.
  • a treadmill can be improved for use with our present Game System with some additional modifications.
  • the form factor quality of a treadmill can be improved by the removal of the head. This would necessitate building a participant controllable interface, as above, to enable a participant to speed up, slow down, or stop the treadmill.
  • Our present Game System requires the current speed of the treadmill, in order to track a participant's progress through the virtual world. This information could either come from the contestant's interface, or a separate interface between the treadmill and our present Game System.
  • An interface would be required between the treadmill and our present Game System so that the game could vary the angle of the treadmill to match the virtual terrain, raising the angle of the treadmill when going uphill and lowering it when going downhill.
  • A treadmill could be mounted at a negative angle, so its range of motion could be, say, −5% to +10% instead of the standard 0-15%.
  • An improvement would be to mount a treadmill on a 2-degree-of-freedom motion platform. This would allow the treadmill to move front and back more than the 15% standard and also side-to-side, to better simulate the virtual terrain.
  • An additional benefit of this system is that rough terrain can be simulated by introducing small random motion to the motion platform. This system would even allow a small earthquake to be simulated.
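The rough-terrain effect described above can be sketched as bounded random jitter added to the platform's terrain-driven angles. The jitter amplitude and the angle interface are illustrative assumptions:

```python
# Rough-terrain simulation on a 2-degree-of-freedom motion platform:
# small random pitch/roll offsets are added to the terrain-driven angles.
# Amplitudes and the platform interface are illustrative assumptions.

import random

def platform_angles(base_pitch_deg, base_roll_deg, roughness_deg=0.5,
                    rng=random):
    """Return (pitch, roll) with bounded random jitter for rough ground."""
    jitter = lambda: rng.uniform(-roughness_deg, roughness_deg)
    return (base_pitch_deg + jitter(), base_roll_deg + jitter())

# A 5-degree uphill pitch with a little surface roughness:
pitch, roll = platform_angles(5.0, 0.0, roughness_deg=0.5)
print(abs(pitch - 5.0) <= 0.5 and abs(roll) <= 0.5)  # -> True
```

Raising `roughness_deg` briefly would produce the small simulated earthquake mentioned above.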
  • Motion platforms are well known in the art. The major considerations in mounting a treadmill to a motion platform are obvious safety issues which are well known in the art, so are not covered here.
  • Contestant Control of a Walking/Running Simulator: As discussed above, a participant would require a small interface device, capable of being mounted on an arm or hung on a belt, to control the speed of a treadmill.
  • the preferred embodiment of our present Game System is a team game. Therefore a contestant may wish to go the same speed as a team leader. Therefore an additional button to set the treadmill at the same speed as the team leader would be an improvement.
  • the Game System calls for an interface device to allow for such virtual world navigation.
  • a device could be, but is not limited to: A joystick, a hand-held game controller, a head tracking system, or a shoulder angle tracking system.
  • An ideal switch would be unobtrusive for aesthetic reasons. It should be small so as to not interfere with the physical activities required of a participant. Finally, it should be activated in a method as to not interfere with realistic motion capture.
  • A disadvantage of this method is that it partially obscures a participant's face.
  • A switch similar to the Advanced Multimedia Devices, Inc. Flat Mini-Beam Switch (SSW-MB) http://www.amdi.net/sensors.htm is another option. This switch is small enough to attach to the palm of a participant's hand. It is activated by proximity, so it could be positioned such that a small inward motion of the thumb toward the palm would trigger it. In the future, as technology gets smaller and less expensive, it may be possible to use a combination eye tracker and blink detector for motion control, such as made by EyeTracking, Inc. http://www.eyetracking.com/.
  • Bicycle Trainers can be broken down into two broad categories: stationary exercise bikes and bike trainers, which attach a resistance mechanism to a real bike. Stationary bikes fall into two categories, upright and recumbent; Life Cycle® manufactures models of each. Upright stationary bikes position a contestant close to the proper position for realistic motion capture. Both stationary bikes and bike trainers can increase resistance to simulate going up hills. Stationary bikes do not have gears. As proper use of gears is a very important component of bicycling in the real world, a bike trainer is closer to an ideal bike simulator than a stationary bike.
  • Racer Mate® manufactures a bike trainer, the "Pro BASIC CompuTrainer™" http://www.racermateinc.com/compu pro basic.asp, which is a good choice (at the time of this writing).
  • a frame attaches to a bike's rear wheel.
  • the bike's rear wheel is positioned off the floor, seated in a load generator, which can change resistance.
  • the Pro Basic model can simulate up to a 15% incline.
  • the Pro Basic additionally comes equipped with a built-in interface and additional software of use to a game designer.
  • a device such as the Pro Basic comes close to an ideal simulator, as discussed above, as it incorporates a real bike.
  • a bike trainer can be improved for use with our present Game System.
  • a simple improvement would be to mount a fan in front of a participant. The fan speed would be synchronized with the virtual speed of the bike, thus giving a participant feedback and making for a more interesting visual for spectators.
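The fan synchronization can be sketched as a simple capped linear mapping from virtual speed to fan speed. The scaling factor and limits are assumptions, not measured values:

```python
# Feedback fan sketch: fan speed tracks the virtual bike's speed so the
# participant feels (and spectators see) the apparent wind. The linear
# mapping and the RPM limits are illustrative assumptions.

def fan_rpm(virtual_speed_mph, rpm_per_mph=60.0, max_rpm=1800.0):
    """Map virtual bike speed to a capped, non-negative fan speed."""
    return min(max_rpm, max(0.0, virtual_speed_mph * rpm_per_mph))

print(fan_rpm(10.0))  # -> 600.0
print(fan_rpm(40.0))  # capped at the fan's maximum -> 1800.0
```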
  • a bike trainer can be further improved by incorporating a method to brake in the virtual world. Sensors, such as strain sensors can be positioned in a bike's handbrakes. This would enable a participant to use the brakes as they are normally used and send this braking information to the virtual world.
  • Pressure sensors can be put in the handgrips of a bike. The harder a participant squeezes the tighter the virtual bike would turn in that direction. As participants rest some of their weight on the front of the handgrips, the sensors would need to be positioned at the rear, so that they would only be activated by a gripping motion with the fingers.
  • The only disadvantage of this system is that it is not a natural method of steering a bike, and a training period may be required before a participant is comfortable with it.
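The grip-pressure steering can be sketched as turning toward the harder-squeezed side. The normalized pressure range and turn-rate scaling are illustrative assumptions:

```python
# Grip-pressure steering sketch: rear-mounted pressure sensors in each
# handgrip turn the virtual bike toward the harder-squeezed side. The
# sensor range (0..1) and turn-rate scaling are assumptions.

def turn_rate(left_pressure, right_pressure, max_turn_deg_s=30.0):
    """Positive = turn right, in degrees/second. Pressures are 0..1."""
    return (right_pressure - left_pressure) * max_turn_deg_s

print(turn_rate(0.2, 0.2))  # equal grip -> 0.0 (ride straight)
print(turn_rate(0.1, 0.6))  # -> 15.0 (turn right)
```

Resting weight on the front of the handgrips produces equal baseline readings, which is why the sensors sit at the rear where only a deliberate finger squeeze registers.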
  • the bike trainer as discussed above is close to an ideal simulator. Its major deficiency is that it is unable to lean, as a real bike leans, for turning. Thus the bike trainer is unable to give proper body motion for realistic motion capture.
  • Tectrix® http://www.refstar.com/tectrix/products/vr/index.html manufactures a stationary bike called the VR Bike. On this stationary bike, the seat and pedals rotate on a horizontal axis. When a rider initiates a turn, the VR Bike leans into it, thus providing realistic leaning motion for motion capture. If this leaning motion were deemed necessary for a particular version of our current Game System, modifications to this device and the virtual world could be made.
  • our present Game System is not limited to that.
  • One of the goals of our present Game System is to present various physical challenges to participants. If the physical challenge is real, that is, our participants sweat, then what is being simulated does not have to correspond to known devices in the real world.
  • the Tectrix VR Bike can simulate varying terrain. However, as the hand positions do not match a real recumbent bike's hand positions, a fantasy bike can be created for the virtual world that would match the hand positions of the Tectrix VR bike. In effect we have a real world simulator for a fantasy device.
  • the Tectrix VR Bike would require some modification before being used in this role. For form factor reasons the head and the behind-the-seat speakers can be removed. Pressure sensors can be positioned in the handgrips to give a contestant a means to indicate braking. More on Fantasy Simulators below.
  • a lap pool is an open water simulator.
  • the disadvantage of using a lap pool is that turning around at the end of the lap pool does not make for realistic motion capture.
  • Endless Pools® http://www.endlesspools.com manufactures a solution. They bill themselves as a swimmer's treadmill. Their device uses a water jet to create an adjustable current in a pool as small as 8 × 15 feet. A device such as this comes close to our ideal simulator.
  • One disadvantage of using a device such as this is that it is not participant controlled. That is, the participant has to swim at whatever speed has been set into the device to stay in the middle of the pool. A participant could be given an input device worn on the wrist to change the speed of the current. However, while a participant is inputting the change the current may push him to the back wall, which would be undesirable. Another solution is for the participant to shout out the change in speed he wants, e.g. “plus two”, “minus three”, etc., and a member of the crew/staff could input the change for him, or voice recognition software could be used and coded to make the required changes in speed. The disadvantage of these methods is that the result is jerky, unnatural movement when the speed of the current is changed.
  • a sensor is used to determine a swimmer's distance from the front of the pool.
  • a small neutral area is assigned in the middle of the pool. If a swimmer moves out of the neutral zone and towards the front of the pool, the velocity of the water stream is increased until the swimmer is back in the neutral position. Similarly, if the swimmer moves toward the back of the pool, the water stream is reduced. Thus a participant is in precise control of the speed he wants to go, simply by speeding up or slowing down.
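The neutral-zone scheme described above is a simple feedback loop. A minimal sketch follows; the zone bounds and the per-update adjustment gain are hypothetical values chosen only for illustration:

```python
# Illustrative control loop for the swimmer's-treadmill idea: the jet
# speed is nudged each update until the swimmer drifts back into a
# neutral zone. Zone bounds, gain, and units are assumptions.

NEUTRAL_FRONT = 4.0   # meters from front wall where the neutral zone begins
NEUTRAL_BACK = 5.0    # meters from front wall where the neutral zone ends
GAIN = 0.05           # jet-speed change applied per update cycle


def update_jet_speed(jet_speed: float, distance_from_front: float) -> float:
    """One feedback step: strengthen the current when the swimmer
    outswims it, weaken it when the swimmer falls behind."""
    if distance_from_front < NEUTRAL_FRONT:
        return jet_speed + GAIN                # swimmer ahead of the zone
    if distance_from_front > NEUTRAL_BACK:
        return max(0.0, jet_speed - GAIN)      # swimmer behind the zone
    return jet_speed                           # inside the zone: hold steady
```

Repeated over many small update cycles, this avoids the jerky speed changes of the shouted-command approach.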
  • An ideal swimming simulator would have a method to navigate through the virtual world. Unfortunately this is not possible with the current art.
  • the solution is to change the game. That is, if the game calls for swimming in the virtual world the path a participant needs to take in the virtual world is straight, so the simulator and virtual world are in synch. For example, in the game the challenge may be to swim across a one-mile wide virtual river before the opposition, but where on the other side a participant lands is immaterial.
  • the PaddleOne C can be modified to be suitable for our present Game System. This device does not have any electronics, so there is no method for our present Game System to know how much effort a participant is exerting (how fast he is paddling). The amount of work a participant is doing can be determined by placing a strain sensor where the cable attaches to the paddle or by putting a pressure sensor on the paddle, where a contestant's hand rests near the paddle blade. The information from sensors such as these can be translated into the speed of the virtual canoe.
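Under the simplest assumption of a linear relationship, the sensor-to-speed translation might look like the following sketch; the calibration constants are illustrative only:

```python
# Hypothetical mapping from a paddle strain-sensor reading to virtual
# canoe speed. max_strain and top_speed are assumed calibration values.

def paddle_speed(strain: float, max_strain: float = 100.0,
                 top_speed: float = 3.0) -> float:
    """Linearly interpolate a strain reading into canoe speed (m/s),
    clamped to the simulator's calibrated range."""
    s = min(max(strain, 0.0), max_strain)   # clamp noisy readings
    return (s / max_strain) * top_speed
```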
  • Modifying the PaddleOne C as described above would produce an adequate canoe simulator for our present Game System. Producing location-based entertainment or a reality show is as much art as science.
  • the PaddleOne C's strength is that it is cost-efficient and easy for a participant to hop on it and start paddling. If canoeing is a small part of the story/challenge of the game being staged, the PaddleOne might be the preferred embodiment. However, if canoeing is a large part of the story/challenge another solution may be preferred.
  • One simulator tethers a canoe in a body of water such as a pool.
  • the other simulator creates a current in a pool for a free-floating canoe to row against, similar to Endless Pool discussed above.
  • the tether solution is considerably less expensive than the free-floating one.
  • the free-floating solution is more aesthetically appealing: it offers much better visuals and gives the contestants more of a challenge, as a free-floating canoe is much harder to control than a tethered one.
  • Both the tethered and free-floating canoe simulators use a real canoe.
  • This canoe can be a multi-person canoe. This can lead to additional challenges for participants; for example, a single-person canoe is easier to control than one where multiple people are rowing.
  • Using a multi-person canoe offers participants more choices. For example, not all participants need to be rowing at the same time, that is, in a four man canoe, two participants may rest.
  • participants must carry objects in the real world that they need to survive, such as food and water, and they must also carry objects that are the physical manifestation of objects they will need in game, think “Staff of Ra” in “Raiders of the Lost Ark”.
  • a fully loaded canoe is harder to paddle than an empty one, which gives the participants more choices as they prepare for the canoe portion of their story/challenge.
  • a real canoe is modified by having bungee cords mounted port stern, port bow, starboard stern, and starboard bow. Participants load and enter the canoe on the side of a pool. Staff/crew then position the canoe in the center of the pool and connect the bungee cords to the appropriate corners of the pool. Strain sensors are positioned on each of the bungee cords. These sensors give our present Game System enough information to determine the speed and direction of the virtual canoe. That is, the more tension on the stern bungee cords, the faster the virtual canoe is going; the more tension on the port bow bungee, the more acutely the virtual canoe is turning to starboard.
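The four strain readings could be reduced to the virtual canoe's speed and turn rate along the lines of the following sketch, which follows the relationships stated above; the scale factors and units are assumptions for illustration:

```python
# Hypothetical reduction of four bungee strain readings to a virtual
# canoe state. SPEED_SCALE and TURN_SCALE are assumed calibration values.

def canoe_state(tension: dict):
    """tension: readings keyed 'port_bow', 'stbd_bow', 'port_stern',
    'stbd_stern'. Returns (speed, turn_rate): speed from the net
    stern-vs-bow tension, turn rate from the bow imbalance."""
    SPEED_SCALE = 0.02   # m/s per unit of net tension (assumed)
    TURN_SCALE = 0.5     # deg/s per unit of bow imbalance (assumed)

    stern = tension['port_stern'] + tension['stbd_stern']
    bow = tension['port_bow'] + tension['stbd_bow']
    speed = max(0.0, (stern - bow) * SPEED_SCALE)
    # more tension on the port bow cord -> sharper turn to starboard
    turn_rate = (tension['port_bow'] - tension['stbd_bow']) * TURN_SCALE
    return speed, turn_rate
```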
  • a free-floating canoe simulator is described in more detail in our USPTO Provisional Patent application entitled An Improved Artificial Water Current Control Device.
  • this canoe simulator tracks where the front of the canoe is relative to the front and sides of the pool.
  • This canoe simulator has three water jets, one positioned in front of the canoe, one to port and one to starboard. As the front of the canoe leaves a neutral area, the velocity of the different jets is modified to keep the canoe in the center of the pool. The velocity of the different jets is enough information for our present Game System to determine the speed and direction of the virtual canoe.
  • Simulating water current would further challenge participants. Simulating a current either straight ahead or straight behind is easy: the velocity a participant maintains on the simulator is calculated, and the velocity of the virtual current is added for a current from behind or subtracted for a current from ahead.
  • the PaddleOne and the Tethered Canoe simulators would simulate sideways water current by changing their track in the virtual world. That is, the vector obtained from the simulator would be modified by the vector of a virtual current, and this modified vector would be used to track the canoe in the virtual world.
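The vector modification described above is plain vector addition, sketched here in two dimensions for illustration:

```python
# The track modification described above: the velocity vector obtained
# from the simulator is combined with a virtual current vector, and the
# resulting vector moves the canoe in the virtual world.

def apply_current(paddle_vx: float, paddle_vy: float,
                  current_vx: float, current_vy: float):
    """Return the canoe's track in the virtual world as the vector sum
    of the paddler's velocity and the virtual water current."""
    return paddle_vx + current_vx, paddle_vy + current_vy
```

A sideways current simply contributes a perpendicular component, pushing the virtual canoe off the paddler's heading.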
  • the free-floating canoe simulator is superior as the intensity of the side jets can be changed to simulate the virtual current, thus providing contestants with more feedback and challenge.
  • Our present Game System uses simulators to add a physical challenge to the Game System, and the physical component is also a necessary component for a Fantasy Simulator. As long as a physical component remains, the types of possible simulators are limited only by the human imagination. One skilled in the art, in light of the teachings above, would understand that there are no technical challenges to constructing the Fantasy Simulators discussed below. Our present Game System is illustrated, but not limited, by a discussion of just a few Fantasy Simulators.
  • Human-Powered Sub Human-powered subs do exist http://www.enme.umd.edu/terpedo/ but most participants and audience members would have no knowledge of them. Additionally, real human-powered subs are ugly and awkward to use. For our present Game System we can take a real vehicle and modify it, in other words turn it into a Fantasy Vehicle. The virtual sub would need to be more aesthetically pleasing; thus the virtual sub would be improved with a glass canopy and a pressurized cockpit. With those changes in mind, the Tectrix VR Bike discussed above would put a participant in the proper position for “realistic” motion capture of a human-powered sub. A means for navigating the virtual sub up or down in the water column would have to be added. This could be as simple as a button on the right hand rest for up and a button on the left hand rest for down.
  • Inflatable Plane Like the human-powered sub addressed above, inflatable planes are real. The Goodyear Inflatoplane, GA-468 was designed to be dropped behind lines for downed pilots to use to be rescued. Again we can take the concept of a real vehicle and modify it for our present Game System. A human-powered inflatable plane capable of being stored and carried in a backpack could enhance storylines and add a unique challenge to contestants. As with the sub above, the Tectrix VR Bike can be used as the real world simulator. An additional challenge for participants would be to manufacture a real world mock-up of the virtual inflatable plane, and have a contestant blow up and assemble the plane in the real world before being able to use it in the virtual world.
  • Multi-Person Powered Dirigible One iteration is a virtual dirigible with a large open passenger compartment slung underneath it. It has four backward facing hand-powered propellers in the stern of the passenger compartment and in the bow a large spoked wheel for right/left travel and a large lever for up/down travel. For this vehicle, the spoked wheel and lever would need to be constructed in the real world. A simple way of presenting this simulator would be to have four stationary upright bikes in a row with a “captain” navigating through the virtual world by use of the wheel and lever.
  • a more complex simulator can be built.
  • the open passenger compartment can be replicated in the real world. Additionally, it could be mounted on a motion platform, if the dirigible was an important element of the story/challenge.
  • the passenger compartment could have room for additional people, so as an additional challenge, the participants could load up the dirigible with people and supplies and take turns powering it with the stationary bikes.
  • An additional improvement would be to mount simulated cannons on both sides of the passenger compartment. Correctly firing these cannons would fire cannons in the virtual world.
  • the simulators discussed above are basically fixed in orientation. That allows an image of the virtual world to be displayed in front of the participants. Having the virtual world displayed in front of a participant allows him to navigate the virtual world. There are some challenges that require navigation clues/challenges in the real world that map to the virtual world.
  • Rooftop Jumping Challenge The goal in building a set for a rooftop jumping challenge would be similar to that described above for “The Word of God”. That is, to provide a setting where a participant is forced to use proper body motion for realistic motion capture, and to provide the participant visual cues in the real world. Additionally, sensors must give our present Game System information about a contestant's location so he can be mapped in the virtual world. As in the challenge above, a grid of pressure sensors on the floor, covered by a canvas painted with rooftops, would meet the requirements. An improvement would be to build the low walls that are typical on flat-roofed buildings. Additionally, building slanted roofs would force contestants to use proper body motion for slanted roofs in the virtual world. To provide for proper motion capture of a participant missing a jump, the roofs can be built over a pool or over a heavy mat.
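A grid of floor pressure sensors can be mapped to virtual world coordinates along the lines of the following sketch; the grid pitch and scale factor are assumed values, not part of the disclosure:

```python
# Illustrative mapping from an active cell in a floor pressure-sensor
# grid to a position in the virtual rooftop scene. GRID_PITCH and the
# scale parameter are assumptions.

GRID_PITCH = 0.5  # meters between adjacent sensors on the set (assumed)


def grid_to_virtual(row: int, col: int, scale: float = 1.0):
    """Map the sensor cell pressed by a contestant's feet to an (x, y)
    position in the virtual world, scaled from set to scene units."""
    return col * GRID_PITCH * scale, row * GRID_PITCH * scale
```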
  • We define a Role Playing Game (RPG) as one in which a participant is represented in a virtual world by an Avatar, where the Avatar's ability to interact with the virtual world is determined by game mechanics rather than by the participant's real world abilities.
  • RPGs such as Everquest and World of Warcraft
  • movement in the Virtual World is determined by game mechanics, such as what objects an Avatar is carrying in the virtual world, what spells are on an Avatar, or what magic gear an Avatar is wearing.
  • movement in our present Game System is dependent on the physical level of effort a participant is willing to make.
  • the ability of a participant to overcome an obstacle or solve a puzzle is not dependent on the abilities of their computer Avatar, but rather on their own physical abilities and “smarts.”
  • Games such as Quake and Halo are RPGs for the purpose of this discussion because the ability to aim and fire a weapon is determined by mouse clicking on where a participant wishes to shoot, with game mechanics determining whether the shot hits or not.
  • Our present Game System may use replica weapons such as the ones manufactured by FATS, Inc. http://www.fatsinc.com/about/news/pr65.cfm “FATS virtual weapons resemble the fit and function of live weapons to include recoil.
  • FATS virtual training provides accurate, real-time diagnostics including point-of-aim, weapon status, trigger pressure and cant.” This makes weapon firing much more challenging.
  • In games such as Halo and Quake, reloading a weapon is done with one keystroke. In our present Game System, a participant must reload a weapon with a new magazine by hand. No Role Playing makes our present Game System much more challenging than RPGs.
  • a bow may be modified by adding a sensor to detect how much force is used to pull back the string. Additional sensor/s would determine the x, y, z orientation of the bow. One skilled in the art would understand this gives our present Game System enough information to make a determination of where the arrow would land in the virtual world.
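As a hedged illustration of such a determination, a simple flat-ground ballistic model could turn draw force and bow elevation into an estimated landing distance. The force-to-speed constant is an assumption, and a production system would use a fuller trajectory model accounting for drag and the full x, y, z orientation:

```python
# Minimal ballistic sketch: draw force and bow elevation become a
# launch velocity, and the standard level-ground range formula
# estimates where the arrow lands. All constants are illustrative.

import math

G = 9.81  # gravitational acceleration, m/s^2


def arrow_range(draw_force: float, elevation_deg: float,
                speed_per_newton: float = 0.6) -> float:
    """Estimate horizontal distance (m) to impact on level ground."""
    v = draw_force * speed_per_newton          # launch speed from draw force
    theta = math.radians(elevation_deg)        # bow elevation from sensors
    return (v ** 2) * math.sin(2 * theta) / G  # level-ground range formula
```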
  • a regular arrow could be modified with a large foam tip, for safety.
  • a throwing spear could be modified by adding a sensor/s to determine the orientation of the spear and another sensor to determine velocity when thrown.
  • a simulated whip would consist of just the whip handle with a velocity sensor and sensor/s for x, y, z orientation.
  • a participant would quickly move the handle in a whip like fashion to generate velocity and end the motion with the whip handle aimed at a target.
  • Our present Game System would determine if the velocity was high enough and the aim was close enough to score a “hit”.
  • the whip handle may include a button/s. In the movies Indiana Jones was able to grab objects with his whip. A button may be pressed by a participant to indicate an attempt to grab an object rather than hit it.
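One plausible hit test combining the whip handle's velocity and orientation sensors with the grab button is sketched below; the velocity threshold and aim tolerance are hypothetical values:

```python
# Hypothetical whip-handle evaluation: a swing scores if the handle
# velocity exceeded a threshold and the final aim is within an angular
# tolerance of the target; a held button turns a "hit" into a "grab".
# Both thresholds are assumptions.

MIN_VELOCITY = 4.0    # m/s required for a valid crack (assumed)
MAX_AIM_ERROR = 5.0   # degrees off-target still counted (assumed)


def whip_action(velocity: float, aim_error_deg: float,
                grab_button: bool) -> str:
    """Return 'miss', 'hit', or 'grab' for one whip motion."""
    if velocity < MIN_VELOCITY or abs(aim_error_deg) > MAX_AIM_ERROR:
        return 'miss'
    return 'grab' if grab_button else 'hit'
```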
  • Our current Game System motion captures the participants.
  • One skilled in the art would appreciate that if our Game System knew the physical qualities of an item, such as a hand-held weapon, it would be able to position that weapon in the virtual world.
  • a staff/crew member could input to our present Game System what a participant is holding, and with what hand.
  • sensors such as RFID could be placed on items, such as hand-held weapons and on a participant's gloves, as a method to input to our present Game System what a contestant is holding.
  • a velocity sensor may be added to hand-held weapons.
  • NPCs (Non-Player Characters)
  • In RPGs such as Everquest and World of Warcraft, how NPCs react to Avatars is determined by game mechanics, such as how high an Avatar's standing is with the NPC's faction.
  • Our present Game System may make use of this known system in the interest of being cost effective.
  • NPC Avatars would be controlled by actors, staff/crew members, audience members or even celebrities. These individuals would be classically role-playing as defined by Wikipedia: “In role-playing, participants adopt characters, or parts, that have personalities, motivations, and backgrounds different from their own.” Acting is role playing. That is, they would act a part.
  • the Game System presents the ability to place participants in fantastic story lines as never before. Average people can venture to places never imagined, and interact with a virtual environment that fights back. An accountant who just yesterday was sitting behind a desk can now be dangled above a deep chasm on a swaying rope bridge. How she responds to this stress and these stimuli will be different from how a true adventurer might respond.
  • Good game play is the same as good story telling. Good story telling involves choices. Does a participant choose to take the short route over the mountain, or the safer route around the mountain? Does a participant jump into the river, or take a detour to go over a bridge? Does a participant go off the main path to get food, or push forward and go hungry?
  • Our present Game System has a fixed start point and a fixed location contestants must locate and travel to in order to complete the game. Additionally, contestants must complete their journey in a fixed amount of time. If there was a two-lane highway between the start and end locations there would not be much of a story. The challenge in creating a good story is to place physical obstacles and mental obstacles (puzzles) in the path of participants and to give them choices on how to solve and/or go around these obstacles.
  • Our present Game System is suitable for three general categories of games: “reality” show; co-operative; and competitive.
  • An ideal iteration of our Game System would involve a video record of a contestant's participation. Therefore attention must be given to not only what makes a good game for a participant, but what will make an interesting story for an audience.
  • “Reality” shows are known for interpersonal cooperation and competition as well as team competition.
  • Our Game System is suitable for Location Based Entertainment (LBE).
  • An LBE Game could be a cooperative game, with all contestants helping each other towards a common goal. This type of activity is very common for Corporate Team Building.
  • An LBE Game could be competitive, either in teams or individually. Paint Ball features competitive teams.
  • a “reality” show game would take weeks to complete.
  • a Corporate Team Building game would take a weekend to complete.
  • a competitive game would take at least half a day to complete.
  • Episodes 1-3 Qualifications/Meet the Players. Potential contestants gather in regional centers to try and qualify for one of the six teams. Each contestant is put through a rigorous battery of physical, mental, and virtual challenges and rated by a panel of judges (made up of an ex-pro sports star, a psychiatrist, and a noted archeologist). Physical prowess is important, but so too are quick wits, and the ability to work within a team structure, while still maintaining one's individuality. Contestants must be able to adapt to the unique interfaces between the real world and the virtual world they will inhabit in the show. In each of the first three episodes, two of the regional competitions are highlighted, with a special emphasis on introducing the ultimate players, as well as the show concept and the unique technology they will be using.
  • Episode 7 “The Secrets of Chol.” In Episode 7 all six teams race around Chol to find its secrets as quickly as they can. There is full interaction of all 36 players, and no team can win on its own, as each will have accumulated a necessary clue. Players are not told this, however, and undoubtedly one or more of the teams will believe they can win by physically beating the others. Quickly they will find, however, that if a team is defeated, its clue pieces magically disappear. The teams that recognize the quickest that the only enemy is the city will score the most points. Based on the overall scores from the three episodes, two of the teams are sacrificed to the gods (sent home), leaving the top four teams.
  • Episodes 8-11 “Head To Head in the Jungle.” The secrets of Chol now revealed, the next four episodes pit teams head to head based on their rankings to date as they make their way into the jungle (e.g., Episode 8 pits 1 vs. 4, Episode 9 pits 2 vs. 3, Episode 10 pits 1 vs. 3 and Episode 11 pits 2 vs. 4).
  • the physical requirements are increased, as much climbing, rappelling, and walking across narrow rope bridges is required.
  • the stakes for both the teams and the audience are raised, as the physical demands and increasingly powerful NPCs begin to eliminate players from each team. Teams with the week off aren't idle, however, as they are interjected into the world as NPCs to battle against the competing teams, and to steal whatever knowledge they can. Based on the scores after these episodes, two more teams are eliminated.
  • Episode 12 “The Winter Solstice.” The remaining two teams are pitted against each other in one final adventure to see who will solve the riddle of the Maya, and collect the rare green diamonds. This episode ties together the entire quest story, and forces the teams to put all of their skills and knowledge to use. A climactic scene forces the teams to win, or die, on the ball fields of Chichen Itza. It's winner take all, or is it? There may still be one more challenge.
  • Episode 13 “All Star Challenge.” The top 8 rated players, and two voted on by the audience and invited back, form an all star team that must solve the last riddle of the Maya, with the future of the world hanging in the balance.
  • the Springfield Snakes reach the ruins first.
  • the Snakes indicate to the crew that they wish to dismount from the bikes.
  • the Snakes' Avatars go into an idle animation for two minutes, the pre-determined time allotted for changing from the bicycle simulators to dismounted movement.
  • the Snake participants gather up their gear, which was stored on their bicycle racks, and follow the crew to a room with a mat with a grid of sensors underneath it.
  • the virtual world is displayed in front of them on a large screen. After two minutes from their bicycle simulator dismount, their Avatars are taken out of idle animation and the Snakes are once again controlling their Avatars.
  • the Rockville Raptors dismount when they are far enough away from the Snakes to do it without fear of being shot. Once dismounted and formed up they head back toward the ruins by walking on their treadmills. Meanwhile Jim decides the Snakes are not going to risk a fight, as he feels the Raptors are better at long range weapons than his team. Jim signals to the Snakes to cross the river in one of the two canoes at the ruins and to destroy the other canoe, so the Raptors cannot easily follow.
  • While the Raptors are walking down the road in the virtual world, in the real world they come to the end of the sound stage. Their Avatars are put into idle mode for 30 seconds as the Raptors get themselves reset. Meanwhile, the Snakes' demolition guy, Mark, takes out a replica C4 explosive, arms the timer for ten seconds and drops it on the floor where he sees the second canoe in the virtual world. The Snakes then indicate that they want to board the first canoe. The Snakes' Avatars are put into idle mode for six minutes, as the canoe simulator is physically far away.
  • the Raptors continue walking down the road to the ruins. They see the Snakes getting into one canoe and shortly after see the other canoe blow up. Since the Snakes now have a lead on the Raptors, Steve decides to examine the ruins. The Raptors find many glyphs. In one set of glyphs they recognize the name of the city they are trying to find, Chol. The Raptors work on deciphering the glyphs. They finally get it and discover the glyphs describe a “short cut” to the lost city. The Raptors find the small trail the glyphs indicated and head on out, hopeful that this short-cut will enable them to beat the Snakes to Chol.

Abstract

The present novel Game System is directed toward inserting a player into a virtual world, where the player's real world actions, articulated movements, activities and skills determine what actions and activities his Avatar in the virtual world does. Conversely, experiences of the Avatar in the Virtual World are translated back into the real world as feedback. The critical observation is: the contestants undergo the physical challenge of an epic adventure with none of the dangers. For example, the present Game System is able to simulate the physical challenge of a jungle adventure without biting insects, poisonous snakes, leeches, anacondas and piranhas, not to mention the enormous expense a real jungle adventure would entail. Our current invention teaches a Multi-player Virtual World Game suitable for reality TV and/or location based entertainment. The present Game System is a unique combination of elements consisting of: 1. Motion Capturing of contestants, 2. Two-way interaction with the virtual world, 3. In Real Time, and requiring, 4. Interacting with objects existing in both the real world and virtual world, all requiring, 5. High Physical Levels of Effort of contestants based on the status of the virtual world environment, 6. Such that there is No Role Playing, and such that the Game System produces, 7. A unique way of Story Telling.

Description

    RELATED APPLICATION
  • This application claims priority under 35 U.S.C. § 119 based on U.S. Provisional Application No. 60/710,123, filed Aug. 22, 2005, the disclosure of which is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • A. Field of the Invention
  • The present invention relates to multi-player Virtual World Games and, more specifically, to systems and methods for allowing two-way interaction in multi-player virtual world games, suitable for broadcast “reality” entertainment and/or location-based entertainment.
  • B. Description of Related Art
  • Multi-Player Virtual World games (MVWG) are well known in the entertainment art and include such online computer games as Everquest, Dark Age of Camelot and World of Warcraft. In general, a player uses a home computer as an interface to these games. Our current invention, in general, teaches new systems and methods to interface with a MVWG, and new MVWG based on these new systems and methods. Our current invention is concerned with the following broad areas: Role-Playing, Number of Participants, Interaction, Motion Capture, Real-Time, Expression Mapping, Objects, Number of Players, Story Telling and Physical Level of Effort.
  • Our current invention spans the art of Computer Generated Images (CGI) in Movies/Advertising, Multi-Player Virtual World Games (MVWG) and Arcade Virtual Reality (VR) Games. For ease of discussion, we will in general limit our discussion to the following examples: World of Warcraft (WoW) for discussing MVWGs and VR Soccer http://www.highwaygames.com/body.php3?action=mach_view&machine_id=945 for discussing Arcade VR Games. The following definitions are helpful in describing the invention:
  • Role Playing
  • Wikipedia: “In role-playing, participants adopt characters, or parts, that have personalities, motivations, and backgrounds different from their own.” Acting is role-playing. For the purposes of this discussion we will define a Role Playing Game (RPG) as one in which a player is represented in a virtual world by an Avatar (a computer representation of a game character), where the Avatar's ability to interact with the virtual world is determined predominantly by game mechanics rather than by the player's real world abilities. Thus World of Warcraft (WoW) is an RPG.
  • First-Person Shooters such as Quake and Halo, although not traditionally thought of as RPGs, are considered RPGs for the purposes of our discussion. The ability to aim and fire a weapon in the real world requires a skill set including holding a weapon steady, aiming, breathing techniques during shooting, and reaction to recoil, among others. The skill set for firing a weapon in Quake and Halo includes merely moving a mouse and clicking. Since Avatars in Quake and Halo interact with the virtual world through game mechanics rather than a player's real world abilities, these are role-playing games.
  • In VR Soccer a player makes contact with a real soccer ball. The game gathers information about the movement of the ball, and this movement is used to move the ball in the virtual world. As the ball is moved by the player's real world skills, VR Soccer would be considered a non-role-playing game. However, the interactive action in VR Soccer is limited to kicking the ball; the player neither runs nor otherwise controls his/her position within the videogame.
  • Physical Level of Effort
  • Physical Level of Effort refers to activity in the Real World required to produce a result in the Virtual World. For example, to move forward in World of Warcraft (WoW) a player just needs to press the up arrow and his character in game will move forward indefinitely. Therefore the physical Level of Effort to play WoW is near zero. To play soccer in real life requires running up and down the field for 90 minutes, the physical level of effort to play is therefore “high”. To play VR Soccer a player takes 3-4 steps and kicks a soccer ball into the screen. This kicking is repeated a few times during the few minutes of play of the game. Therefore the physical Level of Effort to play VR Soccer is “low.”
  • Objects
  • Objects for this discussion refer to objects a player may manipulate that exist simultaneously in both the real and virtual worlds. For example, in VR Soccer a player kicks a real ball into a screen; after the game detects the ball motion, a virtual ball moves in the virtual world. U.S. Pat. No. 6,162,123 teaches the use of an object, such as a sword, as an input device for an interactive game. Current art uses objects in arcade games/simulations solely as alternative game/simulation input devices.
  • Motion Capture
  • Motion Capture is well known in the art. It is the process of translating a live performance into a digital performance. U.S. Pat. No. 6,831,603 has an in-depth discussion of the current state of the art of motion capture.
  • Expression Mapping
  • Expression Mapping is well known in the art. It is the process of adding facial animation to virtual beings. U.S. Pat. No. 6,735,566 is one of many methods for expression mapping. World of Warcraft (WoW) has short emotive animations a player may activate for his virtual character. Examples include: cry, cheer, sleep, and yawn. Of note, the virtual character's expression remains unchanged; the emotion is created by virtual body language and sound. Additionally, a player may misrepresent his current expression with this system.
  • Real Time
  • An action in the real world, such as clicking a mouse in WoW or hitting a soccer ball in VR Soccer may have an effect in the virtual world. The length of time it takes for the real world action to have an effect in the virtual world is referred to as lag. The less lag, the closer to real time a game is. Games such as VR Soccer are not real time, as a player may take 3-4 steps forward before kicking a ball, and then the ball has a certain amount of travel time before being recognized by the game and a virtual ball moved. A few seconds may elapse from when a player initiates movement and when a change is made in the virtual world.
  • Interaction
  • Interaction refers to the ability to make changes to the Virtual World. In WoW a player may take the action of collecting a virtual herb. This virtual herb is subsequently not available to another player. In games such as VR Soccer the Virtual World is unchanging, the only change being a player's score. Our USPTO Provisional Patent application entitled “An Improved Massively-Multiplayer On-Line Game” teaches the use of unique arcade-style input devices for Massively-Multiplayer On-Line Games.
  • Number of Participants
  • This refers to how many players can play the game at any one time.
  • Story Telling
  • A movie tells a story. A player of WoW creates a story for his virtual character, of virtual victories and defeats, good deeds and bad. A player of VR Soccer does not create a story, he earns a score, which at most can be compared with prior static scores stored in a particular machine.
  • Background Summary
  • Please refer to FIG. 1, which presents an overview of current art in the broad areas our current invention is concerned with.
  • SUMMARY OF THE INVENTION
  • The present invention is directed toward a multi-player virtual world game system (the “Game System”) and methods that comprise a new and unique entertainment experience. The system and methods comprise a novel combination of new and previously known elements, presented in FIG. 2, which together form a unique entertainment experience suitable for “reality TV” and/or location-based entertainment.
  • More particularly, the present novel Game System is directed toward inserting a player into a virtual world, where the player's real world actions, articulated movements, activities and skills determine what actions and activities his Avatar in the virtual world performs. Conversely, experiences of the Avatar in the Virtual World are translated back into the real world as feedback. The critical observation is: the contestants undergo the physical challenge of an epic adventure with none of the dangers. For example, the present Game System is able to simulate the physical challenge of a jungle adventure without biting insects, poisonous snakes, leeches, anacondas and piranhas, not to mention the enormous expense a real jungle adventure would entail.
  • Our current invention teaches a Multi-player Virtual World Game suitable for reality TV and/or location based entertainment. The present Game System is a unique combination of elements consisting of:
  • Motion Capturing of contestants,
  • Two-way interaction with the virtual world,
  • In Real Time, and requiring
  • Interacting with objects existing in both the real world and virtual world, all requiring
  • High Physical Levels of Effort of contestants based on the status of the virtual world environment,
  • Such that there is No Role Playing, and such that the Game System produces
  • A unique way of Story Telling.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate the invention and, together with the description, explain the invention. In the drawings,
  • FIG. 1 is an overview of the current art.
  • FIG. 2 is an overview of the current art and our current invention.
  • DETAILED DESCRIPTION
  • The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention.
  • A multi-player non-role-playing virtual world game is described. Our present Game System is a novel combination of new and previously known elements, which together comprise a unique entertainment experience. (See FIG. 2.) Many of these elements are well known in the art, so they are described only insofar as our present Game System modifies the art. In addition, the benefits of using one element over a superficially similar element will be discussed.
  • For the sake of clarity and not limitation, our current invention will be described with a game called Quest! Mayan Jade™. In the discussion below the use of “Quest! Mayan Jade” means the preferred embodiment of our current invention. The discussion below will teach each aspect of our current invention, first in generalities, and then, more specifically as illustrated by “Quest! Mayan Jade.”
  • The defining characteristic of our present Game System is: a participant undergoes the physical challenges of an epic adventure with none of the dangers.
  • Our current invention teaches a Multi-player Virtual World Game suitable for reality TV and/or location based entertainment. The present Game System is a unique combination of elements consisting of:
  • Motion Capturing of contestants, allowing for
  • Two-way interaction with the virtual world,
  • In Real Time, and requiring
  • Interacting with objects existing in both the real world and virtual world, all requiring
  • High Physical Levels of Effort of contestants based on the status of the virtual world environment,
  • Such that there is No Role Playing, and such that the Game System produces
  • A unique way of Story Telling.
  • 1. Motion Capture
  • Motion Capture is well known in the art. It is also a fast evolving field. Our present Game System is not concerned with any specific motion capture system. One example of the many systems that are available is MotionAnalysis http://www.motionanalysis.com/about mac/gollum.html.
  • The motion capture systems allow a computer to determine the real world position of motion capture sensors. In order to make use of the data in a virtual world, there has to be some form of mapping between a motion capture system in the real world and an Avatar in the virtual world. The simplest case would be a single sensor on a participant. The participant may be in a 20′ by 20′ room. If there were a similarly square room in the virtual world, the single sensor could be used to tell the computer where in the room the Avatar should be placed in the virtual world, and at what height. If the participant paced around the room, the Avatar could be made to pace around the room at the same rate. For this case, there would not be much more information available, but it does illustrate a simple mapping between the real world and a virtual world.
  • In a more complex system, one that is closer to our preferred embodiment, multiple sensors would be used on a participant. With the single-sensor method, while the position of a participant in the room was known, there was no information available about the participant's body position, or the way that he moved his limbs as he moved around the room.
  • In a complex motion capture setup, sensors may be placed at either side of every major joint on a participant's arms and legs. This allows even a participant's head position and head tilt angle to be measured accurately. The motion capture system then has very detailed information about the exact manner in which a participant's musculoskeletal system is moving at any given time. This data may be fed into the virtual world for a fairly direct mapping from the real world to the virtual world. That is, a participant's right leg bone, which is connected to the thighbone, just moved to an angle of 7.5 degrees, and the thighbone, which is connected to the hipbone, just moved to 2.4 degrees. This information is used by the virtual world to move the participant Avatar's bones in a similar manner. Thus the motion of an Avatar may take on a very realistic lifelike look and feel.
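  • The direct skeletal mapping described above can be sketched as follows. This is a minimal illustration, assuming a hypothetical capture frame of named joint angles; the joint names and the Avatar class are illustrative assumptions, not tied to any specific motion capture product.

```python
class Avatar:
    """Holds the virtual skeleton's joint angles, in degrees."""
    def __init__(self):
        self.joint_angles = {}

    def apply_pose(self, captured_angles):
        # Direct one-to-one mapping: each captured real-world joint angle
        # drives the corresponding virtual joint.
        for joint, angle in captured_angles.items():
            self.joint_angles[joint] = angle

# Example frame from a (hypothetical) motion capture system, using the
# angles from the text: the leg bone at 7.5 degrees, the thighbone at 2.4.
frame = {"right_knee": 7.5, "right_hip": 2.4, "head_tilt": -3.0}

avatar = Avatar()
avatar.apply_pose(frame)
print(avatar.joint_angles["right_knee"])  # 7.5
```

Feeding each frame through `apply_pose` as it arrives is what gives the Avatar the lifelike motion described above.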
  • 2. Two-Way Interaction with the Virtual World.
  • Interacting with a virtual world is well known in the art, especially in multi-player virtual world games (MVWG) such as Everquest and Dark Age of Camelot. In existing MVWGs, the interaction is characterized as one-way. A human interacts with a computer, sending commands for the player's Avatar to do things (e.g. pick up an item). The virtual environment then responds and reacts to these actions (e.g., the item appears in the “inventory” of the Avatar). This interaction is all within the virtual world, and not within the real world of the player. Although devices have been used to provide feedback from the virtual world to the player (e.g., force-feedback joysticks, such as the Microsoft Sidewinder Force Feedback Pro), this tactile feedback cannot be characterized as true two-way interaction.
  • In the Game System, when an Avatar picks up an item in the virtual world and it is placed in the Avatar's inventory, a representation of that item is also introduced into the real world, and the human is required to carry that representation, including its mass, in the real world.
  • As another example meant to explain but not limit the invention, in the Game System, if a human player begins to run on her treadmill, her Avatar will begin running in the virtual world. If the path of the Avatar's run is uphill in the virtual world, this virtual world environment change is fed back through the Game System to the treadmill, whereupon either the angle of the treadmill or the tension of the belt is changed to require the real world human to work harder to “run” uphill in the virtual world. If the exertion is too much for the player, she will stop, turn, or otherwise avoid continuing on that path in the virtual world.
  • In this way, there is true two-way interaction between the human and the virtual world—what the human does causes a change in the virtual world, but changes in the virtual world cause changes to the real world environment in which the player interacts.
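  • The two-way loop can be sketched in code. This is a hedged illustration, assuming simple Treadmill and VirtualWorld objects and a 0-15% incline range (the standard range of a stock treadmill, per the discussion of treadmills below); none of these names come from a real device API.

```python
class Treadmill:
    def __init__(self):
        self.speed_mph = 0.0
        self.incline_pct = 0.0

class VirtualWorld:
    def __init__(self, slope_pct):
        self.slope_pct = slope_pct     # slope of the Avatar's current path
        self.avatar_speed_mph = 0.0

def feedback_step(treadmill, world):
    # Real -> virtual: the Avatar runs at the participant's treadmill speed.
    world.avatar_speed_mph = treadmill.speed_mph
    # Virtual -> real: the treadmill incline is set to match the virtual
    # slope, clamped to the hardware's physical range.
    treadmill.incline_pct = max(0.0, min(15.0, world.slope_pct))

tm = Treadmill()
tm.speed_mph = 6.0                      # participant runs at 6 mph
world = VirtualWorld(slope_pct=8.0)     # Avatar's path climbs an 8% grade
feedback_step(tm, world)
print(world.avatar_speed_mph, tm.incline_pct)  # 6.0 8.0
```

Running `feedback_step` every frame closes the loop: the participant's effort moves the Avatar, and the virtual terrain pushes back on the participant.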
  • This two-way interaction also has an impact on the virtual world that does not currently exist in the art. It is the goal of current MVWGs to offer the same experience to all their players. So if Player A finds a treasure chest, that chest will at some point reappear (respawn) to give Player B a chance to find the same chest. If Player B knocks down a door, after a set period the door will repair itself, to offer the same challenge to Player A. In summary, existing art has a high component of interactivity with the virtual world, but with a few extremely rare exceptions, there is no permanent interaction with the virtual world.
  • Permanent Interaction with the Virtual World. Our present Game System is designed for dozens of players and to be reset/reused after a relatively short period of time, compared to existing MVWGs. Additionally, it is a goal of our present Game System to offer different experiences to each contestant. Thus our present Game System allows participants to have permanent interaction with the virtual world, limited by good game design. This permanent interaction takes two main forms, taking/consuming virtual objects, and changing the virtual terrain.
  • Our present Game System does not respawn objects as MVWGs do. That is, if there are ten apples on a virtual tree and Player A takes one, when Player B sees the tree there will be nine apples on it. In addition to food, water, ammunition and other supplies, written hints are another class of objects, which could be taken. Additionally, a contestant may write hints/letters in the real world that are transferred to the virtual world by the actions of his Avatar and left for other contestants to find.
  • Objects that a contestant can pick up, she can also put down in a different location. That is, contestants can move objects. Permanently moving objects is not possible in most MVWGs. If moving objects is implemented, the moved object may disappear (despawn) after a pre-selected period.
  • In our present Game System participants can change the state of objects. For example, if a participant unlocks a door, that door will be in an unlocked state when the next participant encounters it. Similarly, an object, such as a chest may be left in an open condition. In contrast to existing MVWGs our present Game System does not reset the state of objects after a pre-determined amount of time.
  • Our present Game System allows participants to change the virtual terrain. One example of this is terrain marking, such as spray painting. If the contestants equip themselves with a spray paint can in the virtual world, they are given a replica can to carry in the real world, as discussed above. This real world spray paint can replica may have buttons on it indicating right arrow, left arrow, up arrow, skull and crossbones, etc. So a participant would “aim” the can as he would a weapon and press a button to indicate what he wants to spray paint in the virtual world. Thus a participant could mark a path out of a “dungeon” or direct a lagging party member which direction to go at a crossroads by actions in the real world that are translated into the virtual world. All contestants can see these spray paint markings. Therefore contestants must decide whether the beneficial use of terrain marking outweighs the possible negative effects of another group of contestants seeing these markings. It also allows contestants to falsely mark the terrain in an effort to mislead other contestants. The use of system resources for terrain marking can be controlled by limiting the means of terrain marking, for example by limiting how many spray paint cans exist in the virtual world and how many “marks” can be made by a particular can and also by the number of different “marks” that can be made.
  • Participants may permanently change the virtual terrain by destructive methods as well, such as by blowing up bridges, cutting down trees, breaking windows, etc., with the only constraint being good game play. For example, a virtual small bridge may span a virtual ravine. If there were other ways to bypass the ravine, our present Game System may allow the small bridge to be blown up. However, if crossing that bridge were key to all contestants, then our present Game System would not allow it to be blown up. This can be accomplished by not allowing sufficient virtual explosives into the virtual world to blow it up, or by denying access to portions of the bridge, such as its support piers. As discussed above in terrain marking, all participants will be able to see the results of terrain destruction. Thus blowing open a door may not only help a contestant's team, it may also help his opponent's team. Thus a participant may need to decide between blowing a door open quickly and taking more time to unlock then re-lock it to thwart his opponents. In addition, a participant may blow down a door as a false lead, in an effort to mislead other participants. The use of system resources for terrain destruction can be controlled by limiting the amount of virtual destructive material in the game, such as limiting the amount of virtual explosives and/or limiting the amount of gas for a gas-powered saw.
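  • The resource-limiting and persistence rules for terrain marking can be sketched as follows. This is a minimal illustration under stated assumptions: a can holding a fixed number of marks, and a terrain whose mark list is never respawned or reset; the class names and the mark budget are hypothetical.

```python
class SprayCan:
    def __init__(self, marks_remaining=20):
        self.marks_remaining = marks_remaining

class Terrain:
    def __init__(self):
        self.marks = []   # persistent: marks are never despawned or reset

    def mark(self, can, position, symbol):
        if can.marks_remaining <= 0:
            return False          # can is empty; resource limit enforced
        can.marks_remaining -= 1
        self.marks.append((position, symbol))
        return True

terrain = Terrain()
can = SprayCan(marks_remaining=2)
terrain.mark(can, (10, 4), "left_arrow")   # marking a path out of a dungeon
terrain.mark(can, (12, 4), "skull")        # or a false lead for opponents
print(terrain.mark(can, (14, 4), "up_arrow"))  # False
```

Because `terrain.marks` is never cleared, every contestant who later passes position (10, 4) sees the same arrow, whether it was left honestly or as misdirection.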
  • 3. Real Time
  • For the purposes of this discussion, Real Time is defined as meaning that actions participants input for their Avatars occur with as little delay as possible consistent with current technology (less than a few seconds). This is well known in the art as exemplified by MVWGs such as Everquest and Dark Age of Camelot. In these existing MVWGs, contestants are able to use the same control devices for their entire gaming session, so any breaks in a gaming session are solely at their discretion. Our present Game System may require participants to physically move between different simulators during a single gaming session, so there is a period where participants are not able to control their Avatars. Having Avatars enter an “idle” state is discussed below.
  • There will be occasions then, where Avatars exist in the virtual world, but their participants are not able to control them. For example, a contestant bikes to a virtual river. He then wants to cross the virtual river by swimming. In the real world a participant exits a bicycle simulator, a member of the staff/crew puts the contestant's Avatar into idle mode, and the participant must walk to the swimming simulator, where the staff/crew toggles the participant's Avatar off idle mode.
  • An Artificial Intelligence (AI) could be written to direct a participant's Avatar while he is between simulators, for example if he were attacked by a virtual wild animal. That is not an ideal situation, however; it would not be fair to have a participant's Avatar damaged when the contestant was not in control. A more ideal solution would be to design safe zones in the virtual world, that is, areas in the virtual world without non-player characters (NPCs) and hidden from the view of other participants' Avatars. For example, the only way to enter a virtual river may be a small depression, with high reeds around it.
  • A situation may arise where two competing contestants wish to enter a river at the same time in roughly the same place. In this case our present Game System may spawn a separate safe zone “instance” for each contestant. That is, there would be two safe zones; one for each contestant and our present Game System would “lock out” a competing contestant from entering an occupied safe zone.
  • Our Game System physically challenges contestants. Efforts must be made to make these challenges as safe as possible. With this in mind, changing from one simulator to another may put a participant's Avatar in an idle mode for a predetermined period of time. This would avoid the situation where a participant may feel the need to run around a studio between simulators and injure himself. This predetermined period would be based on the distance between the various simulators and the difficulty of exiting one and entering another. For example, going between a bicycle and a swimming simulator may put a participant's Avatar in idle mode for four minutes, while going between a walking simulator and a bicycle simulator may put an Avatar in idle mode for two minutes.
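  • The idle-period rule can be sketched as a simple lookup. The two durations are the examples from the text; the table itself, the symmetric-transition assumption, and the default for unlisted pairs are illustrative assumptions.

```python
# Idle-mode durations in minutes, keyed by sorted simulator pair.
IDLE_MINUTES = {
    ("bicycle", "swimming"): 4,   # from the text: bike <-> swim, 4 minutes
    ("bicycle", "walking"): 2,    # from the text: walk <-> bike, 2 minutes
}

def idle_duration(from_sim, to_sim):
    # Transitions are assumed symmetric: bike->swim takes as long as
    # swim->bike, so the pair is sorted before lookup.
    key = tuple(sorted((from_sim, to_sim)))
    return IDLE_MINUTES.get(key, 3)   # assumed default for unlisted pairs

print(idle_duration("swimming", "bicycle"))  # 4
print(idle_duration("walking", "bicycle"))   # 2
```

With no pressure to sprint between machines, a contestant can walk safely; the Avatar simply sits in its safe zone until the timer expires.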
  • 4. Interacting with Objects Existing in Both the Real World and Virtual World
  • In a typical Multi-player Virtual World Game (MVWG) the objects an Avatar can carry are determined by the number of “spaces” in his bags, the virtual weight of the objects, or both. A key element of our present Game System is a physical challenge. Therefore, the objects an Avatar can carry in the virtual world are determined by what a participant can carry in the real world. The only objects a participant can use in the real world (while playing our present Game System) are objects that his Avatar acquires in the virtual world.
  • For example, it is morning in the virtual world. The participants' Avatars are at a supply drop and must decide what to carry with them that day. If a participant wants to eat and drink during the day, he must go through the virtual supplies and decide what to carry. He may decide he will want 2 apples, a turkey leg and 2 liters of water between now and the next supply drop. Those items then must be put into the participant's backpack (either by him or by staff/crew) and carried. A participant may crave a banana, but if there are no virtual bananas in the virtual supply drop, he does not get one.
  • In this manner objects exist in both the real and virtual world. This could be taken to Survivor-esque levels by requiring participants to carry all food that they will eat during the day, or it may be limited to just food that they want during actual game play. In addition to food and water, participants will be required to carry any change of clothes they may want, as well as any other amenities. Good game play would be to position supply drops off the most direct path, to give participants a choice between carrying more and traveling a shorter distance, or carrying less but going a longer path.
  • Special Objects or Relics. “One of the most important props featured in the [movie] Raiders of the Lost Ark, the Headpiece of the Staff of Ra was able to pinpoint the final resting place of the lost Ark of the Covenant. When the headpiece was placed in a particular position in the Well of Souls, the sun would shine through the crystal and reveal the location of the Ark.” http://www.indyprops.com/pp-headpiece.htm. In our present Game System a participant would have to carry a real-world replica of an in-game relic like the Headpiece of the Staff of Ra. This gives participants an obvious physical challenge and an intellectual one. Good game play would be to introduce numerous relics and items that appeared to be relics. Participants would have to choose between burdening themselves with carrying every relic they came across or figuring out the real ones from the fakes.
  • Weapons, Explosives and Such. In first-person shooters, a gamer typically has a choice of multiple weapons, with no penalty. In our present Game System all weapons available in the virtual world will have a replica in the real world. A participant must carry replicas of any and all weapons he wants to use. This leads to good game play as participants, individually and as a group, must decide between short-range weapons, such as shotguns, and long-range weapons, such as a B.A.R. As a B.A.R. weighs over 14 pounds, a participant can quickly become over-burdened by poor choices of what to carry. Not only weapons but also ammunition, which is surprisingly weighty, will exist in both the real and virtual world. Again, more choices for participants: weigh yourself down with ammo, or take a chance of running out later on. As with other objects in our present Game System, replica explosives will have to be carried. Yet again a participant must make a choice of how much to carry.
  • Other objects are utility objects, such as radios. In the virtual world a radio may be required to call for a supply drop, or simply to ask questions, download maps, etc. So not only will participants have to carry a replica radio, they will also be required to carry a means of powering the radio and any other virtual electronic gear they intend to use in the virtual world. Good game design is giving participants choices. In this case participants may decide to depend upon batteries, or they may opt for hand-powered devices to create power.
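  • The carry rule that runs through this section — every virtual item has a physical replica whose weight the participant bears — can be sketched as a mirrored inventory. The class and the item weights (other than the B.A.R.'s, which the text puts at over 14 pounds) are illustrative assumptions.

```python
class Inventory:
    """Mirrors the Avatar's virtual items to a real-world pack weight."""
    def __init__(self):
        self.items = []

    def add(self, name, weight_lbs):
        # Adding a virtual item implies handing the participant a replica
        # of the same weight to carry in the real world.
        self.items.append((name, weight_lbs))

    def real_world_load(self):
        # Total weight of all replicas: food, water, relics, weapons,
        # ammunition, radios, batteries, etc.
        return sum(w for _, w in self.items)

pack = Inventory()
pack.add("water_2L", 4.4)     # assumed weight of 2 liters of water
pack.add("B.A.R.", 14.5)      # over 14 pounds, as noted in the text
pack.add("radio", 2.0)        # hypothetical replica radio
print(round(pack.real_world_load(), 1))  # 20.9
```

The game's trade-offs fall out of this single number: every extra relic, weapon, or liter of water raises `real_world_load`, which the participant, not the Avatar, must haul.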
  • 5. High Physical Level of Effort
  • The goal of our present Game System is to simulate the physical challenge of an epic adventure with none of the dangers. For example, walking and running can be simulated by putting a contestant on a treadmill and varying the pace and elevation. The future is certain to bring new technology to accomplish the same task, such as “Shifty Tiles” invented by University of Tsukuba and ATR Media Information Research Labs http://www.trnmag.com/Stories/2004/081104/Shifty tiles bring walking to VR Brief 081104.html0. Our present Game System is described using current cost-efficient technology, but is not limited by this description.
  • An ideal simulator has the following characteristics:
  • Forces a contestant to use proper body motion for realistic motion capture,
  • Is controllable by a participant (i.e. how fast a participant walks on a treadmill),
  • Is controllable by the virtual world (i.e. the angle of a treadmill should match the slope of the ground in the virtual world),
  • Is safe for an untrained healthy adult to use, and
  • Has form factor compatible with a reality TV show or location based entertainment.
  • There exist in the art simulators for such activities as Walking, Running, Stair Climbing, Rowing, Bike Riding, Sail Boating, Hang Gliding, Parachuting, Rock Climbing, Wind Surfing, Kayaking, Surfing, Skiing, Snow Boarding and Swimming, among others. Our present Game System will be illustrated, but not limited, by a discussion of treadmills, stair climbers, bike simulators, swimming simulators, canoe simulators, fantasy simulators and creating a “virtual set”.
  • Interface for Two-Way Interaction. The following discussion will mainly concentrate on modifying current simulators, as that is the most cost effective method. In order to use a current simulator, our present Game System may need to pass information to the simulator. Additionally, our present Game System may need to gather information from the simulator. One skilled in the art would realize that building an interface to pass information to and from our present Game System to a simulator is well known in the art, and hence is not covered in this discussion.
  • Stair Simulators. The most common type of Stair Simulator is characterized by two independent pedals, such as the StairMaster® Free Climber series. This type of Stair Simulator is less than ideal, however. Most stair risers are eight inches in height, so to simulate natural movement, a participant's throw on a pedal should be a consistent eight inches. However, since these types of machines have a free range on the pedals, the throw may vary widely. This motion would look unrealistic when motion captured and mapped to the virtual world.
  • A more desirable alternative is the StairMaster® StepMill 7000. This device is a revolving staircase; therefore it forces a participant into the proper motion for motion capture. It is capable of speeds from 24 to 162 steps per minute, controllable by a user or by an external program through its C.S.A.F.E. interface. No special training is required; a participant simply steps onto it and climbs stairs. Its form factor quality is acceptable as is, and can be enhanced with some modification. Finally, at a price of $5,000 it is cost efficient.
  • The StepMill 7000 is adequate direct from the factory. However, it can be improved for use with our present Game System with some modifications. Our present Game System is designed for reality TV and/or location based entertainment. For reality TV, part of the broadcast show will be participants on the simulators. In addition, location based entertainment most likely would be structured so that an audience could watch a game in progress. Therefore the side railings and the Console (Head) are not necessary and obstruct the view of a participant. A participant needs to adjust the speed of the device, as a participant may want to rest, walk or run up the steps in the virtual world. A participant controller interface to the unit is required if the console is removed. The art allows a very small, wireless interface device, with a pause button and up and down arrows, to be built, which could be strapped to a participant's arm and/or worn on a belt.
  • Real World motion may be mapped on a one to one basis with the Virtual World. That is, one step on the StepMill is one step in the virtual world. Sensors may be placed on top of the revolving steps. These sensors could send information to our current Game System when a participant's foot hits the top of a step. This would aid mapping the participant's Avatar's feet to the virtual stairs he is climbing.
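  • The one-to-one stair mapping can be sketched as follows: each sensor event on the revolving staircase advances the Avatar exactly one virtual step. The class is an illustrative assumption; the eight-inch riser height comes from the discussion of stair simulators above.

```python
RISER_INCHES = 8   # typical stair riser height, per the text

class StairAvatar:
    def __init__(self):
        self.steps_climbed = 0

    def on_step_sensor(self):
        # Fired when a sensor on top of a revolving step detects the
        # participant's foot: one real step = one virtual step.
        self.steps_climbed += 1

    def virtual_ascent_inches(self):
        return self.steps_climbed * RISER_INCHES

avatar = StairAvatar()
for _ in range(8):            # participant climbs eight real steps
    avatar.on_step_sensor()
print(avatar.virtual_ascent_inches())  # 64
```

The same sensor events can also drive foot placement, aligning the Avatar's feet with the virtual stair treads as described above.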
  • Walking/Running Simulators
  • Treadmills are well known in the art. A typical treadmill is the Life Fitness® 9500. Its main components are a 62×18 inch revolving belt, powered by a 4.0 hp motor, with user-adjustable speed from 0.5-12 mph in 0.1 mph increments and user-adjustable incline from 0-15%. This treadmill also features safety bars and grips. These are undesirable for our present Game System, as if a participant were to hold on to the equipment, his body motion would not be realistic for motion capture. Therefore any potential handholds would need to be removed.
  • A treadmill can be improved for use with our present Game System with some additional modifications. The form factor quality of a treadmill can be improved by the removal of the head. This would necessitate building a participant-controllable interface, as above, to enable a participant to speed up, slow down, or stop the treadmill. Our present Game System requires the current speed of the treadmill, in order to track a participant's progress through the virtual world. This information could come either from the contestant's interface or from a separate interface between the treadmill and our present Game System.
  • If the treadmill does not have a C.S.A.F.E. interface or equivalent, an interface would be required between the treadmill and our present Game System so that the game could vary the angle of the treadmill to match the virtual terrain. That is, the game would raise the angle of the treadmill when going uphill, and lower it when going downhill. A treadmill could be mounted at a negative angle, so its range of motion could be, say, −5% to +10% instead of the standard 0-15%.
  • Using a treadmill with the modifications described above is very cost efficient. The disadvantage of this method is its limitations in simulating varying virtual terrain. It has a limited range of motion (15%) and changing the angle of the machine is very slow, so that there is a disconnect between the real and virtual world. In addition, this method does not allow any lateral (side to side) movement of the treadmill to simulate varying terrain.
  • An improvement would be to mount a treadmill on a 2-degree-of-freedom motion platform. This would allow the treadmill to move front and back more than the 15% standard and also side-to-side, to better simulate the virtual terrain. An additional benefit of this system is that rough terrain can be simulated by introducing small random motion to the motion platform. This system would even allow a small earthquake to be simulated. Motion platforms are well known in the art. The major considerations in mounting a treadmill to a motion platform are obvious safety issues which are well known in the art, so are not covered here.
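  • The platform-mounted behavior can be sketched as follows: the pitch tracks the virtual slope over the extended −5% to +10% range from the negative-angle mounting described above, and rough terrain adds small random jitter. The function name, jitter amplitude, and use of Python's `random` module are illustrative assumptions.

```python
import random

def platform_pitch(virtual_slope_pct, rough_terrain=False, rng=random):
    # Track the virtual slope, clamped to the platform's physical range
    # (-5% to +10% with the negative-angle mounting described above).
    pitch = max(-5.0, min(10.0, virtual_slope_pct))
    if rough_terrain:
        # Small random perturbation simulates uneven ground; a larger
        # amplitude could simulate a small earthquake.
        pitch += rng.uniform(-0.5, 0.5)
    return pitch

print(platform_pitch(12.0))   # 10.0 (clamped to the platform's maximum)
print(platform_pitch(-8.0))   # -5.0 (clamped to the platform's minimum)
```

A second axis handled the same way would supply the side-to-side roll that a bare treadmill cannot provide.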
  • Contestant Control of a Walking/Running Simulator. As discussed above, a participant would require a small interface device, capable of being mounted on an arm or hung on a belt, to control the speed of a treadmill. The preferred embodiment of our present Game System is a team game, so a contestant may wish to go the same speed as a team leader. An additional button to set the treadmill at the same speed as the team leader's would therefore be an improvement.
  • There also needs to be a method for a participant to navigate in the virtual world. Due to the nature of treadmills, a contestant's feet must always be pointing forward, so no information can be gleaned from foot placement. The Game System calls for an interface device to allow for such virtual world navigation. Such a device could be, but is not limited to: a joystick, a hand-held game controller, a head tracking system, or a shoulder angle tracking system.
  • By way of example, but in no way limiting the scope of the present invention, while a contestant is on a treadmill a representation of the virtual world will be projected in front of her. Ideally the projected image is large enough to require a participant to move her head to see all of it. Our present Game System is motion capturing a participant's movement. Therefore, the Game System is able to extract the angle of a participant's head (side to side) relative to the position of his feet. This is the same information that would be provided by a joystick or similar game-controlling device. A disadvantage of this method is that the participant would be moving in the virtual world whenever he looked around. Therefore a switch is needed in conjunction with the extracted angle for controlled navigation through the virtual world.
  • An ideal switch would be unobtrusive for aesthetic reasons. It should be small so as to not interfere with the physical activities required of a participant. Finally, it should be activated in a manner that does not interfere with realistic motion capture. One solution would be an Eye Blink Switch, such as this http://www.assistireland.ie/index.asp?locID=143&docID=212. In this case when a participant blinks, the angle of his head would be extracted from his body capture information and used to rotate the participant's Avatar in the virtual world. A disadvantage of this method is that it partially obscures a participant's face.
  • Another alternative would be a switch similar to Advanced Multimedia Devices, Inc.'s Flat Mini-Beam Switch (SSW-MB) http://www.amdi.net/sensors.htm. This switch is small enough to attach to the palm of a participant's hand. It is activated by proximity, so it could be positioned such that a small inward motion of the thumb toward the palm would trigger it. In the future, as technology gets smaller and less expensive, it may be possible to use a combination eye tracker and blink detector for motion control, such as that made by EyeTracking, Inc. http://www.eyetracking.com/.
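  • The switch-gated steering described above can be sketched as follows: the head angle is continuously extracted from motion capture, but the Avatar only turns when the switch (an eye blink, or a palm proximity sensor) fires. The class and its interface are illustrative assumptions.

```python
class TreadmillNavigator:
    def __init__(self):
        self.avatar_heading_deg = 0.0

    def update(self, head_angle_deg, switch_active):
        # head_angle_deg: the participant's head angle (side to side),
        # extracted from motion capture relative to his feet.
        # Without the switch gate, merely looking around would steer the
        # Avatar; the switch makes each turn deliberate.
        if switch_active:
            self.avatar_heading_deg += head_angle_deg

nav = TreadmillNavigator()
nav.update(head_angle_deg=30.0, switch_active=False)  # just looking around
nav.update(head_angle_deg=30.0, switch_active=True)   # blink: turn 30 degrees
print(nav.avatar_heading_deg)  # 30.0
```

The same gate works unchanged whichever switch hardware is chosen, since only the boolean `switch_active` differs.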
  • Bicycle Simulators.
  • Bicycle Trainers can be broken down into two broad categories: stationary exercise bikes and bike trainers, which attach a resistance mechanism to a real bike. Stationary bikes fall into two categories, upright and recumbent. Life Cycle® manufactures models of each stationary bike category. Upright stationary bikes position a contestant close to the proper position for realistic motion capture. Both stationary bikes and bike trainers can increase resistance to simulate going up hills. Stationary bikes do not have gears. As proper use of gears is a very important component of bicycling in the real world, a bike trainer is closer to an ideal bike simulator than a stationary bike.
  • Racer Mate® manufactures a bike trainer “Pro BASIC CompuTrainer™” http://www.racermateinc.com/compu pro basic.asp which is a good choice (at the time of this writing). A frame attaches to a bike's rear wheel. The bike's rear wheel is positioned off the floor, seated in a load generator, which can change resistance. The Pro Basic model can simulate up to a 15% incline. The Pro Basic additionally comes equipped with a built-in interface and additional software of use to a game designer. A device such as the Pro Basic comes close to an ideal simulator, as discussed above, as it incorporates a real bike.
  • A bike trainer can be improved for use with our present Game System. A simple improvement would be to mount a fan in front of a participant. The fan speed would be synchronized with the virtual speed of the bike, thus giving a participant feedback and making for a more interesting visual for spectators. A bike trainer can be further improved by incorporating a method to brake in the virtual world. Sensors, such as strain sensors, can be positioned in a bike's handbrakes. This would enable a participant to use the brakes as they are normally used and send this braking information to the virtual world.
  • In the real world, following close behind another biker is slightly easier than being uncovered, due to wind resistance. Ideally this would be modeled in the virtual world, to add additional strategy to the game. That is, if a participant followed closely behind another, the resistance on the following participant's bike would be slightly less than that of the leader.
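  • One hedged sketch of such a drafting model: reduce the load generator's resistance by a fixed fraction while the follower's virtual gap to the rider ahead is within drafting range. The range and discount values below are illustrative assumptions, not values from this specification.

```python
def trainer_resistance(base_resistance, gap_to_leader_m,
                       draft_range_m=2.0, draft_discount=0.15):
    """Slightly reduce resistance for a rider drafting close behind
    another in the virtual world; full resistance otherwise."""
    if 0.0 < gap_to_leader_m <= draft_range_m:
        return base_resistance * (1.0 - draft_discount)
    return base_resistance
```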
  • There needs to be a method for a participant to steer in the virtual world. In the real world, steering is done by a rider leaning and the bike following the direction of the lean. This is not possible with the current art of bike trainers. There are two equally good methods to get steering information.
  • Pressure sensors can be put in the handgrips of a bike. The harder a participant squeezes the tighter the virtual bike would turn in that direction. As participants rest some of their weight on the front of the handgrips, the sensors would need to be positioned at the rear, so that they would only be activated by a gripping motion with the fingers. The only disadvantage to this system is that this is not a natural method of steering a bike, and a training period may be required before a participant is comfortable with it.
  • As the contestant is being motion captured, that information could be used for steering. That is, if a participant leaned (moved his shoulders) to one side, the virtual bike would turn in that direction. The bigger the lean, the harder the turn. This is a more natural method of steering a bike. The disadvantage to this method is that a participant might trigger an inadvertent turn by simply stretching. To ensure this does not happen, sensors, such as proximity sensors, could be positioned in both handgrips. In order to use motion capture as steering, both sensors would need to be activated; that is, a contestant would have to have both hands on the handgrips.
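  • The lean-to-steer mapping with the both-hands interlock might be sketched as below; the dead zone and maximum lean are assumed calibration values, not part of this specification.

```python
def steering_input(shoulder_offset_cm, left_grip, right_grip,
                   dead_zone_cm=3.0, max_lean_cm=20.0):
    """Map the motion-captured shoulder lean to a steering value in
    [-1, 1]. The lean is ignored unless both handgrip proximity sensors
    are active, so stretching cannot trigger an inadvertent turn."""
    if not (left_grip and right_grip):
        return 0.0
    if abs(shoulder_offset_cm) < dead_zone_cm:   # small shifts ignored
        return 0.0
    steer = shoulder_offset_cm / max_lean_cm     # bigger lean, harder turn
    return max(-1.0, min(1.0, steer))
```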
  • The bike trainer as discussed above is close to an ideal simulator. Its major deficiency is that it is unable to lean, as a real bike leans, for turning. Thus the bike trainer is unable to give proper body motion for realistic motion capture. Tectrix® http://www.refstar.com/tectrix/products/vr/index.html manufactures a stationary bike called the VR Bike. On this stationary bike, the seat and pedals rotate on a horizontal axis. When a rider initiates a turn, the VR Bike leans into it, thus providing realistic leaning motion for motion capture. If this leaning motion was deemed necessary for a particular version of our current Game System, modifications to this device and the virtual world could be made.
  • To this point we have discussed simulating real activities and devices in the virtual world. However, our present Game System is not limited to that. One of the goals of our present Game System is to present various physical challenges to participants. If the physical challenge is real, that is, our participants sweat, then the device being simulated does not have to be a known device in the real world. The Tectrix VR Bike can simulate varying terrain. However, as its hand positions do not match a real recumbent bike's hand positions, a fantasy bike can be created for the virtual world that matches the hand positions of the Tectrix VR Bike. In effect we have a real world simulator for a fantasy device.
  • The Tectrix VR Bike would require some modification before being used in this role. For form factor reasons the head and the behind-the-seat speakers can be removed. Pressure sensors can be positioned in the handgrips to give a contestant a means to indicate braking. More on Fantasy Simulators below.
  • Swimming Simulators.
  • In a broad sense, a lap pool is an open water simulator. The disadvantage of using a lap pool is that turning around at the end of the lap pool does not make for realistic motion capture. Endless Pools® http://www.endlesspools.com manufactures a solution. They bill themselves as a swimmer's treadmill. Their device uses a water jet to create an adjustable current in a pool as small as 8×15 feet. A device such as this comes close to our ideal simulator.
  • One disadvantage of using a device such as this is that it is not participant controlled. That is, the participant has to swim at whatever speed has been set into the device to stay in the middle of the pool. A participant could be given an input device worn on the wrist to change the speed of the current. However, while a participant is inputting the change the current may push him to the back wall, which would be undesirable. Another solution is that the participant could shout out the change in speed he wants, i.e. "plus two", "minus three", etc., and a member of the crew/staff could input the change for him, or voice recognition software could be used and coded to make the required changes in speed. The disadvantage of these methods is that the result is jerky, unnatural movement when the speed of the current is changed.
  • A more ideal and organic solution is detailed in our USPTO Provisional Patent Application entitled An Improved Artificial Water Current Control Device. In brief, a sensor is used to determine a swimmer's distance from the front of the pool. A small neutral area is assigned in the middle of the pool. If a swimmer moves out of the neutral zone and towards the front of the pool, the velocity of the water stream is increased until the swimmer is back in the neutral position. Similarly, if the swimmer moves toward the back of the pool, the water stream is reduced. Thus a participant is in precise control of the speed he wants to go, simply by speeding up or slowing down.
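  • One control-loop tick of that neutral-zone scheme could look like the following sketch; the zone boundaries and speed step are illustrative assumptions, not values from the referenced provisional application.

```python
def adjust_jet_speed(current_speed, distance_from_front_m,
                     zone_front_m=3.0, zone_back_m=4.0, step=0.1):
    """Speed the water jet up when the swimmer drifts forward of the
    neutral zone, slow it when he drifts back, and hold it steady while
    he stays inside the zone."""
    if distance_from_front_m < zone_front_m:     # ahead of the zone
        return current_speed + step
    if distance_from_front_m > zone_back_m:      # behind the zone
        return max(0.0, current_speed - step)
    return current_speed
```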
  • An ideal swimming simulator would have a method to navigate through the virtual world. Unfortunately this is not possible with the current art. The solution is to change the game. That is, if the game calls for swimming in the virtual world the path a participant needs to take in the virtual world is straight, so the simulator and virtual world are in synch. For example, in the game the challenge may be to swim across a one-mile wide virtual river before the opposition, but where on the other side a participant lands is immaterial.
  • Canoe Simulator
  • PaddleOne® manufactures a canoe simulator http://www.paddleone.com/canoe kayak/paddleonec.php?I=en. The PaddleOne C can be modified to be suitable for our present Game System. This device does not have any electronics, so there is no method for our present Game System to know how much effort a participant is expending (how fast he is going). The amount of work a participant is doing can be determined by placing a strain sensor where the cable attaches to the paddle or by putting a pressure sensor on the paddle where a contestant's hand nearest the paddle blade rests. The information from sensors such as these can be translated into the speed of the virtual canoe.
  • There needs to be a method for a participant to navigate in the virtual world. The simplest method would be to use foot switches/sensors. Activating these devices would require a very small motion, so it would not lead to unrealistic motion capture. A more challenging method for a contestant would be to compare the amount of work a participant does on the right and left sides. That is, if his right stroke were more powerful than his left, the canoe would move to the right in the virtual world. To accomplish this, a sensor would have to be added to the paddle so the present Game System would have information on which side the stroke is on.
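  • The stroke-asymmetry method might reduce to something like this sketch, where per-side stroke power comes from the added paddle sensors and the gain is an assumed tuning constant.

```python
def canoe_update(right_power, left_power, turn_gain=0.5):
    """Derive virtual-canoe speed and turn rate from per-side stroke
    power. A more powerful right stroke turns the virtual canoe to the
    right, as described above."""
    speed = (right_power + left_power) / 2.0
    turn = (right_power - left_power) * turn_gain   # > 0 turns right
    return speed, turn
```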
  • Modifying the PaddleOne C as described above would produce an adequate canoe simulator for our present Game System. Producing location-based entertainment or a reality show is as much art as science. The PaddleOne C's strength is that it is cost-efficient and easy for a participant to hop on and start paddling. If canoeing is a small part of the story/challenge of the game being staged, the PaddleOne might be the preferred embodiment. However, if canoeing is a large part of the story/challenge, another solution may be preferred.
  • In addition to what part a particular simulation plays in the story/challenge, another consideration must be cost. Therefore we will discuss two more canoe simulators. One simulator tethers a canoe in a body of water such as a pool. The other simulator creates a current in a pool for a free-floating canoe to row against, similar to the Endless Pool discussed above. The tether solution is considerably less expensive than the free-floating one. The free-floating solution has a more aesthetic quality: it has much better visuals, and it gives the contestants more of a challenge, as the free-floating canoe is much harder to control than the tethered one.
  • Both the tethered and free-floating canoe simulators use a real canoe. This canoe can be a multi-person canoe. This can lead to additional challenges for participants, for example a single person canoe is easier to control than one where multiple people are rowing. Using a multi-person canoe offers participants more choices. For example, not all participants need to be rowing at the same time, that is, in a four man canoe, two participants may rest. In our Game System, participants must carry objects in the real world that they need to survive, such as food and water, and they must also carry objects that are the physical manifestation of objects they will need in game, think “Staff of Ra” in “Raiders of the Lost Ark”. A fully loaded canoe is harder to paddle than an empty one, which gives the participants more choices as they prepare for the canoe portion of their story/challenge.
  • There are a number of methods to construct canoe simulators; our present Game System is illustrated, but not limited, by the following descriptions. A real canoe is modified by having bungee cords mounted port stern, port bow, starboard stern, and starboard bow. Participants load and enter the canoe on the side of a pool. Staff/crew then position the canoe in the center of the pool and connect the bungee cords to the appropriate corners of the pool. Strain sensors are positioned on each of the bungee cords. These sensors give our present Game System enough information to determine the speed and direction of the virtual canoe. That is, the more tension on the stern bungee cords, the faster the virtual canoe is going. The more tension on the port bow bungee, the more acutely the virtual canoe is turning to starboard.
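  • The four strain readings map to virtual speed and turn roughly as in this sketch; the gain constants are illustrative and would be set during calibration, not values from this specification.

```python
def tethered_canoe_state(stern_port, stern_stbd, bow_port, bow_stbd,
                         speed_gain=0.01, turn_gain=0.02):
    """Derive virtual speed and turn from the four bungee strain
    readings: stern tension indicates forward thrust, and a bow-side
    tension imbalance indicates a turn (more port-bow tension means a
    sharper turn to starboard)."""
    speed = (stern_port + stern_stbd) * speed_gain
    turn = (bow_port - bow_stbd) * turn_gain
    return speed, turn
```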
  • A free-floating canoe simulator is described in more detail in our USPTO Provisional Patent application entitled An Improved Artificial Water Current Control Device. As in the Endless Pool modification described above in the Swimming Simulator section this canoe simulator tracks where the front of the canoe is relative to the front and sides of the pool. This canoe simulator has three water jets, one positioned in front of the canoe, one to port and one to starboard. As the front of the canoe leaves a neutral area, the velocity of the different jets is modified to keep the canoe in the center of the pool. The velocity of the different jets is enough information for our present Game System to determine the speed and direction of the virtual canoe.
  • Simulating water current would further challenge participants. Simulating current either straight ahead or straight behind is easy. The velocity a participant maintains on the simulator is calculated and the velocity of the virtual current is added for a back current or subtracted from a front current. The PaddleOne and the Tethered Canoe simulators would simulate sideways water current by changing their track in the virtual world. That is, the vector obtained from the simulator would be modified by the vector of a virtual current, and this modified vector would be used to track the canoe in the virtual world. The free-floating canoe simulator is superior as the intensity of the side jets can be changed to simulate the virtual current, thus providing contestants with more feedback and challenge.
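  • The vector modification described for the PaddleOne and tethered simulators is simple addition of the simulator's velocity vector and the virtual current's vector, sketched here for the two-dimensional case:

```python
def virtual_track_velocity(canoe_vx, canoe_vy, current_vx, current_vy):
    """Combine the velocity vector obtained from the simulator with the
    virtual current's vector; a head current subtracts from forward
    speed, a tail current adds, and a side current pushes the track
    sideways in the virtual world."""
    return canoe_vx + current_vx, canoe_vy + current_vy
```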
  • Fantasy Simulator
  • Our present Game System uses simulators to add a physical challenge to the Game System, and the physical component is also a necessary component for a Fantasy Simulator. As long as a physical component remains, the types of possible simulators are limited only by the human imagination. One skilled in the art, in light of the teachings above, would understand that there are no technical challenges to constructing the Fantasy Simulators discussed below. Our present Game System is illustrated, but not limited, by a discussion of just a few Fantasy Simulators.
  • Human-Powered Sub. Although human-powered subs do exist http://www.enme.umd.edu/terpedo/, most participants and audience members would have no knowledge of them. Additionally, the real human-powered subs are ugly and awkward to use. For our present Game System we can take a real vehicle and modify it, in other words turn it into a Fantasy Vehicle. The virtual sub would need to be more aesthetically pleasing. Thus the virtual sub would be improved with a glass canopy and a pressurized cockpit. With those changes in mind, the Tectrix VR Bike discussed above would put a participant in the proper position for "realistic" motion capture of a human-powered sub. A means for navigating the virtual sub up or down in the water column would have to be added. This could be as simple as a button on the right hand rest for up and a button on the left hand rest for down.
  • Inflatable Plane. Like the human-powered sub addressed above, inflatable planes are real. The Goodyear Inflatoplane, GA-468 was designed to be dropped behind lines for downed pilots to use to be rescued. Again we can take the concept of a real vehicle and modify it for our present Game System. A human-powered inflatable plane capable of being stored and carried in a backpack could enhance storylines and add a unique challenge to contestants. As with the sub above, the Tectrix VR Bike can be used as the real world simulator. An additional challenge for participants would be to manufacture a real world mock-up of the virtual inflatable plane, and have a contestant blow up and assemble the plane in the real world before being able to use it in the virtual world.
  • Multi-Person Powered Dirigible. One iteration is a virtual dirigible with a large open passenger compartment slung underneath it. It has four backward facing hand-powered propellers in the stern of the passenger compartment and in the bow a large spoked wheel for right/left travel and a large lever for up/down travel. For this vehicle, the spoked wheel and lever would need to be constructed in the real world. A simple way of presenting this simulator would be to have four stationary upright bikes in a row with a “captain” navigating through the virtual world by use of the wheel and lever.
  • A more complex simulator can be built. The open passenger compartment can be replicated in the real world. Additionally, it could be mounted on a motion platform, if the dirigible was an important element of the story/challenge. The passenger compartment could have room for additional people, so as an additional challenge, the participants could load up the dirigible with people and supplies and take turns powering it with the stationary bikes. An additional improvement would be to mount simulated cannons on both sides of the passenger compartment. Correctly firing these cannons would fire cannons in the virtual world.
  • Sets
  • The simulators discussed above are basically fixed in orientation. That allows an image of the virtual world to be displayed in front of the participants. Having the virtual world displayed in front of a participant allows him to navigate the virtual world. There are some challenges that require navigation clues/challenges in the real world that map to the virtual world.
  • In the movie “Indiana Jones and the Last Crusade”, Jones had three tests before he could get to the grail. The second test “The Word of God” involved jumping from tile to tile. To recreate a challenge similar to that in our present Game System we would require a set, so a participant could see the tiles in front of him and jump based on that, not based on a reference to the tiles in the virtual world. One simple iteration would be to place a grid of pressure sensors on a floor. These sensors would then be covered with a material such as canvas painted with a representation of the tiles. Thus a participant would have a real world reference, and he would have to jump properly for realistic motion capture. Our present invention would know if a participant jumped on the correct tiles by which pressure sensors were activated.
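  • Checking a landing against the correct path might look like the sketch below, where sensors are identified by (row, column) grid coordinates and the correct path is a hypothetical example, not part of this specification.

```python
def check_jump(activated_sensor, correct_tiles, path_index):
    """Compare the pressure sensor a participant landed on against the
    next tile in the correct path; returns (ok, new_path_index). A miss
    would trigger a 'challenge failure' sequence in the virtual world."""
    if activated_sensor == correct_tiles[path_index]:
        return True, path_index + 1      # advance along the path
    return False, path_index

path = [(0, 1), (1, 1), (2, 3)]          # hypothetical correct tiles
```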
  • There is not one "right" solution as to how to implement a challenge such as "The Word of God". It really depends on the resources available and how important the particular challenge is to the overall story/challenge. A more visually interesting and more challenging method would be to make real world representations of the tiles and place these tiles over the sensor floor. A pressure sensor would be on each tile. If the tiles were a few inches high, it would be more challenging for a participant to jump and land firmly on a tile only, while avoiding the floor. If a participant landed on either a wrong tile or hit the floor between tiles, a "challenge failure" sequence would be triggered in the virtual world. This method has the benefit that participants who don't land squarely on a tile have to balance themselves. However, if they lose the challenge there is nothing to motion capture, so a "canned" sequence would have to be used.
  • If this challenge was very important to the story/challenge, even more resources could be devoted to it. In order to motion capture a failed challenge, where a participant falls to his virtual doom by missing a tile and/or stepping on a wrong tile, this iteration would be created above a pool of water. The tiles would be supported by poles in the pool. The correct tiles would be locked in place, that is, they would remain flat when weight is put on them. However, the incorrect tiles would be hinged so that they would not support any weight, and if jumped upon would dump a participant into the pool. Proximity sensors on the tiles would enable our present Game System to know which tile a contestant was on, so his actions could be mapped properly to the virtual world.
  • Rooftop Jumping Challenge. The goal in building a set for a rooftop jumping challenge would be similar to that described above for "The Word of God". That is, to provide a setting where a participant is forced to use proper body motion for realistic motion capture, and to provide the participant visual cues in the real world. Additionally, sensors must give our present Game System information about a contestant's location so he can be mapped in the virtual world. As in the challenge above, a grid of pressure sensors on the floor, covered by a canvas painted with rooftops, would meet the requirements. An improvement would be to build the little walls that are typical on flat-roofed buildings. Additionally, building slanted roofs would force contestants to use proper body motion for slanted roofs in the virtual world. To provide for proper motion capture of a participant missing a jump, the roofs can be built over a pool or over a heavy mat.
  • One knowledgeable in the art would understand that there are many ways to determine the location of a person inside a known area, like a sound stage. Our present invention is not concerned with how a contestant's location is known, just that it is known and capable of being passed to our present Game System to map his location in the virtual world.
  • When participants are on simulators, the simulator's orientation is restricted and thus our present Game System is able to project an image of the virtual world in front of a contestant. When participants are on sets they may wear heads-up displays, and the Game System would project the virtual world on these displays. In this way participants would have natural movements to be body captured and our present Game System would be able to position them properly in the virtual world. The disadvantage to this system is that the heads-up display will obscure a portion of a contestant's face. In this case, the advantages outweigh the slight disadvantage.
  • Idle Animations
  • As participants move from one simulator to another, there is nothing meaningful to motion capture. In the virtual world, when there is no meaningful motion to capture, idle animations will be displayed. For example, a participant bikes to a virtual world dead-end rockslide and needs to dismount his bike simulator and then get on a climbing wall simulator. This idle animation could be as simple as a virtual participant shifting his weight. Idle animations are well known in the art. The staff/crew are responsible for signaling to our Game System that an idle animation is required.
  • 6. No Role Playing
  • For the purposes of this discussion we will define Role Playing Game (RPG) as one in which a participant is represented in a virtual world by an Avatar, where the Avatar's ability to interact with the virtual world is determined by game mechanics rather than by the participant's real world abilities. In RPGs such as Everquest and World of Warcraft, movement in the Virtual World is determined by game mechanics, such as what objects an Avatar is carrying in the virtual world, or what spells are on an Avatar or what magic gear an Avatar is wearing. As discussed above, movement in our present Game System is dependent on the physical level of effort a participant is willing to make. Similarly, the ability of a participant to overcome an obstacle or solve a puzzle is not dependent on the abilities of their computer Avatar, but rather on their own physical abilities and “smarts.”
  • Games such as Quake and Halo are RPGs for the purpose of this discussion because the ability to aim and fire a weapon is determined by mouse clicking where a participant wishes to shoot, with game mechanics determining whether the shot hit or not. Our present Game System may use replica weapons such as the ones manufactured by FATS, Inc. http://www.fatsinc.com/about/news/pr65.cfm: "FATS virtual weapons resemble the fit and function of live weapons to include recoil. FATS virtual training provides accurate, real-time diagnostics including point-of-aim, weapon status, trigger pressure and cant." This makes weapon firing much more challenging. Additionally, in games such as Halo and Quake, reloading a weapon is done with one keystroke. In our present Game System, a participant must reload a weapon with a new magazine by hand. No Role Playing makes our present Game System much more challenging than RPGs.
  • In addition to firearms, other ranged weapons may be simulated. A bow may be modified by adding a sensor to detect how much force is used to pull back the string. Additional sensor/s would determine the x, y, z orientation of the bow. One skilled in the art would understand this gives our present Game System enough information to make a determination of where the arrow would land in the virtual world. A regular arrow could be modified with a large foam tip, for safety. In a similar fashion, a throwing spear could be modified by adding a sensor/s to determine the orientation of the spear and another sensor to determine velocity when thrown.
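  • As a rough illustration of the determination one skilled in the art might make, simple drag-free ballistics suffice: the draw-force sensor gives launch speed (via an assumed calibration constant) and the orientation sensors give elevation and azimuth. The constants here are made up for illustration.

```python
import math

def arrow_landing(draw_force_n, elevation_deg, azimuth_deg,
                  force_to_speed=0.6, g=9.81):
    """Estimate where an arrow lands on flat ground from the bow's draw
    force and orientation, ignoring drag. force_to_speed is an assumed
    calibration constant mapping newtons of draw to m/s of launch."""
    v = draw_force_n * force_to_speed
    el = math.radians(elevation_deg)
    rng = (v * v) * math.sin(2.0 * el) / g          # flat-ground range
    az = math.radians(azimuth_deg)
    return rng * math.cos(az), rng * math.sin(az)   # (x, y) from archer
```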
  • Some devices are more challenging to simulate, such as the whip used by Indiana Jones in the movies. A real whip is: 1) hard to use for an untrained participant; and 2) potentially dangerous to participants and cast/crew members. Therefore a simulated whip would consist of just the whip handle, with a velocity sensor and sensor/s for x, y, z orientation. In use, a participant would quickly move the handle in a whip-like fashion to generate velocity and end the motion with the whip handle aimed at a target. Our present Game System would determine if the velocity was high enough and the aim was close enough to score a "hit". Additionally, the whip handle may include a button/s. In the movies, Indiana Jones was able to grab objects with his whip. A button may be pressed by a participant to indicate an attempt to grab an object rather than hit it.
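  • The scoring decision for the simulated whip reduces to two thresholds plus the grab button, sketched here with assumed threshold values:

```python
def whip_result(handle_speed, aim_error_deg, grab_button,
                min_speed=6.0, max_error_deg=5.0):
    """Score a whip action: enough handle velocity plus an end-of-motion
    aim close enough to the target counts as a 'hit', or a 'grab' if the
    handle button was held; anything else is a 'miss'."""
    if handle_speed >= min_speed and abs(aim_error_deg) <= max_error_deg:
        return "grab" if grab_button else "hit"
    return "miss"
```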
  • Our current Game System motion captures the participants. One skilled in the art would appreciate that if our Game System knew the physical qualities of an item, such as a hand-held weapon, it would be able to position that weapon in the virtual world. A staff/crew member could input to our present Game System what a participant is holding, and with what hand. Or, sensors such as RFID tags could be placed on items, such as hand-held weapons, and on a participant's gloves, as a method to input to our present Game System what a contestant is holding. A velocity sensor may be added to hand-held weapons. Once our present Game System knows what item a participant is holding, the Game System can track the item in the virtual world and determine if the object hits an object in the virtual world. If a collision is detected, our present Game System could then make a determination of what effect the hit would have based on real-world physics, not game mechanics.
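  • The RFID method could be as simple as matching the tags read by a glove-mounted reader against a catalog of tagged items; the tag IDs and item properties below are hypothetical illustrations.

```python
ITEM_CATALOG = {                 # hypothetical RFID tag -> item data
    "tag-017": {"name": "machete", "length_m": 0.6, "mass_kg": 0.5},
    "tag-042": {"name": "torch",   "length_m": 0.8, "mass_kg": 0.7},
}

def held_item(glove_rfid_reads):
    """Resolve which catalogued item a participant is holding from the
    tags seen by his glove's reader; None if nothing is recognised."""
    for tag in glove_rfid_reads:
        if tag in ITEM_CATALOG:
            return ITEM_CATALOG[tag]
    return None
```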
  • Non-Player Characters (NPCs). In RPGs such as Everquest and World of Warcraft, how NPCs react to Avatars is determined by game mechanics, such as how high an Avatar's faction standing is with the NPC's faction. Our present Game System may make use of this known system in the interest of being cost effective. Ideally, however, NPC Avatars would be controlled by actors, staff/crew members, audience members or even celebrities. These individuals would be classically role-playing as defined by Wikipedia: "In role-playing, participants adopt characters, or parts, that have personalities, motivations, and backgrounds different from their own. Acting is role playing." That is, they would act a part. If their part is that of a village chieftain, they would treat a participant unkindly if the participant blew up his village. The ideal non-player character system of our present Game System involves acting or classical role-playing, so that participants interact with real people: real people who have real feelings, who can take offense at something a participant says, who can be lied to by a participant, etc.
  • 7. Story Telling
  • Because the Game System allows participants to experience great adventures without the dangers, the Game System presents the ability to place participants in fantastic story lines as never before. Average people can venture to places never imagined, and interact with a virtual environment that fights back. An accountant who just yesterday was sitting behind a desk can now be dangled above a deep chasm on a swaying rope bridge. How she responds to this stress and stimuli will be different from how a true adventurer might.
  • Current MVWGs are referred to as "sandbox" games. That is, players are given access to the virtual world and left there to make their own stories of victories and defeats. Games such as Everquest do have an ongoing story, but it is glacially slow, and the story progress is external to the players. The closest existing art to our present Game System is the "Epic" quests in Everquest.
  • The stories our present Game System can tell are only limited by the imagination and resources available. Our present Game System is ideal to tell “Indiana Jones” type stories. In a broad sense the stories our present Game System can bring to life have the following components: athletics, problem solving, cooperation and competition.
  • Good Story Telling
  • In the discussion above we referred to good game play. Good game play is the same as good story telling. Good story telling involves choices. Does a participant choose to take the short route over the mountain, or the safer route around the mountain? Does a participant jump into the river, or take a detour to go over a bridge? Does a participant go off the main path to get food, or push forward and go hungry?
  • Our present Game System has a fixed start point and a fixed location contestants must locate and travel to in order to complete the game. Additionally, contestants must complete their journey in a fixed amount of time. If there was a two-lane highway between the start and end locations there would not be much of a story. The challenge in creating a good story is to place physical obstacles and mental obstacles (puzzles) in the path of participants and to give them choices on how to solve and/or go around these obstacles.
  • Types of Games
  • Our present Game System is suitable for three general categories of games: "reality" show; co-operative; and competitive. An ideal iteration of our Game System would involve a video record of a contestant's participation. Therefore attention must be given not only to what makes a good game for a participant, but to what will make an interesting story for an audience. "Reality" shows are known for interpersonal cooperation and competition as well as team competition. Our Game System is suitable for Location Based Entertainment (LBE). An LBE game could be a cooperative game, with all contestants helping each other towards a common goal. This type of activity is very common for Corporate Team Building. An LBE game could be competitive, with either teams or individuals. Paint Ball features competitive teams.
  • Ideally a "reality" show game would take weeks to complete. A Corporate Team Building game would take a weekend to complete. And a competitive game would take at least half a day to complete.
  • Story Telling has been described in generalities, as the story possibilities are endless. Our present Game System could be used to make an "Indiana Jones" game, a Discover the Source of the Nile game, a Lord of the Rings game, a Star Trek game, etc. Whatever story is to be told, the key element is to give contestants choices.
  • An Example of Our Present Game as a “Reality” Show
  • Following is a discussion of an example of our present Game System as a “reality” television show named “Quest! Mayan Jade”. This discussion is intended to illustrate, but not limit, our present invention. There are three components to our present Game System as a “reality” show; contestant's actions in the virtual world, contestant's action in the real world, and what the audience sees.
  • Quest! Mayan Jade. The Virtual World Story
  • By now the entire world knows of Professor Kort, who stumbled down from the Guatemalan Highlands and eventually found his way to Belize City, just 30 days ago. He was covered in strange tattoos and mysterious wounds. He was confined to a psychiatric ward as he told a story of a jungle filled with ghosts, of giant Mayan ruins that no one else has seen, of traveling to the ninth layer of the underworld, and of a “lost” Mayan civilization far in the Chiapas Highlands. The professor claimed to have witnessed ancient Mayan rites being performed as the Mayan Shamans began preparing their world for the end of time in the year 2012.
  • At first people thought the Professor was crazy. But six mysterious individuals (the Benefactors) sent in spies, spread some money around, and found that the only item the Professor carried out of the mountains was a strange piece of jade that appeared to be part of a larger glyph, which might be a map, a calendar, or something else of great importance. Embedded in the jade was a single green diamond of 10 carats, making that single piece worth millions of dollars and holding out the possibility of greater wealth if the other pieces could be found.
  • The Benefactors believe that the Professor's journey started in the recently discovered Mayan city of Chol during Winter Solstice. One witness, who has since disappeared, states that the Professor told him he discovered a tremendous secret by watching the Winter Solstice sun set at the Temple of Chol. This is consistent with findings at other Mayan ruins.
  • Re-supply on the trip to Chol will be a major problem. The Benefactors are setting up hidden supply drops. Teams must make it to these drops at the precise time scheduled—too early will be just as unsafe as being too late. These drops are in some of the worst areas imaginable.
  • Once in Chol, which covers six square miles and contains 10,000 individual structures, teams must make the city give up its secrets. Teams should arrive in Chol two weeks before the Winter Solstice, which does not leave much time to ferret out those secrets. To make things even more difficult, the other teams will not only be searching for the same secrets, but will be actively trying to mislead each other, find and raid the resupply drops, and attack other teams outright. After leaving Chol, teams will encounter an impenetrable jungle that has never been mapped. The Professor spoke of going to the highlands, which means some serious climbing for each team while searching for clues to the next destination. All manner of insects, snakes, and animals stand in each team's way, and the Professor kept muttering about Jungle Ghosts.
  • Once teams reach the ultimate destination, the most difficult puzzle remains to be solved. The Professor kept babbling about a certain glyph, which held both the secret to the future and a stash of green diamonds. Could the jade piece be part of that glyph? Even if any one team could assemble all the parts, will it be able to unravel the mystery in time?
  • Quest! Mayan Jade. Episodic Breakdown:
  • Episodes 1-3. Qualifications/Meet the Players. Potential contestants gather in regional centers to try to qualify for one of the six teams. Each contestant is put through a rigorous battery of physical, mental, and virtual challenges and rated by a panel of judges (made up of an ex-pro sports star, a psychiatrist, and a noted archeologist). Physical prowess is important, but so are quick wits and the ability to work within a team structure while still maintaining one's individuality. Contestants must be able to adapt to the unique interfaces between the real world and the virtual world they will inhabit in the show. Each of the first three episodes highlights two of the regional competitions, with a special emphasis on introducing the ultimate players, as well as the show concept and the unique technology they will be using. At the end of each episode, the final ten contestants for each team are placed into a virtual arena to test their skills. Six are chosen for each team: four chosen by the experts to ensure the necessary breadth of skills, and the remaining two “wild card” players chosen by viewers within the geographic region (local affiliate tie-ins available).
  • Episodes 4-6. “Three Roads To Chol.” In the next three episodes, two of the teams are matched head-to-head as they race to find the lost city of Chol. Although they start at slightly different points, they are near enough that they will ultimately interact along the way. The teams must battle the environment and each other as they find clues, solve puzzles, and run, swim, and swing their way to the ancient city. At the end of each episode, the teams arrive at the outskirts of Chol and must find a way into the city. They can cooperate, go their own way, or fight it out right there (one less team in the competition, right?).
  • Episode 7. “The Secrets of Chol.” In Episode 7, all six teams race around Chol to find its secrets as quickly as they can. There is full interaction among all 36 players, and no team can win on its own, as each will have accumulated a necessary clue. Players are not told this, however, and undoubtedly one or more of the teams will believe they can win by physically beating the others. They will quickly find, however, that when a team is defeated, its clue pieces magically disappear. The teams that recognize quickest that the only enemy is the city will score the most points. Based on the overall scores from the episodes to date, two of the teams are sacrificed to the gods (sent home), leaving the top four teams.
  • Episodes 8-11. “Head To Head in the Jungle.” The secrets of Chol now revealed, the next four episodes pit teams head to head based on their rankings to date as they make their way into the jungle (e.g., Episode 8 pits 1 vs. 4, Episode 9 pits 2 vs. 3, Episode 10 pits 1 vs. 3, and Episode 11 pits 2 vs. 4). The teams are by now comfortable with the equipment, so the physical requirements are increased, with much climbing, rappelling, and walking across narrow rope bridges required. The stakes for both the teams and the audience are raised, as the physical demands and increasingly powerful NPCs begin to eliminate players from each team. Teams with the week off aren't idle, however: they are inserted into the world as NPCs to battle against the competing teams and to steal whatever knowledge they can. Based on the scores after these episodes, two more teams are eliminated.
  • Episode 12. “The Winter Solstice.” The remaining two teams are pitted against each other in one final adventure to see who will solve the riddle of the Maya and collect the rare green diamonds. This episode ties together the entire quest story and forces the teams to put all of their skills and knowledge to use. A climactic scene forces the teams to win, or die, on the ball courts of Chichen Itza. It's winner take all. Or is it? There may still be one more challenge.
  • Episode 13. “All Star Challenge.” The top eight rated players, plus two voted back by the audience, form an all-star team that must solve the last riddle of the Maya, with the future of the world hanging in the balance.
  • A Detailed Look at a Possible Segment from Quest! Mayan Jade. Episode 4: Rockville Raptors vs. Springfield Snakes, Encounter at the River.
  • In the virtual world, two teams, the Rockville Raptors and the Springfield Snakes, are racing each other to the lost City of Chol. Both teams are bicycling down different roads, which end at the same place: small Mayan ruins by the side of a river. The segment starts with both teams on bicycle simulators, with the virtual world projected in front of them.
  • In the virtual world, the Springfield Snakes reach the ruins first. The Snakes indicate to the crew that they wish to dismount from the bikes. In the virtual world, the Snakes' Avatars go into an idle animation for two minutes, the pre-determined allotted time for changing from bicycle simulators to dismounted movement. The Snake participants gather up their gear, which was stored on their bicycle racks, and follow the crew to a room with a mat with a grid of sensors underneath it. The virtual world is displayed in front of them on a large screen. Two minutes after their bicycle simulator dismount, their Avatars are taken out of idle animation and the Snakes are once again controlling their Avatars.
  • In the virtual world, the Springfield Snakes spread out, investigating the ruins. Jim, one of the Snakes, is positioned to watch the two roads that converge at the ruins. Jim sees the Rockville Raptors on their bicycles heading down the road straight at him. Jim prepares his longbow for firing and yells at the Raptors, although he doesn't intend to actually shoot. Steve, the captain of the Rockville Raptors, yells at his team to turn around, bike back up the road, and dismount out of range of the Snakes, not sure whether the Snakes really want an actual battle.
  • In the real world, the Rockville Raptors dismount when they are far enough away from the Snakes to do so without fear of being shot. Once dismounted and formed up, they head back toward the ruins by walking on their treadmills. Meanwhile, Jim decides the Snakes are not going to risk a fight, as he feels the Raptors are better at long-range weapons than his team. Jim signals the Snakes to cross the river in one of the two canoes at the ruins and to destroy the other canoe, so the Raptors cannot easily follow.
  • As the Raptors are walking down the road in the virtual world, in the real world they come to the end of the sound stage. Their Avatars are put into idle mode for 30 seconds as the Raptors get themselves reset. Meanwhile, the Snakes' demolition specialist, Mark, takes out a C4 explosive replica, arms the timer for ten seconds, and drops it on the floor where he sees the second canoe in the virtual world. The Snakes then indicate that they want to board the first canoe. The Snakes' Avatars are put into idle mode for six minutes, as the canoe simulator is physically far away. (Previously, in the early episodes, the Snakes had been timed on how fast they could load up a canoe in real life; they were able to do it in three minutes.) At this point, the show producer/game master puts the Raptors in idle mode for three minutes so the Snakes will not be penalized for changing simulators. The Raptors are thus given a short break as the Snakes get into their canoe simulator. If a raw live feed were going out to an audience, both teams would be seen to be in idle mode.
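  • The idle-mode bookkeeping described in this segment can be sketched as a small state tracker that freezes a team's Avatars while its players change simulators, and lets the game master pause a rival team so a long simulator change carries no penalty. This is a minimal illustrative sketch only; the class, names, and timings below are hypothetical and not part of any actual show control software.

```python
from dataclasses import dataclass

@dataclass
class Team:
    """A team's avatar state on the shared simulation clock."""
    name: str
    idle_until: float = 0.0  # sim-clock second at which the avatars resume

class GameMaster:
    """Hypothetical sketch of idle-mode bookkeeping for simulator changes."""

    def __init__(self) -> None:
        self.clock = 0.0                    # shared simulation clock, seconds
        self.teams: dict[str, Team] = {}

    def add_team(self, name: str) -> None:
        self.teams[name] = Team(name)

    def set_idle(self, name: str, seconds: float) -> None:
        """Put a team's avatars into idle animation for the given duration."""
        team = self.teams[name]
        team.idle_until = max(team.idle_until, self.clock + seconds)

    def is_idle(self, name: str) -> bool:
        return self.teams[name].idle_until > self.clock

    def advance(self, seconds: float) -> None:
        """Advance the shared clock; teams resume as their windows expire."""
        self.clock += seconds

# The river segment: the Snakes idle six minutes to reach the distant canoe
# simulator; the game master idles the Raptors three minutes to equalize.
gm = GameMaster()
gm.add_team("Snakes")
gm.add_team("Raptors")
gm.set_idle("Snakes", 360)   # canoe simulator is physically far away
gm.set_idle("Raptors", 180)  # compensating pause imposed by the game master
```

  On a raw live feed, both teams would read as idle until each window expires on the shared clock; the `max` in `set_idle` simply prevents a new, shorter pause from cutting an existing one short.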
  • In the virtual world, the Raptors continue walking down the road to the ruins. They see the Snakes getting into one canoe and shortly afterward see the other canoe blow up. Since the Snakes now have a lead on the Raptors, Steve decides to examine the ruins. The Raptors find many glyphs. In one set of glyphs they recognize the name of the city they are trying to find: Chol. The Raptors work on deciphering the glyphs. They finally succeed and discover that the glyphs describe a “short cut” to the lost city. The Raptors find the small trail the glyphs indicated and head on out, hopeful that this short cut will enable them to beat the Snakes to Chol.
  • No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items. Where only one item is intended, the term “one” or similar language is used.
  • The above specification and examples provide a complete description of the structure and use of exemplary embodiments of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention is defined by the claims and their equivalents.

Claims (14)

1. A method for a multi-player non-role-playing virtual world game system, comprising:
quantifying a player's real world physical activity;
modifying a player's real world level of difficulty by conditions in a virtual world; and
mapping a player's real world physical activity to an avatar for the purpose of interacting with the virtual world.
2. The method of claim 1 further comprising motion capturing of players.
3. The method of claim 2 wherein motion capturing is done in real time.
4. The method of claim 1 further comprising objects that exist in both the real world and the virtual world and can be interacted with in both.
5. The method of claim 1 further comprising requiring high levels of physical activity.
6. The method of claim 5 wherein the high levels of physical activity are performed on a simulator.
7. The method of claim 6 wherein the simulator's level of difficulty is modified by conditions in the virtual world.
8. The method of claim 6 wherein the simulator is a stair simulator.
9. The method of claim 6 wherein the simulator is a walking/running simulator.
10. The method of claim 6 wherein the simulator is a bicycle simulator.
11. The method of claim 6 wherein the simulator is a swimming simulator.
12. The method of claim 6 wherein the simulator is a canoe simulator.
13. The method of claim 1 wherein the game system is a “reality show” game system.
14. The method of claim 1 wherein the game system is a “location based entertainment” game system.
US11/465,918 2005-08-22 2006-08-21 Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games Abandoned US20080026838A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US71012305P 2005-08-22 2005-08-22
US11/465,918 US20080026838A1 (en) 2005-08-22 2006-08-21 Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games

Publications (1)

Publication Number Publication Date
US20080026838A1 true US20080026838A1 (en) 2008-01-31


US8976986B2 (en) 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
US8982151B2 (en) 2010-06-14 2015-03-17 Microsoft Technology Licensing, Llc Independently processing planes of display data
US8988432B2 (en) 2009-11-05 2015-03-24 Microsoft Technology Licensing, Llc Systems and methods for processing an image for target tracking
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US8988508B2 (en) 2010-09-24 2015-03-24 Microsoft Technology Licensing, Llc. Wide angle field of view active illumination imaging system
US8994718B2 (en) 2010-12-21 2015-03-31 Microsoft Technology Licensing, Llc Skeletal control of three-dimensional virtual world
US9001118B2 (en) 2012-06-21 2015-04-07 Microsoft Technology Licensing, Llc Avatar construction using depth camera
US9008355B2 (en) 2010-06-04 2015-04-14 Microsoft Technology Licensing, Llc Automatic depth camera aiming
US9013489B2 (en) 2011-06-06 2015-04-21 Microsoft Technology Licensing, Llc Generation of avatar reflecting player appearance
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US9019201B2 (en) 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US9052746B2 (en) 2013-02-15 2015-06-09 Microsoft Technology Licensing, Llc User center-of-mass and mass distribution extraction using depth images
US9054764B2 (en) 2007-05-17 2015-06-09 Microsoft Technology Licensing, Llc Sensor array beamformer post-processor
US9067136B2 (en) 2011-03-10 2015-06-30 Microsoft Technology Licensing, Llc Push personalization of interface controls
US9069381B2 (en) 2010-03-12 2015-06-30 Microsoft Technology Licensing, Llc Interacting with a computer based application
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US9086727B2 (en) 2010-06-22 2015-07-21 Microsoft Technology Licensing, Llc Free space directional force feedback apparatus
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9092657B2 (en) 2013-03-13 2015-07-28 Microsoft Technology Licensing, Llc Depth image processing
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9098873B2 (en) 2010-04-01 2015-08-04 Microsoft Technology Licensing, Llc Motion-based interactive shopping environment
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9098110B2 (en) 2011-06-06 2015-08-04 Microsoft Technology Licensing, Llc Head rotation tracking from depth-based center of mass
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9117281B2 (en) 2011-11-02 2015-08-25 Microsoft Corporation Surface segmentation from RGB and depth images
US9123316B2 (en) 2010-12-27 2015-09-01 Microsoft Technology Licensing, Llc Interactive content creation
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9137463B2 (en) 2011-05-12 2015-09-15 Microsoft Technology Licensing, Llc Adaptive high dynamic range camera
US9135516B2 (en) 2013-03-08 2015-09-15 Microsoft Technology Licensing, Llc User body angle, curvature and average extremity positions extraction using depth images
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US20150302664A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Avatar rendering for augmented or virtual reality
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US9195305B2 (en) 2010-01-15 2015-11-24 Microsoft Technology Licensing, Llc Recognizing user intent in motion capture system
US9208571B2 (en) 2011-06-06 2015-12-08 Microsoft Technology Licensing, Llc Object digitization
US9210401B2 (en) 2012-05-03 2015-12-08 Microsoft Technology Licensing, Llc Projected visual cues for guiding physical movement
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9244533B2 (en) 2009-12-17 2016-01-26 Microsoft Technology Licensing, Llc Camera navigation for presentations
US9247238B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
US9251590B2 (en) 2013-01-24 2016-02-02 Microsoft Technology Licensing, Llc Camera pose estimation for 3D reconstruction
US9262673B2 (en) 2009-05-01 2016-02-16 Microsoft Technology Licensing, Llc Human body pose estimation
US9259643B2 (en) 2011-04-28 2016-02-16 Microsoft Technology Licensing, Llc Control of separate computer game elements
US20160049003A1 (en) * 2014-08-12 2016-02-18 Utherverse Digital Inc. Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US9268404B2 (en) 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
US9274606B2 (en) 2013-03-14 2016-03-01 Microsoft Technology Licensing, Llc NUI video conference controls
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US9313376B1 (en) 2009-04-01 2016-04-12 Microsoft Technology Licensing, Llc Dynamic depth power equalization
US9329689B2 (en) 2010-02-28 2016-05-03 Microsoft Technology Licensing, Llc Method and apparatus for biometric data capture
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9342139B2 (en) 2011-12-19 2016-05-17 Microsoft Technology Licensing, Llc Pairing a computing device to a user
US9349040B2 (en) 2010-11-19 2016-05-24 Microsoft Technology Licensing, Llc Bi-modal depth-image analysis
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9384329B2 (en) 2010-06-11 2016-07-05 Microsoft Technology Licensing, Llc Caloric burn determination from body movement
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US9400548B2 (en) 2009-10-19 2016-07-26 Microsoft Technology Licensing, Llc Gesture personalization and profile roaming
US9442186B2 (en) 2013-05-13 2016-09-13 Microsoft Technology Licensing, Llc Interference reduction for TOF systems
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9462253B2 (en) 2013-09-23 2016-10-04 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US9470778B2 (en) 2011-03-29 2016-10-18 Microsoft Technology Licensing, Llc Learning from high quality depth measurements
US9484065B2 (en) 2010-10-15 2016-11-01 Microsoft Technology Licensing, Llc Intelligent determination of replays based on event identification
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US9508385B2 (en) 2013-11-21 2016-11-29 Microsoft Technology Licensing, Llc Audio-visual project generator
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US9551914B2 (en) 2011-03-07 2017-01-24 Microsoft Technology Licensing, Llc Illuminator with refractive optical element
US9557836B2 (en) 2011-11-01 2017-01-31 Microsoft Technology Licensing, Llc Depth image compression
US9557574B2 (en) 2010-06-08 2017-01-31 Microsoft Technology Licensing, Llc Depth illumination and detection optics
US9573068B1 (en) 2016-04-04 2017-02-21 Kipling Martin Virtual reality enhancement device
US9576330B2 (en) 2008-03-07 2017-02-21 Virtually Live (Switzerland) Gmbh Media system and method
US9594430B2 (en) 2011-06-01 2017-03-14 Microsoft Technology Licensing, Llc Three-dimensional foreground selection for vision system
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US9646340B2 (en) 2010-04-01 2017-05-09 Microsoft Technology Licensing, Llc Avatar-based virtual dressing room
US9674563B2 (en) 2013-11-04 2017-06-06 Rovi Guides, Inc. Systems and methods for recommending content
US9672691B2 (en) 2010-08-06 2017-06-06 Bally Gaming, Inc. Controlling wagering game system browser areas
US20170168651A1 (en) * 2014-09-02 2017-06-15 Sony Corporation Information processing apparatus, control method, and program
US9696427B2 (en) 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US9720089B2 (en) 2012-01-23 2017-08-01 Microsoft Technology Licensing, Llc 3D zoom imager
US9724600B2 (en) 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9769459B2 (en) 2013-11-12 2017-09-19 Microsoft Technology Licensing, Llc Power efficient laser diode driver circuit and method
US20170312632A1 (en) * 2016-04-29 2017-11-02 Activision Publishing, Inc. System and method for identifying spawn locations in a video game
US9821224B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
US9823339B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Plural anode time-of-flight sensor
US9836590B2 (en) 2012-06-22 2017-12-05 Microsoft Technology Licensing, Llc Enhanced accuracy of user presence status determination
US9848106B2 (en) 2010-12-21 2017-12-19 Microsoft Technology Licensing, Llc Intelligent gameplay photo capture
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9873055B2 (en) 2015-09-15 2018-01-23 Square Enix Holdings Co., Ltd. Game system including third party control
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US9902109B2 (en) 2008-10-07 2018-02-27 Tripetals, Llc Internet-enabled apparatus, system and methods for physically and virtually rendering three-dimensional objects
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US9953213B2 (en) 2013-03-27 2018-04-24 Microsoft Technology Licensing, Llc Self discovery of autonomous NUI devices
US9971491B2 (en) 2014-01-09 2018-05-15 Microsoft Technology Licensing, Llc Gesture library for natural user input
US10085072B2 (en) 2009-09-23 2018-09-25 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
US10127735B2 (en) 2012-05-01 2018-11-13 Augmented Reality Holdings 2, Llc System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US10188890B2 (en) 2013-12-26 2019-01-29 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US10220259B2 (en) 2012-01-05 2019-03-05 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US10226396B2 (en) 2014-06-20 2019-03-12 Icon Health & Fitness, Inc. Post workout massage device
US10234545B2 (en) 2010-12-01 2019-03-19 Microsoft Technology Licensing, Llc Light source module
US20190099667A1 (en) * 2017-09-29 2019-04-04 Gree, Inc. Game processing program, game processing method, and game processing device
US10257932B2 (en) 2016-02-16 2019-04-09 Microsoft Technology Licensing, Llc. Laser diode chip on printed circuit board
US10252109B2 (en) 2016-05-13 2019-04-09 Icon Health & Fitness, Inc. Weight platform treadmill
US10258828B2 (en) 2015-01-16 2019-04-16 Icon Health & Fitness, Inc. Controls for an exercise device
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10279212B2 (en) 2013-03-14 2019-05-07 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US10293211B2 (en) 2016-03-18 2019-05-21 Icon Health & Fitness, Inc. Coordinated weight selection
US10296587B2 (en) 2011-03-31 2019-05-21 Microsoft Technology Licensing, Llc Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof
US10343017B2 (en) 2016-11-01 2019-07-09 Icon Health & Fitness, Inc. Distance sensor for console positioning
US10357715B2 (en) 2017-07-07 2019-07-23 Buxton Global Enterprises, Inc. Racing simulation
US10376736B2 (en) 2016-10-12 2019-08-13 Icon Health & Fitness, Inc. Cooling an exercise device during a dive motor runway condition
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
US10412280B2 (en) 2016-02-10 2019-09-10 Microsoft Technology Licensing, Llc Camera with light valve over sensor array
US10426989B2 (en) 2014-06-09 2019-10-01 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
US10433612B2 (en) 2014-03-10 2019-10-08 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US10441844B2 (en) 2016-07-01 2019-10-15 Icon Health & Fitness, Inc. Cooling systems and methods for exercise equipment
US10462452B2 (en) 2016-03-16 2019-10-29 Microsoft Technology Licensing, Llc Synchronizing active illumination cameras
US10471299B2 (en) 2016-07-01 2019-11-12 Icon Health & Fitness, Inc. Systems and methods for cooling internal exercise equipment components
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10500473B2 (en) 2016-10-10 2019-12-10 Icon Health & Fitness, Inc. Console positioning
US10537764B2 (en) 2015-08-07 2020-01-21 Icon Health & Fitness, Inc. Emergency stop with magnetic brake for an exercise device
US10539787B2 (en) 2010-02-28 2020-01-21 Microsoft Technology Licensing, Llc Head-worn adaptive display
US10543395B2 (en) 2016-12-05 2020-01-28 Icon Health & Fitness, Inc. Offsetting treadmill deck weight during operation
US20200042160A1 (en) * 2018-06-18 2020-02-06 Alessandro Gabbi System and Method for Providing Virtual-Reality Based Interactive Archives for Therapeutic Interventions, Interactions and Support
US10561894B2 (en) 2016-03-18 2020-02-18 Icon Health & Fitness, Inc. Treadmill with removable supports
US10561877B2 (en) 2016-11-01 2020-02-18 Icon Health & Fitness, Inc. Drop-in pivot configuration for stationary bike
US10585957B2 (en) 2011-03-31 2020-03-10 Microsoft Technology Licensing, Llc Task driven user intents
US10625114B2 (en) 2016-11-01 2020-04-21 Icon Health & Fitness, Inc. Elliptical and stationary bicycle apparatus including row functionality
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10642934B2 (en) 2011-03-31 2020-05-05 Microsoft Technology Licensing, Llc Augmented conversational understanding architecture
US10661114B2 (en) 2016-11-01 2020-05-26 Icon Health & Fitness, Inc. Body weight lift mechanism on treadmill
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
US10671841B2 (en) 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
US10685153B2 (en) * 2018-06-15 2020-06-16 Syscend, Inc. Bicycle sizer
US10702736B2 (en) 2017-01-14 2020-07-07 Icon Health & Fitness, Inc. Exercise cycle
US10726861B2 (en) 2010-11-15 2020-07-28 Microsoft Technology Licensing, Llc Semi-private communication in open environments
US10729965B2 (en) 2017-12-22 2020-08-04 Icon Health & Fitness, Inc. Audible belt guide in a treadmill
US10796494B2 (en) 2011-06-06 2020-10-06 Microsoft Technology Licensing, Llc Adding attributes to virtual representations of real-world objects
US10860100B2 (en) 2010-02-28 2020-12-08 Microsoft Technology Licensing, Llc AR glasses with predictive control of external device based on event input
US10878009B2 (en) 2012-08-23 2020-12-29 Microsoft Technology Licensing, Llc Translating natural language utterances to keyword search queries
US10953305B2 (en) 2015-08-26 2021-03-23 Icon Health & Fitness, Inc. Strength exercise mechanisms
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US11451108B2 (en) 2017-08-16 2022-09-20 Ifit Inc. Systems and methods for axial impact resistance in electric motors
US20230077369A1 (en) * 2017-03-13 2023-03-16 Holodia Inc. Method for generating multimedia data associated with a system for practicing sports
CN117292094A (en) * 2023-11-23 2023-12-26 南昌菱形信息技术有限公司 Digitalized application method and system for performance theatre in karst cave
US11876685B1 (en) 2021-05-19 2024-01-16 Amazon Technologies, Inc. Locally predicting state using a componentized entity simulation
US11909601B1 (en) 2021-06-17 2024-02-20 Amazon Technologies, Inc. Implementing a scalable 3D simulation using a distributed 3D keyspace

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6132314A (en) * 1997-05-23 2000-10-17 Namco Limited Operational input device for simulator

Cited By (512)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US8707216B2 (en) 2002-02-07 2014-04-22 Microsoft Corporation Controlling objects via gesturing
US20080204411A1 (en) * 2002-02-07 2008-08-28 Microsoft Corporation Recognizing a movement of a pointing device
US20080204410A1 (en) * 2002-02-07 2008-08-28 Microsoft Corporation Recognizing a motion of a pointing device
US10488950B2 (en) 2002-02-07 2019-11-26 Microsoft Technology Licensing, Llc Manipulating an object utilizing a pointing device
US20080259055A1 (en) * 2002-02-07 2008-10-23 Microsoft Corporation Manipulating An Object Utilizing A Pointing Device
US20090198354A1 (en) * 2002-02-07 2009-08-06 Microsoft Corporation Controlling objects via gesturing
US8456419B2 (en) 2002-02-07 2013-06-04 Microsoft Corporation Determining a position of a pointing device
US10331228B2 (en) 2002-02-07 2019-06-25 Microsoft Technology Licensing, Llc System and method for determining 3D orientation of a pointing device
US9454244B2 (en) 2002-02-07 2016-09-27 Microsoft Technology Licensing, Llc Recognizing a movement of a pointing device
US10551930B2 (en) 2003-03-25 2020-02-04 Microsoft Technology Licensing, Llc System and method for executing a process using accelerometer signals
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US9652042B2 (en) 2003-03-25 2017-05-16 Microsoft Technology Licensing, Llc Architecture for controlling a computer using hand gestures
US20100146464A1 (en) * 2003-03-25 2010-06-10 Microsoft Corporation Architecture For Controlling A Computer Using Hand Gestures
US20140168269A1 (en) * 2005-01-19 2014-06-19 International Business Machines Corporation Morphing a data center in a virtual world
US9390467B2 (en) * 2005-01-19 2016-07-12 International Business Machines Corporation Morphing a data center in a virtual world
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US20080001951A1 (en) * 2006-05-07 2008-01-03 Sony Computer Entertainment Inc. System and method for providing affective characteristics to computer generated avatar during gameplay
US20080096665A1 (en) * 2006-10-18 2008-04-24 Ariel Cohen System and a method for a reality role playing game genre
US20120129600A1 (en) * 2007-03-01 2012-05-24 Sony Computer Entertainment Europe Limited Entertainment device and method
US20080215974A1 (en) * 2007-03-01 2008-09-04 Phil Harrison Interactive user controlled avatar animations
US9446320B2 (en) * 2007-03-01 2016-09-20 Sony Computer Entertainment Europe Limited Inserting an operator avatar into an online virtual environment
US8678922B2 (en) * 2007-03-01 2014-03-25 Sony Computer Entertainment Europe Limited Entertainment device and method
US9345970B2 (en) 2007-03-01 2016-05-24 Sony Computer Entertainment Europe Limited Switching operation of an entertainment device and method thereof
US20090318224A1 (en) * 2007-03-01 2009-12-24 Sony Computer Entertainment Europe Limited Entertainment device and method
US8818002B2 (en) 2007-03-22 2014-08-26 Microsoft Corp. Robust adaptive beamforming with enhanced noise suppression
US9054764B2 (en) 2007-05-17 2015-06-09 Microsoft Technology Licensing, Llc Sensor array beamformer post-processor
US8629976B2 (en) 2007-10-02 2014-01-14 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (TOF) systems
US8128469B2 (en) * 2008-03-07 2012-03-06 Virtually Live Ltd. Media system and method
US10272340B2 (en) 2008-03-07 2019-04-30 Virtually Live (Switzerland) Gmbh Media system and method
US20100331065A1 (en) * 2008-03-07 2010-12-30 Virtually Live Limited Media System and Method
US20170182421A1 (en) * 2008-03-07 2017-06-29 Virtually Live (Switzerland) Gmbh Media system and method
US9576330B2 (en) 2008-03-07 2017-02-21 Virtually Live (Switzerland) Gmbh Media system and method
US9968853B2 (en) * 2008-03-07 2018-05-15 Virtually Live (Switzerland) Gmbh Media system and method
US8385557B2 (en) 2008-06-19 2013-02-26 Microsoft Corporation Multichannel acoustic echo reduction
US9264807B2 (en) 2008-06-19 2016-02-16 Microsoft Technology Licensing, Llc Multichannel acoustic echo reduction
US8325909B2 (en) 2008-06-25 2012-12-04 Microsoft Corporation Acoustic echo suppression
US9052382B2 (en) 2008-06-30 2015-06-09 Microsoft Technology Licensing, Llc System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8363212B2 (en) 2008-06-30 2013-01-29 Microsoft Corporation System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8587773B2 (en) 2008-06-30 2013-11-19 Microsoft Corporation System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US20100029384A1 (en) * 2008-07-22 2010-02-04 Sony Online Entertainment Llc System and method for physics interactions in a simulation
US8554526B2 (en) 2008-07-22 2013-10-08 Sony Online Entertainment Llc System and method for physics interactions in a simulation
US8444463B2 (en) * 2008-08-05 2013-05-21 Konami Digital Entertainment Co., Ltd. Game device, method for controlling game device, program, and information storage medium
US20110136557A1 (en) * 2008-08-05 2011-06-09 Konami Digital Entertainment Co., Ltd. Game device, method for controlling game device, program, and information storage medium
US8388442B2 (en) * 2008-09-04 2013-03-05 International Business Machines Corporation Prevention of a user mimicking another user in a virtual world
US20100057715A1 (en) * 2008-09-04 2010-03-04 International Business Machines Corporation Prevention of a User Mimicking Another User in a Virtual World
US20100088650A1 (en) * 2008-10-07 2010-04-08 Christopher Kaltenbach Internet-enabled apparatus, system and methods for physically and virtually rendering three-dimensional objects
US10486365B2 (en) 2008-10-07 2019-11-26 Tripetals, Llc Internet-enabled apparatus, system and methods for physically and virtually rendering three-dimensional objects
US11235530B2 (en) 2008-10-07 2022-02-01 Tripetals, Llc Internet-enabled apparatus, system and methods for physically and virtually rendering three-dimensional objects
US9902109B2 (en) 2008-10-07 2018-02-27 Tripetals, Llc Internet-enabled apparatus, system and methods for physically and virtually rendering three-dimensional objects
US8460107B2 (en) 2008-10-09 2013-06-11 Wms Gaming, Inc. Controlling and presenting virtual wagering game environments
WO2010042773A1 (en) * 2008-10-09 2010-04-15 Wms Gaming, Inc. Controlling and presenting virtual wagering game environments
US20110190066A1 (en) * 2008-10-09 2011-08-04 Wms Gaming, Inc. Controlling and presenting virtual wagering game environments
US9641825B2 (en) 2009-01-04 2017-05-02 Microsoft International Holdings B.V. Gated 3D camera
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US8553939B2 (en) 2009-01-30 2013-10-08 Microsoft Corporation Pose tracking pipeline
US8267781B2 (en) 2009-01-30 2012-09-18 Microsoft Corporation Visual target tracking
US8860663B2 (en) 2009-01-30 2014-10-14 Microsoft Corporation Pose tracking pipeline
US8565476B2 (en) 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US8294767B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US8565485B2 (en) 2009-01-30 2013-10-22 Microsoft Corporation Pose tracking pipeline
US8577085B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US8565477B2 (en) 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US20110234490A1 (en) * 2009-01-30 2011-09-29 Microsoft Corporation Predictive Determination
US8897493B2 (en) 2009-01-30 2014-11-25 Microsoft Corporation Body scan
US8577084B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US20110032336A1 (en) * 2009-01-30 2011-02-10 Microsoft Corporation Body scan
US20100199229A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Mapping a natural input device to a legacy system
US9465980B2 (en) 2009-01-30 2016-10-11 Microsoft Technology Licensing, Llc Pose tracking pipeline
US9007417B2 (en) 2009-01-30 2015-04-14 Microsoft Technology Licensing, Llc Body scan
US8448094B2 (en) 2009-01-30 2013-05-21 Microsoft Corporation Mapping a natural input device to a legacy system
US8869072B2 (en) 2009-01-30 2014-10-21 Microsoft Corporation Gesture recognizer system architecture
US8578302B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Predictive determination
US9280203B2 (en) 2009-01-30 2016-03-08 Microsoft Technology Licensing, Llc Gesture recognizer system architecture
US8782567B2 (en) 2009-01-30 2014-07-15 Microsoft Corporation Gesture recognizer system architecture
US8588465B2 (en) 2009-01-30 2013-11-19 Microsoft Corporation Visual target tracking
US8682028B2 (en) 2009-01-30 2014-03-25 Microsoft Corporation Visual target tracking
US9842405B2 (en) 2009-01-30 2017-12-12 Microsoft Technology Licensing, Llc Visual target tracking
US9039528B2 (en) 2009-01-30 2015-05-26 Microsoft Technology Licensing, Llc Visual target tracking
US8467574B2 (en) 2009-01-30 2013-06-18 Microsoft Corporation Body scan
US8487938B2 (en) 2009-01-30 2013-07-16 Microsoft Corporation Standard Gestures
US9607213B2 (en) 2009-01-30 2017-03-28 Microsoft Technology Licensing, Llc Body scan
US8773355B2 (en) 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US9824480B2 (en) 2009-03-20 2017-11-21 Microsoft Technology Licensing, Llc Chaining animations
US9478057B2 (en) 2009-03-20 2016-10-25 Microsoft Technology Licensing, Llc Chaining animations
US20100241998A1 (en) * 2009-03-20 2010-09-23 Microsoft Corporation Virtual object manipulation
US9313376B1 (en) 2009-04-01 2016-04-12 Microsoft Technology Licensing, Llc Dynamic depth power equalization
US20100281433A1 (en) * 2009-04-29 2010-11-04 International Business Machines Corporation Computer Method and Apparatus Specifying Avatar Entrance and Exit
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US9519828B2 (en) 2009-05-01 2016-12-13 Microsoft Technology Licensing, Llc Isolate extraneous motions
US9910509B2 (en) 2009-05-01 2018-03-06 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US8451278B2 (en) 2009-05-01 2013-05-28 Microsoft Corporation Determine intended motions
US9191570B2 (en) 2009-05-01 2015-11-17 Microsoft Technology Licensing, Llc Systems and methods for detecting a tilt angle from a depth image
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US20110210915A1 (en) * 2009-05-01 2011-09-01 Microsoft Corporation Human Body Pose Estimation
US9298263B2 (en) 2009-05-01 2016-03-29 Microsoft Technology Licensing, Llc Show body position
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US8762894B2 (en) 2009-05-01 2014-06-24 Microsoft Corporation Managing virtual ports
US9262673B2 (en) 2009-05-01 2016-02-16 Microsoft Technology Licensing, Llc Human body pose estimation
US20100281432A1 (en) * 2009-05-01 2010-11-04 Kevin Geisner Show body position
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US9519970B2 (en) 2009-05-01 2016-12-13 Microsoft Technology Licensing, Llc Systems and methods for detecting a tilt angle from a depth image
US8660303B2 (en) 2009-05-01 2014-02-25 Microsoft Corporation Detection of body and props
US10210382B2 (en) 2009-05-01 2019-02-19 Microsoft Technology Licensing, Llc Human body pose estimation
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US8503766B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US9524024B2 (en) 2009-05-01 2016-12-20 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US20100295847A1 (en) * 2009-05-21 2010-11-25 Microsoft Corporation Differential model analysis within a virtual world
US20100299640A1 (en) * 2009-05-21 2010-11-25 Microsoft Corporation Tracking in a virtual world
WO2010138344A3 (en) * 2009-05-27 2011-04-07 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US20100306825A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100304804A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method of simulated objects and applications thereof
US10855683B2 (en) 2009-05-27 2020-12-01 Samsung Electronics Co., Ltd. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100302143A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US8303387B2 (en) 2009-05-27 2012-11-06 Zambala Lllp System and method of simulated objects and applications thereof
US8745494B2 (en) 2009-05-27 2014-06-03 Zambala Lllp System and method for control of a simulated object that is associated with a physical location in the real world environment
WO2010138344A2 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US11765175B2 (en) 2009-05-27 2023-09-19 Samsung Electronics Co., Ltd. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100306084A1 (en) * 2009-05-28 2010-12-02 Yunus Ciptawilangga Need-based online virtual reality ecommerce system
US20100306121A1 (en) * 2009-05-28 2010-12-02 Yunus Ciptawilangga Selling and delivering real goods and services within a virtual reality world
US20100306120A1 (en) * 2009-05-28 2010-12-02 Yunus Ciptawilangga Online merchandising and ecommerce with virtual reality simulation of an actual retail location
US20110078052A1 (en) * 2009-05-28 2011-03-31 Yunus Ciptawilangga Virtual reality ecommerce with linked user and avatar benefits
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US9215478B2 (en) 2009-05-29 2015-12-15 Microsoft Technology Licensing, Llc Protocol and format for communicating an image from a camera to a computing environment
US9943755B2 (en) 2009-05-29 2018-04-17 Microsoft Technology Licensing, Llc Device for identifying and tracking multiple humans over time
US8856691B2 (en) 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US8542252B2 (en) 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US8803889B2 (en) 2009-05-29 2014-08-12 Microsoft Corporation Systems and methods for applying animations or motions to a character
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8351652B2 (en) 2009-05-29 2013-01-08 Microsoft Corporation Systems and methods for tracking a model
US9861886B2 (en) 2009-05-29 2018-01-09 Microsoft Technology Licensing, Llc Systems and methods for applying animations or motions to a character
US8009022B2 (en) 2009-05-29 2011-08-30 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US20100302365A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Depth Image Noise Reduction
US20100302247A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Target digitization, extraction, and tracking
US8896721B2 (en) 2009-05-29 2014-11-25 Microsoft Corporation Environment and/or target segmentation
US10691216B2 (en) 2009-05-29 2020-06-23 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US20100302257A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems and Methods For Applying Animations or Motions to a Character
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US9656162B2 (en) 2009-05-29 2017-05-23 Microsoft Technology Licensing, Llc Device for identifying and tracking multiple humans over time
US9569005B2 (en) 2009-05-29 2017-02-14 Microsoft Technology Licensing, Llc Method and system implementing user-centric gesture control
US8693724B2 (en) 2009-05-29 2014-04-08 Microsoft Corporation Method and system implementing user-centric gesture control
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US10486065B2 (en) 2009-05-29 2019-11-26 Microsoft Technology Licensing, Llc Systems and methods for immersive interaction with virtual objects
US8660310B2 (en) 2009-05-29 2014-02-25 Microsoft Corporation Systems and methods for tracking a model
US20100303297A1 (en) * 2009-05-30 2010-12-02 Anton Mikhailov Color calibration for object tracking
US8917240B2 (en) 2009-06-01 2014-12-23 Microsoft Corporation Virtual desktop coordinate transformation
US8487871B2 (en) 2009-06-01 2013-07-16 Microsoft Corporation Virtual desktop coordinate transformation
US20100325189A1 (en) * 2009-06-23 2010-12-23 Microsoft Corporation Evidence-based virtual world visualization
US8972476B2 (en) 2009-06-23 2015-03-03 Microsoft Technology Licensing, Llc Evidence-based virtual world visualization
US8390680B2 (en) 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9519989B2 (en) 2009-07-09 2016-12-13 Microsoft Technology Licensing, Llc Visual representation expression based on player expression
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US8264536B2 (en) 2009-08-25 2012-09-11 Microsoft Corporation Depth-sensitive imaging via polarization-state mapping
US9141193B2 (en) 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US20110055846A1 (en) * 2009-08-31 2011-03-03 Microsoft Corporation Techniques for using human gestures to control gesture unaware programs
US9063001B2 (en) 2009-09-14 2015-06-23 Microsoft Technology Licensing, Llc Optical fault monitoring
US8508919B2 (en) 2009-09-14 2013-08-13 Microsoft Corporation Separation of electrical and optical components
US8330134B2 (en) 2009-09-14 2012-12-11 Microsoft Corporation Optical fault monitoring
US8428340B2 (en) 2009-09-21 2013-04-23 Microsoft Corporation Screen space plane identification
US8976986B2 (en) 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
US8760571B2 (en) 2009-09-21 2014-06-24 Microsoft Corporation Alignment of lens and image sensor
US8908091B2 (en) 2009-09-21 2014-12-09 Microsoft Corporation Alignment of lens and image sensor
US10085072B2 (en) 2009-09-23 2018-09-25 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
US8452087B2 (en) 2009-09-30 2013-05-28 Microsoft Corporation Image selection techniques
US20110075921A1 (en) * 2009-09-30 2011-03-31 Microsoft Corporation Image Selection Techniques
US8723118B2 (en) 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
US9522328B2 (en) 2009-10-07 2016-12-20 Microsoft Technology Licensing, Llc Human tracking system
US8891827B2 (en) 2009-10-07 2014-11-18 Microsoft Corporation Systems and methods for tracking a model
US8867820B2 (en) 2009-10-07 2014-10-21 Microsoft Corporation Systems and methods for removing a background of an image
US8861839B2 (en) 2009-10-07 2014-10-14 Microsoft Corporation Human tracking system
US9659377B2 (en) 2009-10-07 2017-05-23 Microsoft Technology Licensing, Llc Methods and systems for determining and tracking extremities of a target
US9679390B2 (en) 2009-10-07 2017-06-13 Microsoft Technology Licensing, Llc Systems and methods for removing a background of an image
US8963829B2 (en) 2009-10-07 2015-02-24 Microsoft Corporation Methods and systems for determining and tracking extremities of a target
US9821226B2 (en) 2009-10-07 2017-11-21 Microsoft Technology Licensing, Llc Human tracking system
US8564534B2 (en) 2009-10-07 2013-10-22 Microsoft Corporation Human tracking system
US8325984B2 (en) 2009-10-07 2012-12-04 Microsoft Corporation Systems and methods for tracking a model
US8483436B2 (en) 2009-10-07 2013-07-09 Microsoft Corporation Systems and methods for tracking a model
US9582717B2 (en) 2009-10-07 2017-02-28 Microsoft Technology Licensing, Llc Systems and methods for tracking a model
US8897495B2 (en) 2009-10-07 2014-11-25 Microsoft Corporation Systems and methods for tracking a model
US20110081044A1 (en) * 2009-10-07 2011-04-07 Microsoft Corporation Systems And Methods For Removing A Background Of An Image
US8970487B2 (en) 2009-10-07 2015-03-03 Microsoft Technology Licensing, Llc Human tracking system
US8542910B2 (en) 2009-10-07 2013-09-24 Microsoft Corporation Human tracking system
US9400548B2 (en) 2009-10-19 2016-07-26 Microsoft Technology Licensing, Llc Gesture personalization and profile roaming
US8988432B2 (en) 2009-11-05 2015-03-24 Microsoft Technology Licensing, Llc Systems and methods for processing an image for target tracking
US8843857B2 (en) 2009-11-19 2014-09-23 Microsoft Corporation Distance scalable no touch computing
US20110119640A1 (en) * 2009-11-19 2011-05-19 Microsoft Corporation Distance scalable no touch computing
US10048763B2 (en) 2009-11-19 2018-08-14 Microsoft Technology Licensing, Llc Distance scalable no touch computing
US20110143834A1 (en) * 2009-12-15 2011-06-16 Wms Gaming, Inc. Location-based customization of avatars in gaming systems
US9244533B2 (en) 2009-12-17 2016-01-26 Microsoft Technology Licensing, Llc Camera navigation for presentations
US8588517B2 (en) 2009-12-18 2013-11-19 Microsoft Corporation Motion detection using depth images
US8374423B2 (en) 2009-12-18 2013-02-12 Microsoft Corporation Motion detection using depth images
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US9019201B2 (en) 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US10398972B2 (en) 2010-01-08 2019-09-03 Microsoft Technology Licensing, Llc Assigning gesture dictionaries
US9468848B2 (en) 2010-01-08 2016-10-18 Microsoft Technology Licensing, Llc Assigning gesture dictionaries
US9268404B2 (en) 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
US8631355B2 (en) 2010-01-08 2014-01-14 Microsoft Corporation Assigning gesture dictionaries
US8933884B2 (en) 2010-01-15 2015-01-13 Microsoft Corporation Tracking groups of users in motion capture system
US9195305B2 (en) 2010-01-15 2015-11-24 Microsoft Technology Licensing, Llc Recognizing user intent in motion capture system
US8676581B2 (en) 2010-01-22 2014-03-18 Microsoft Corporation Speech recognition analysis via identification information
US8265341B2 (en) * 2010-01-25 2012-09-11 Microsoft Corporation Voice-body identity correlation
US20110182481A1 (en) * 2010-01-25 2011-07-28 Microsoft Corporation Voice-body identity correlation
CN102135882A (en) * 2010-01-25 2011-07-27 微软公司 Voice-body identity correlation
US20120327193A1 (en) * 2010-01-25 2012-12-27 Microsoft Corporation Voice-body identity correlation
US8781156B2 (en) * 2010-01-25 2014-07-15 Microsoft Corporation Voice-body identity correlation
US9278287B2 (en) 2010-01-29 2016-03-08 Microsoft Technology Licensing, Llc Visual based identity tracking
US8926431B2 (en) 2010-01-29 2015-01-06 Microsoft Corporation Visual based identity tracking
US8864581B2 (en) 2010-01-29 2014-10-21 Microsoft Corporation Visual based identity tracking
US8891067B2 (en) 2010-02-01 2014-11-18 Microsoft Corporation Multiple synchronized optical sources for time-of-flight range finding systems
US10113868B2 (en) 2010-02-01 2018-10-30 Microsoft Technology Licensing, Llc Multiple synchronized optical sources for time-of-flight range finding systems
US9566503B2 (en) 2010-02-02 2017-02-14 Nintendo Co., Ltd. Massively single-playing online game
US10500500B2 (en) 2010-02-02 2019-12-10 Nintendo Co., Ltd. Massively single-playing online game
US20110190062A1 (en) * 2010-02-02 2011-08-04 Nintendo Of America Inc. Massively single-playing online game
US8687044B2 (en) 2010-02-02 2014-04-01 Microsoft Corporation Depth camera compatibility
US10994207B2 (en) 2010-02-02 2021-05-04 Nintendo Co., Ltd. Massively single-playing online game
US8619122B2 (en) 2010-02-02 2013-12-31 Microsoft Corporation Depth camera compatibility
US8717469B2 (en) 2010-02-03 2014-05-06 Microsoft Corporation Fast gating photosurface
US8659658B2 (en) 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US8499257B2 (en) 2010-02-09 2013-07-30 Microsoft Corporation Handles interactions for human—computer interface
US20110199302A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Capturing screen objects using a collision volume
US8633890B2 (en) 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
US8928579B2 (en) 2010-02-22 2015-01-06 Andrew David Wilson Interacting with an omni-directionally projected display
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US10539787B2 (en) 2010-02-28 2020-01-21 Microsoft Technology Licensing, Llc Head-worn adaptive display
US10860100B2 (en) 2010-02-28 2020-12-08 Microsoft Technology Licensing, Llc AR glasses with predictive control of external device based on event input
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US10268888B2 (en) 2010-02-28 2019-04-23 Microsoft Technology Licensing, Llc Method and apparatus for biometric data capture
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9329689B2 (en) 2010-02-28 2016-05-03 Microsoft Technology Licensing, Llc Method and apparatus for biometric data capture
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9875406B2 (en) 2010-02-28 2018-01-23 Microsoft Technology Licensing, Llc Adjustable extension for temple arm
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US20120194549A1 (en) * 2010-02-28 2012-08-02 Osterhout Group, Inc. Ar glasses specific user interface based on a connected external device type
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US8655069B2 (en) 2010-03-05 2014-02-18 Microsoft Corporation Updating image segmentation following user input
US20110216976A1 (en) * 2010-03-05 2011-09-08 Microsoft Corporation Updating Image Segmentation Following User Input
US8644609B2 (en) 2010-03-05 2014-02-04 Microsoft Corporation Up-sampling binary images for segmentation
US20110216965A1 (en) * 2010-03-05 2011-09-08 Microsoft Corporation Image Segmentation Using Reduced Foreground Training Data
US8787658B2 (en) 2010-03-05 2014-07-22 Microsoft Corporation Image segmentation using reduced foreground training data
US8422769B2 (en) 2010-03-05 2013-04-16 Microsoft Corporation Image segmentation using reduced foreground training data
US8411948B2 (en) 2010-03-05 2013-04-02 Microsoft Corporation Up-sampling binary images for segmentation
US9069381B2 (en) 2010-03-12 2015-06-30 Microsoft Technology Licensing, Llc Interacting with a computer based application
US8279418B2 (en) 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
US9147253B2 (en) 2010-03-17 2015-09-29 Microsoft Technology Licensing, Llc Raster scanning for depth detection
US8213680B2 (en) 2010-03-19 2012-07-03 Microsoft Corporation Proxy training data for human body tracking
US8514269B2 (en) 2010-03-26 2013-08-20 Microsoft Corporation De-aliasing depth images
US8523667B2 (en) 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US9031103B2 (en) 2010-03-31 2015-05-12 Microsoft Technology Licensing, Llc Temperature measurement and control for laser and light-emitting diodes
US8605763B2 (en) 2010-03-31 2013-12-10 Microsoft Corporation Temperature measurement and control for laser and light-emitting diodes
US9646340B2 (en) 2010-04-01 2017-05-09 Microsoft Technology Licensing, Llc Avatar-based virtual dressing room
US9098873B2 (en) 2010-04-01 2015-08-04 Microsoft Technology Licensing, Llc Motion-based interactive shopping environment
US8452051B1 (en) 2010-04-26 2013-05-28 Microsoft Corporation Hand-location post-process refinement in a tracking system
US8351651B2 (en) 2010-04-26 2013-01-08 Microsoft Corporation Hand-location post-process refinement in a tracking system
US8611607B2 (en) 2010-04-29 2013-12-17 Microsoft Corporation Multiple centroid condensation of probability distribution clouds
US8379919B2 (en) 2010-04-29 2013-02-19 Microsoft Corporation Multiple centroid condensation of probability distribution clouds
US8284847B2 (en) 2010-05-03 2012-10-09 Microsoft Corporation Detecting motion for a multifunction sensor device
US8885890B2 (en) 2010-05-07 2014-11-11 Microsoft Corporation Depth map confidence filtering
US8498481B2 (en) 2010-05-07 2013-07-30 Microsoft Corporation Image segmentation using star-convexity constraints
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US9958952B2 (en) 2010-06-02 2018-05-01 Microsoft Technology Licensing, Llc Recognition system for sharing information
US9491226B2 (en) 2010-06-02 2016-11-08 Microsoft Technology Licensing, Llc Recognition system for sharing information
US8803888B2 (en) 2010-06-02 2014-08-12 Microsoft Corporation Recognition system for sharing information
US8602887B2 (en) 2010-06-03 2013-12-10 Microsoft Corporation Synthesis of information from multiple audiovisual sources
US9098493B2 (en) 2010-06-04 2015-08-04 Microsoft Technology Licensing, Llc Machine based sign language interpreter
US8751215B2 (en) 2010-06-04 2014-06-10 Microsoft Corporation Machine based sign language interpreter
US9008355B2 (en) 2010-06-04 2015-04-14 Microsoft Technology Licensing, Llc Automatic depth camera aiming
US9557574B2 (en) 2010-06-08 2017-01-31 Microsoft Technology Licensing, Llc Depth illumination and detection optics
US8330822B2 (en) 2010-06-09 2012-12-11 Microsoft Corporation Thermally-tuned depth camera light source
US9384329B2 (en) 2010-06-11 2016-07-05 Microsoft Technology Licensing, Llc Caloric burn determination from body movement
US8675981B2 (en) 2010-06-11 2014-03-18 Microsoft Corporation Multi-modal gender recognition including depth data
US8749557B2 (en) 2010-06-11 2014-06-10 Microsoft Corporation Interacting with user interface via avatar
US9292083B2 (en) 2010-06-11 2016-03-22 Microsoft Technology Licensing, Llc Interacting with user interface via avatar
US8982151B2 (en) 2010-06-14 2015-03-17 Microsoft Technology Licensing, Llc Independently processing planes of display data
US8558873B2 (en) 2010-06-16 2013-10-15 Microsoft Corporation Use of wavefront coding to create a depth image
US8670029B2 (en) 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US10534438B2 (en) 2010-06-18 2020-01-14 Microsoft Technology Licensing, Llc Compound gesture-speech commands
US8296151B2 (en) 2010-06-18 2012-10-23 Microsoft Corporation Compound gesture-speech commands
US9274747B2 (en) 2010-06-21 2016-03-01 Microsoft Technology Licensing, Llc Natural user input for driving interactive stories
US8381108B2 (en) 2010-06-21 2013-02-19 Microsoft Corporation Natural user input for driving interactive stories
US8416187B2 (en) 2010-06-22 2013-04-09 Microsoft Corporation Item navigation using motion-capture data
US8878656B2 (en) 2010-06-22 2014-11-04 Microsoft Corporation Providing directional force feedback in free space
US9086727B2 (en) 2010-06-22 2015-07-21 Microsoft Technology Licensing, Llc Free space directional force feedback apparatus
US20140287828A1 (en) * 2010-07-16 2014-09-25 enVie Interactive LLC Unlocking content in a virtual environment
US20120015699A1 (en) * 2010-07-16 2012-01-19 enVie Interactive LLC Unlocking content in a virtual environment
US9619959B2 (en) 2010-08-06 2017-04-11 Bally Gaming, Inc. Wagering game presentation with multiple technology containers in a web browser
US9269220B2 (en) 2010-08-06 2016-02-23 Bally Gaming, Inc. Web page constructions with different technology containers
US9672691B2 (en) 2010-08-06 2017-06-06 Bally Gaming, Inc. Controlling wagering game system browser areas
US10186111B2 (en) 2010-08-06 2019-01-22 Bally Gaming, Inc. Controlling wagering game system browser areas
US8911294B2 (en) 2010-08-06 2014-12-16 Wms Gaming, Inc. Browser based heterogenous technology ecosystem
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US8613666B2 (en) 2010-08-31 2013-12-24 Microsoft Corporation User selection and navigation based on looped motions
US8437506B2 (en) 2010-09-07 2013-05-07 Microsoft Corporation System for fast, probabilistic skeletal tracking
US8968091B2 (en) 2010-09-07 2015-03-03 Microsoft Technology Licensing, Llc Scalable real-time motion recognition
US8953844B2 (en) 2010-09-07 2015-02-10 Microsoft Technology Licensing, Llc System for fast, probabilistic skeletal tracking
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US8988508B2 (en) 2010-09-24 2015-03-24 Microsoft Technology Licensing, Llc Wide angle field of view active illumination imaging system
US8681255B2 (en) 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
US8548270B2 (en) 2010-10-04 2013-10-01 Microsoft Corporation Time-of-flight depth imaging
US8983233B2 (en) 2010-10-04 2015-03-17 Microsoft Technology Licensing, Llc Time-of-flight depth imaging
US9484065B2 (en) 2010-10-15 2016-11-01 Microsoft Technology Licensing, Llc Intelligent determination of replays based on event identification
US9291449B2 (en) 2010-11-02 2016-03-22 Microsoft Technology Licensing, Llc Detection of configuration changes among optical elements of illumination system
US8592739B2 (en) 2010-11-02 2013-11-26 Microsoft Corporation Detection of configuration changes of an optical element in an illumination system
US8866889B2 (en) 2010-11-03 2014-10-21 Microsoft Corporation In-home depth camera calibration
US8667519B2 (en) 2010-11-12 2014-03-04 Microsoft Corporation Automatic passive and anonymous feedback system
US10726861B2 (en) 2010-11-15 2020-07-28 Microsoft Technology Licensing, Llc Semi-private communication in open environments
US9349040B2 (en) 2010-11-19 2016-05-24 Microsoft Technology Licensing, Llc Bi-modal depth-image analysis
US10234545B2 (en) 2010-12-01 2019-03-19 Microsoft Technology Licensing, Llc Light source module
US8553934B2 (en) 2010-12-08 2013-10-08 Microsoft Corporation Orienting the position of a sensor
US8618405B2 (en) 2010-12-09 2013-12-31 Microsoft Corp. Free-space gesture musical instrument digital interface (MIDI) controller
US8408706B2 (en) 2010-12-13 2013-04-02 Microsoft Corporation 3D gaze tracker
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
US8920241B2 (en) 2010-12-15 2014-12-30 Microsoft Corporation Gesture controlled persistent handles for interface guides
US8884968B2 (en) 2010-12-15 2014-11-11 Microsoft Corporation Modeling an object from image data
US8775916B2 (en) 2010-12-17 2014-07-08 Microsoft Corporation Validation analysis of human target
US8448056B2 (en) 2010-12-17 2013-05-21 Microsoft Corporation Validation analysis of human target
US8803952B2 (en) 2010-12-20 2014-08-12 Microsoft Corporation Plural detector time-of-flight depth mapping
US8385596B2 (en) 2010-12-21 2013-02-26 Microsoft Corporation First person shooter control with virtual skeleton
US9823339B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Plural anode time-of-flight sensor
US9821224B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
US8994718B2 (en) 2010-12-21 2015-03-31 Microsoft Technology Licensing, Llc Skeletal control of three-dimensional virtual world
US9489053B2 (en) 2010-12-21 2016-11-08 Microsoft Technology Licensing, Llc Skeletal control of three-dimensional virtual world
US9848106B2 (en) 2010-12-21 2017-12-19 Microsoft Technology Licensing, Llc Intelligent gameplay photo capture
US9123316B2 (en) 2010-12-27 2015-09-01 Microsoft Technology Licensing, Llc Interactive content creation
US9529566B2 (en) 2010-12-27 2016-12-27 Microsoft Technology Licensing, Llc Interactive content creation
US8488888B2 (en) 2010-12-28 2013-07-16 Microsoft Corporation Classification of posture states
US8401242B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Real-time camera tracking using depth maps
US8587583B2 (en) 2011-01-31 2013-11-19 Microsoft Corporation Three-dimensional environment reconstruction
US9247238B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
US10049458B2 (en) 2011-01-31 2018-08-14 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
US8401225B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Moving object segmentation using depth images
US9242171B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Real-time camera tracking using depth maps
US8724887B2 (en) 2011-02-03 2014-05-13 Microsoft Corporation Environmental modifications to mitigate environmental factors
US9619561B2 (en) 2011-02-14 2017-04-11 Microsoft Technology Licensing, Llc Change invariant scene recognition by an agent
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8497838B2 (en) 2011-02-16 2013-07-30 Microsoft Corporation Push actuation of interface controls
US9551914B2 (en) 2011-03-07 2017-01-24 Microsoft Technology Licensing, Llc Illuminator with refractive optical element
US9067136B2 (en) 2011-03-10 2015-06-30 Microsoft Technology Licensing, Llc Push personalization of interface controls
US8571263B2 (en) 2011-03-17 2013-10-29 Microsoft Corporation Predicting joint positions
US9470778B2 (en) 2011-03-29 2016-10-18 Microsoft Technology Licensing, Llc Learning from high quality depth measurements
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US10642934B2 (en) 2011-03-31 2020-05-05 Microsoft Technology Licensing, Llc Augmented conversational understanding architecture
US10296587B2 (en) 2011-03-31 2019-05-21 Microsoft Technology Licensing, Llc Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof
US10585957B2 (en) 2011-03-31 2020-03-10 Microsoft Technology Licensing, Llc Task driven user intents
US8824749B2 (en) 2011-04-05 2014-09-02 Microsoft Corporation Biometric recognition
US9539500B2 (en) 2011-04-05 2017-01-10 Microsoft Technology Licensing, Llc Biometric recognition
US8503494B2 (en) 2011-04-05 2013-08-06 Microsoft Corporation Thermal management system
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US9259643B2 (en) 2011-04-28 2016-02-16 Microsoft Technology Licensing, Llc Control of separate computer game elements
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US10671841B2 (en) 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
US8888331B2 (en) 2011-05-09 2014-11-18 Microsoft Corporation Low inductance light source module
US9137463B2 (en) 2011-05-12 2015-09-15 Microsoft Technology Licensing, Llc Adaptive high dynamic range camera
US8788973B2 (en) 2011-05-23 2014-07-22 Microsoft Corporation Three-dimensional gesture controlled avatar configuration interface
US9372544B2 (en) 2011-05-31 2016-06-21 Microsoft Technology Licensing, Llc Gesture recognition techniques
US10331222B2 (en) 2011-05-31 2019-06-25 Microsoft Technology Licensing, Llc Gesture recognition techniques
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US9594430B2 (en) 2011-06-01 2017-03-14 Microsoft Technology Licensing, Llc Three-dimensional foreground selection for vision system
US8526734B2 (en) 2011-06-01 2013-09-03 Microsoft Corporation Three-dimensional background removal for vision system
US10796494B2 (en) 2011-06-06 2020-10-06 Microsoft Technology Licensing, Llc Adding attributes to virtual representations of real-world objects
US9098110B2 (en) 2011-06-06 2015-08-04 Microsoft Technology Licensing, Llc Head rotation tracking from depth-based center of mass
US9724600B2 (en) 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US8897491B2 (en) 2011-06-06 2014-11-25 Microsoft Corporation System for finger recognition and tracking
US9953426B2 (en) 2011-06-06 2018-04-24 Microsoft Technology Licensing, Llc Object digitization
US8597142B2 (en) 2011-06-06 2013-12-03 Microsoft Corporation Dynamic camera based practice mode
US9013489B2 (en) 2011-06-06 2015-04-21 Microsoft Technology Licensing, Llc Generation of avatar reflecting player appearance
US9208571B2 (en) 2011-06-06 2015-12-08 Microsoft Technology Licensing, Llc Object digitization
US8929612B2 (en) 2011-06-06 2015-01-06 Microsoft Corporation System for recognizing an open or closed hand
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US8786730B2 (en) 2011-08-18 2014-07-22 Microsoft Corporation Image exposure using exclusion regions
US9557836B2 (en) 2011-11-01 2017-01-31 Microsoft Technology Licensing, Llc Depth image compression
US9117281B2 (en) 2011-11-02 2015-08-25 Microsoft Corporation Surface segmentation from RGB and depth images
US9056254B2 (en) 2011-11-07 2015-06-16 Microsoft Technology Licensing, Llc Time-of-flight camera with guided light
US8854426B2 (en) 2011-11-07 2014-10-07 Microsoft Corporation Time-of-flight camera with guided light
US8724906B2 (en) 2011-11-18 2014-05-13 Microsoft Corporation Computing pose and/or shape of modifiable entities
US8929668B2 (en) 2011-11-29 2015-01-06 Microsoft Corporation Foreground subject detection
US8509545B2 (en) 2011-11-29 2013-08-13 Microsoft Corporation Foreground subject detection
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US8803800B2 (en) 2011-12-02 2014-08-12 Microsoft Corporation User interface control based on head orientation
US9154837B2 (en) 2011-12-02 2015-10-06 Microsoft Technology Licensing, Llc User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US10798438B2 (en) 2011-12-09 2020-10-06 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9628844B2 (en) 2011-12-09 2017-04-18 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8879831B2 (en) 2011-12-15 2014-11-04 Microsoft Corporation Using high-level attributes to guide image processing
US8630457B2 (en) 2011-12-15 2014-01-14 Microsoft Corporation Problem states for pose tracking pipeline
US8971612B2 (en) 2011-12-15 2015-03-03 Microsoft Corporation Learning image processing tasks from scene reconstructions
US9596643B2 (en) 2011-12-16 2017-03-14 Microsoft Technology Licensing, Llc Providing a user interface experience based on inferred vehicle state
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9342139B2 (en) 2011-12-19 2016-05-17 Microsoft Technology Licensing, Llc Pairing a computing device to a user
US10220259B2 (en) 2012-01-05 2019-03-05 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US9720089B2 (en) 2012-01-23 2017-08-01 Microsoft Technology Licensing, Llc 3D zoom imager
US9367869B2 (en) * 2012-02-13 2016-06-14 Dean Stark System and method for virtual display
US20130290139A1 (en) * 2012-02-13 2013-10-31 Dean Stark System and method for virtual display
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
US10127735B2 (en) 2012-05-01 2018-11-13 Augmented Reality Holdings 2, Llc System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object
US10878636B2 (en) 2012-05-01 2020-12-29 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US10388070B2 (en) 2012-05-01 2019-08-20 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US11417066B2 (en) 2012-05-01 2022-08-16 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US9210401B2 (en) 2012-05-03 2015-12-08 Microsoft Technology Licensing, Llc Projected visual cues for guiding physical movement
US9788032B2 (en) 2012-05-04 2017-10-10 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US8959541B2 (en) 2012-05-04 2015-02-17 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
US9001118B2 (en) 2012-06-21 2015-04-07 Microsoft Technology Licensing, Llc Avatar construction using depth camera
US9836590B2 (en) 2012-06-22 2017-12-05 Microsoft Technology Licensing, Llc Enhanced accuracy of user presence status determination
US10089454B2 (en) 2012-06-22 2018-10-02 Microsoft Technology Licensing, Llc Enhanced accuracy of user presence status determination
US9696427B2 (en) 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US10878009B2 (en) 2012-08-23 2020-12-29 Microsoft Technology Licensing, Llc Translating natural language utterances to keyword search queries
US8882310B2 (en) 2012-12-10 2014-11-11 Microsoft Corporation Laser die light source module with low inductance
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US11215711B2 (en) 2012-12-28 2022-01-04 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9251590B2 (en) 2013-01-24 2016-02-02 Microsoft Technology Licensing, Llc Camera pose estimation for 3D reconstruction
US9052746B2 (en) 2013-02-15 2015-06-09 Microsoft Technology Licensing, Llc User center-of-mass and mass distribution extraction using depth images
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US11710309B2 (en) 2013-02-22 2023-07-25 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US9959459B2 (en) 2013-03-08 2018-05-01 Microsoft Technology Licensing, Llc Extraction of user behavior from depth images
US9311560B2 (en) 2013-03-08 2016-04-12 Microsoft Technology Licensing, Llc Extraction of user behavior from depth images
US9135516B2 (en) 2013-03-08 2015-09-15 Microsoft Technology Licensing, Llc User body angle, curvature and average extremity positions extraction using depth images
US9824260B2 (en) 2013-03-13 2017-11-21 Microsoft Technology Licensing, Llc Depth image processing
US9092657B2 (en) 2013-03-13 2015-07-28 Microsoft Technology Licensing, Llc Depth image processing
US9274606B2 (en) 2013-03-14 2016-03-01 Microsoft Technology Licensing, Llc NUI video conference controls
US9787943B2 (en) 2013-03-14 2017-10-10 Microsoft Technology Licensing, Llc Natural user interface having video conference controls
US10279212B2 (en) 2013-03-14 2019-05-07 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US9953213B2 (en) 2013-03-27 2018-04-24 Microsoft Technology Licensing, Llc Self discovery of autonomous NUI devices
US9442186B2 (en) 2013-05-13 2016-09-13 Microsoft Technology Licensing, Llc Interference reduction for TOF systems
US9462253B2 (en) 2013-09-23 2016-10-04 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US10024968B2 (en) 2013-09-23 2018-07-17 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9674563B2 (en) 2013-11-04 2017-06-06 Rovi Guides, Inc. Systems and methods for recommending content
US10205931B2 (en) 2013-11-12 2019-02-12 Microsoft Technology Licensing, Llc Power efficient laser diode driver circuit and method
US9769459B2 (en) 2013-11-12 2017-09-19 Microsoft Technology Licensing, Llc Power efficient laser diode driver circuit and method
US10325628B2 (en) 2013-11-21 2019-06-18 Microsoft Technology Licensing, Llc Audio-visual project generator
US9508385B2 (en) 2013-11-21 2016-11-29 Microsoft Technology Licensing, Llc Audio-visual project generator
US10188890B2 (en) 2013-12-26 2019-01-29 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US9971491B2 (en) 2014-01-09 2018-05-15 Microsoft Technology Licensing, Llc Gesture library for natural user input
US10433612B2 (en) 2014-03-10 2019-10-08 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US20150302664A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Avatar rendering for augmented or virtual reality
US10426989B2 (en) 2014-06-09 2019-10-01 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
US10226396B2 (en) 2014-06-20 2019-03-12 Icon Health & Fitness, Inc. Post workout massage device
US20170333792A1 (en) * 2014-08-12 2017-11-23 Utherverse Digital, Inc. Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US20160049003A1 (en) * 2014-08-12 2016-02-18 Utherverse Digital Inc. Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US11638871B2 (en) 2014-08-12 2023-05-02 Utherverse Gaming Llc Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US9724605B2 (en) * 2014-08-12 2017-08-08 Utherverse Digital Inc. Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US11452938B2 (en) 2014-08-12 2022-09-27 Utherverse Gaming Llc Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US10585531B2 (en) * 2014-09-02 2020-03-10 Sony Corporation Information processing apparatus, control method, and program
US20170168651A1 (en) * 2014-09-02 2017-06-15 Sony Corporation Information processing apparatus, control method, and program
US10258828B2 (en) 2015-01-16 2019-04-16 Icon Health & Fitness, Inc. Controls for an exercise device
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
US10537764B2 (en) 2015-08-07 2020-01-21 Icon Health & Fitness, Inc. Emergency stop with magnetic brake for an exercise device
US10953305B2 (en) 2015-08-26 2021-03-23 Icon Health & Fitness, Inc. Strength exercise mechanisms
US10004993B2 (en) 2015-09-15 2018-06-26 Square Enix Holdings Co., Ltd. Game system including third party control
US9873055B2 (en) 2015-09-15 2018-01-23 Square Enix Holdings Co., Ltd. Game system including third party control
US10412280B2 (en) 2016-02-10 2019-09-10 Microsoft Technology Licensing, Llc Camera with light valve over sensor array
US10257932B2 (en) 2016-02-16 2019-04-09 Microsoft Technology Licensing, Llc. Laser diode chip on printed circuit board
US10462452B2 (en) 2016-03-16 2019-10-29 Microsoft Technology Licensing, Llc Synchronizing active illumination cameras
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10293211B2 (en) 2016-03-18 2019-05-21 Icon Health & Fitness, Inc. Coordinated weight selection
US10561894B2 (en) 2016-03-18 2020-02-18 Icon Health & Fitness, Inc. Treadmill with removable supports
US9573068B1 (en) 2016-04-04 2017-02-21 Kipling Martin Virtual reality enhancement device
US20190224569A1 (en) * 2016-04-29 2019-07-25 Activision Publishing, Inc. System and Method for Identifying Spawn Locations in a Video Game
US20170312632A1 (en) * 2016-04-29 2017-11-02 Activision Publishing, Inc. System and method for identifying spawn locations in a video game
US10226701B2 (en) * 2016-04-29 2019-03-12 Activision Publishing, Inc. System and method for identifying spawn locations in a video game
US10807003B2 (en) * 2016-04-29 2020-10-20 Activision Publishing, Inc. Systems and methods for determining distances required to achieve a line of site between nodes
US10252109B2 (en) 2016-05-13 2019-04-09 Icon Health & Fitness, Inc. Weight platform treadmill
US10471299B2 (en) 2016-07-01 2019-11-12 Icon Health & Fitness, Inc. Systems and methods for cooling internal exercise equipment components
US10441844B2 (en) 2016-07-01 2019-10-15 Icon Health & Fitness, Inc. Cooling systems and methods for exercise equipment
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
US10500473B2 (en) 2016-10-10 2019-12-10 Icon Health & Fitness, Inc. Console positioning
US10376736B2 (en) 2016-10-12 2019-08-13 Icon Health & Fitness, Inc. Cooling an exercise device during a dive motor runway condition
US10561877B2 (en) 2016-11-01 2020-02-18 Icon Health & Fitness, Inc. Drop-in pivot configuration for stationary bike
US10661114B2 (en) 2016-11-01 2020-05-26 Icon Health & Fitness, Inc. Body weight lift mechanism on treadmill
US10625114B2 (en) 2016-11-01 2020-04-21 Icon Health & Fitness, Inc. Elliptical and stationary bicycle apparatus including row functionality
US10343017B2 (en) 2016-11-01 2019-07-09 Icon Health & Fitness, Inc. Distance sensor for console positioning
US10543395B2 (en) 2016-12-05 2020-01-28 Icon Health & Fitness, Inc. Offsetting treadmill deck weight during operation
US10702736B2 (en) 2017-01-14 2020-07-07 Icon Health & Fitness, Inc. Exercise cycle
US20230077369A1 (en) * 2017-03-13 2023-03-16 Holodia Inc. Method for generating multimedia data associated with a system for practicing sports
US20200171386A1 (en) * 2017-07-07 2020-06-04 Buxton Global Enterprises, Inc. Reality vs virtual reality racing
US10953330B2 (en) * 2017-07-07 2021-03-23 Buxton Global Enterprises, Inc. Reality vs virtual reality racing
US10357715B2 (en) 2017-07-07 2019-07-23 Buxton Global Enterprises, Inc. Racing simulation
US11451108B2 (en) 2017-08-16 2022-09-20 Ifit Inc. Systems and methods for axial impact resistance in electric motors
US20190099667A1 (en) * 2017-09-29 2019-04-04 Gree, Inc. Game processing program, game processing method, and game processing device
EP3461542B1 (en) * 2017-09-29 2023-03-29 Gree, Inc. Game processing program, game processing method, and game processing device
US11318376B2 (en) 2017-09-29 2022-05-03 Gree, Inc. Game processing program, game processing method, and game processing device
EP4201496A1 (en) * 2017-09-29 2023-06-28 Gree, Inc. Game processing program, game processing method, and game processing device
US10695666B2 (en) * 2017-09-29 2020-06-30 Gree, Inc. Game processing program, game processing method, and game processing device
US11839811B2 (en) 2017-09-29 2023-12-12 Gree, Inc. Game processing program, game processing method, and game processing device
US10729965B2 (en) 2017-12-22 2020-08-04 Icon Health & Fitness, Inc. Audible belt guide in a treadmill
US10685153B2 (en) * 2018-06-15 2020-06-16 Syscend, Inc. Bicycle sizer
US20200042160A1 (en) * 2018-06-18 2020-02-06 Alessandro Gabbi System and Method for Providing Virtual-Reality Based Interactive Archives for Therapeutic Interventions, Interactions and Support
US11876685B1 (en) 2021-05-19 2024-01-16 Amazon Technologies, Inc. Locally predicting state using a componentized entity simulation
US11909601B1 (en) 2021-06-17 2024-02-20 Amazon Technologies, Inc. Implementing a scalable 3D simulation using a distributed 3D keyspace
CN117292094A (en) * 2023-11-23 2023-12-26 南昌菱形信息技术有限公司 Digitalized application method and system for performance theatre in karst cave

Similar Documents

Publication Publication Date Title
US20080026838A1 (en) Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games
Hämäläinen et al. Martial arts in artificial reality
US20140274564A1 (en) Devices, systems and methods for interaction in a virtual environment
Hernandez et al. Designing action-based exergames for children with cerebral palsy
US9804672B2 (en) Human-computer user interaction
Stach et al. Exploring haptic feedback in exergames
Hämäläinen et al. Utilizing gravity in movement-based games and play
Kajastila et al. Motion games in real sports environments
KR20170105335A (en) Exercise device and system based on virtual reality
Burkett Sport mechanics for coaches
EP2252378A2 (en) Interactive exercising system
Fogtmann Designing bodily engaging games: learning from sports
Mozgovoy et al. Team sports for Game AI benchmarking revisited
Reel Working out: The psychology of sport and exercise
US20220226742A1 (en) A playground interactive gaming system
Höysniemi et al. Children's and parents' perception of full-body interaction and violence in a martial arts game
Van Delden et al. Hang in there: A novel body-centric interactive playground
Ketcheson Designing for exertion: using heart rate power-ups to improve energy expenditure in exergames
Macák When Game Is the Exercise and Exercise Is the Game: Design Analysis of Ring Fit Adventure
Mikalsen Creation and Evaluation of Exer Dungeon - A multi-player exergame using exercise bikes
Mozgovoy et al. Research Article Team Sports for Game AI Benchmarking Revisited
Gao et al. How do linear and nonlinear levels inspire game flow in cooperative gameplay?: comparative analysis of collaborative mechanics design in It Takes Two
Karinch Lessons from the edge: Extreme athletes show you how to take on high risk and succeed
Naubert Cybersport 2.0: Ethical dimensions of videogames as sport
Meisler Design, creation, and evaluation of CyberSteamPunkHoverWar 2088 - A multiplayer racing exercise bicycle game

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION