WO2016070192A1 - Interactive gaming using wearable optical devices - Google Patents

Interactive gaming using wearable optical devices

Info

Publication number
WO2016070192A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
virtual
display
gameplay
emitter
Application number
PCT/US2015/058672
Other languages
French (fr)
Inventor
Mark J. Ladd
Tuomas Ketola
Original Assignee
LyteShot Inc.
Application filed by LyteShot Inc. filed Critical LyteShot Inc.
Publication of WO2016070192A1 publication Critical patent/WO2016070192A1/en

Classifications

    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/31 Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F 13/355 Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform a changing game scene into an MPEG stream for transmitting to a mobile phone or a thin client
    • A63F 13/212 Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/214 Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/26 Output arrangements having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F 13/30 Interconnection arrangements between game servers and game devices; interconnection arrangements between game devices; interconnection arrangements between game servers
    • A63F 13/53 Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • G02B 27/017 Head-up displays, head mounted
    • G02B 2027/014 Head-up displays comprising information/image processing systems
    • G02B 2027/0178 Head-mounted head-up displays, eyeglass type
    • G06F 3/011 Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements

Definitions

  • the technical field relates to systems and methods for interactive gaming on digital devices. More particularly, the technical field relates to systems and methods for sensor-based interactive gaming using wearable optical devices.
  • Electronic games have long entertained many people. Many electronic games are hosted on personal computers or dedicated game consoles that have a processing or control unit, a display device, and a joystick, a keyboard, a mouse, a trackpad, or another input device.
  • the electronic games themselves typically relate to one or more genres, such as adventure genres, first-person shooting genres, automotive or aviation genres, role-playing or fantasy genres, sports genres, and collaborative social genres.
  • the electronic games typically utilize gameplay, in-game objectives and virtual in-game objects (such as virtual characters, virtual items, virtual points, and video game levels) to facilitate competition or collaboration between one or more game players and a computer, and/or between two or more game players.
  • the game players may further interact with these virtual objects using the sensor-based gaming hardware, the wireless computing devices, the wearable optical devices, or some combination thereof.
  • these interactive inputs may be used to change the state(s) of the electronic game, state(s) of the virtual objects, etc.
  • a communications device may be configured to support gameplay.
  • a display may be configured to display a view of a physical environment proximate to the display, and to display at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment.
  • An emitter may be configured to provide an emitter interaction signal related to a first in-game interaction with the virtual in-game object.
  • a gameplay server may be configured to determine whether a projection of the in-game interaction overlaps a projection of the virtual in-game object in a virtual space, and to change a state of the virtual in-game object if the projection of the in-game interaction overlaps the projection of the virtual in-game object in the virtual space.
  • a method may comprise: supporting gameplay on a communications device, a display, and an emitter; displaying a view of a physical environment proximate to the display; displaying at a first location a virtual in-game object associated with gameplay, the first location being over the view of the physical environment; providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object; projecting the first location onto a first area in a virtual space; identifying a second area in the virtual space, the second area associated with the first in-game interaction; determining whether the second area overlaps with the first area in the virtual space; and changing a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
  • a system may comprise: a communications device configured to support gameplay; means for displaying a view of a physical environment proximate to the display, and for displaying at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment; means for providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object; and a gameplay server coupled to the communications device, the gameplay server configured to: project the first location onto a first area in a virtual space; identify a second area in the virtual space, the second area associated with the first in-game interaction based on the emitter interaction signal; determine whether the second area overlaps with the first area in the virtual space; and change a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
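  • to make the overlap determination concrete, the sketch below (illustrative only; the patent does not specify an implementation) projects both the displayed object location and the in-game interaction onto axis-aligned rectangles in a shared two-dimensional virtual space and changes the object's state only when the rectangles overlap. The Rect type, the 10-point damage value, and the state dictionary are assumptions for the example.

```python
# Illustrative overlap test in a shared virtual space (assumed 2D).
from dataclasses import dataclass

@dataclass
class Rect:
    x: float      # left edge in virtual-space coordinates
    y: float      # bottom edge
    w: float      # width
    h: float      # height

    def overlaps(self, other: "Rect") -> bool:
        # Axis-aligned rectangles overlap unless one lies entirely to
        # the side of, or above/below, the other.
        return not (self.x + self.w < other.x or other.x + other.w < self.x or
                    self.y + self.h < other.y or other.y + other.h < self.y)

def resolve_interaction(object_area: Rect, interaction_area: Rect, state: dict) -> dict:
    """Change the virtual object's state only if the projections overlap."""
    if object_area.overlaps(interaction_area):
        state = dict(state, health=max(0, state["health"] - 10), hit=True)
    return state

# Example: a virtual object projected at (2, 3) and an emitter shot whose
# projected area clips the object's corner.
state = resolve_interaction(Rect(2, 3, 1, 1), Rect(2.5, 3.5, 0.5, 0.5),
                            {"health": 100, "hit": False})
print(state)  # {'health': 90, 'hit': True}
```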
  • the emitter interaction signal is based on one or more of: a squeeze of a trigger of a toy gun incorporating the emitter, a specified motion of a toy sword incorporating the emitter, and a specified motion of a toy wand incorporating the emitter.
  • a communications device interaction signal may be provided from the communications device, the communications device interaction signal being related to a second in-game interaction with the virtual in-game object.
  • the communications device interaction signal may be based on one or more gestures into a user interface of the communications device.
  • the display is incorporated into a wearable optical device, and the display is configured to provide a display interaction signal, the display interaction signal being related to a second in-game interaction with the virtual in-game object.
  • the display interaction signal may be based on one or more of: eye motion of a user of the wearable optical device, touch interaction with the wearable optical device, and voice input into the wearable optical device.
  • the display interaction signal may be based on a tilt or a translation of the wearable optical device.
  • the gameplay server is configured to select the virtual in-game object for the display based on a location of the display.
  • the gameplay server may be configured to select the virtual in-game object for the display based on a gameplay state of the gameplay.
  • the display may be incorporated into a head mounted device (HMD) or a heads-up display (HUD).
  • the display may be incorporated into the communications device.
  • FIG. 1A depicts an example of an augmented reality gaming system, with one or more wearable optical devices, according to some embodiments.
  • FIG. 1B depicts an example of an augmented reality gaming environment, according to some embodiments.
  • FIG. 1C depicts an example of an interior view of a wearable optical device, according to some embodiments.
  • FIG. 1D depicts an example of an interior view of a wearable optical device, according to some embodiments.
  • FIG. 1E depicts an example of an interior view of a wearable optical device, according to some embodiments.
  • FIG. 2 depicts an example of an emitter, according to some embodiments.
  • FIG. 3 depicts an example of a receiver, according to some embodiments.
  • FIG. 4 depicts an example of a communications device, according to some embodiments.
  • FIG. 5 depicts an example of a wearable optical device, according to some embodiments.
  • FIG. 6 depicts an example of a gameplay system, according to some embodiments.
  • FIG. 7 depicts an example of a gameplay display management module, according to some embodiments.
  • FIG. 8 depicts a flowchart of an example of a method for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
  • FIG. 9 depicts a flowchart of an example of a method for rendering a virtual object in an augmented reality electronic game, according to some embodiments.
  • FIG. 10 depicts a flowchart of an example of a method for modifying a state of a virtual object in an augmented reality electronic game, according to some embodiments.
  • FIG. 11 depicts a flowchart of an example of a method for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
  • FIG. 12 depicts an example of a digital device, according to some embodiments.
  • FIG. 13A depicts an example of an augmented reality gaming system, according to some embodiments.
  • FIG. 13B depicts an example of an augmented reality gaming system, according to some embodiments.
  • FIG. 14 depicts an example of an augmented reality gaming environment, according to some embodiments.
  • FIG. 15 depicts an example of an emitter and a communications device in an augmented reality gaming environment, according to some embodiments.
  • FIG. 16 depicts an example of a display and a communications device in an augmented reality gaming environment, according to some embodiments.
  • FIG. 17 depicts an example screen of an augmented reality electronic game, according to some embodiments.
  • FIG. 18 depicts an example screen of an augmented reality electronic game, according to some embodiments.
  • FIG. 19 depicts an example screen of an augmented reality electronic game, according to some embodiments.
  • FIG. 20 depicts an example screen of an augmented reality electronic game, according to some embodiments.
  • FIG. 1A depicts an example of an augmented reality gaming system 100, with one or more wearable optical devices, according to some embodiments.
  • the augmented reality gaming system 100A includes a plurality of player environments 105 (illustrated in FIG. 1A as a first player environment 105-1 through an Nth player environment 105-N), a network 110, and a gameplay system 115.
  • the first player environment 105-1 may comprise one or more devices associated with a first person or set of persons.
  • the first player environment 105-1 may include a first emitter 120-1, a first receiver 125-1, a first communications device 130-1, and a first wearable optical device 135-1.
  • the first emitter 120-1, the first receiver 125-1, and/or the first wearable optical device 135-1 may be coupled to the first communications device 130-1.
  • the coupling may use any known or convenient format (a Bluetooth® connection (e.g., a Bluetooth Low Energy® connection), an 802.11 connection, a cellular connection, a bus, wire, or wires, etc.).
  • the first emitter 120-1, the first receiver 125-1, and the first communications device 130-1 may be used by a first player to engage in augmented reality electronic gameplay.
  • the first emitter 120-1 may comprise a digital device having a transmitter that emits an emitter signal to a receiver.
  • a digital device as used herein, may comprise any device having a processor and a memory.
  • a digital device may comprise some or all of the components of the digital device 1200, shown in FIG. 12.
  • the emitter signal may comprise one or more of a variety of electromagnetic signals.
  • the emitter signal may include an infrared signal, a Near Field Communications (NFC) signal, etc.
  • the emitter signal may comprise a beam that is directed at the receiver.
  • the beam may be encoded with a unique identifier corresponding to the first emitter 120-1.
  • the first emitter 120-1 may provide information related to the emitter signal to the first communications device 130-1.
  • the first emitter 120-1 may be controlled by the first communications device 130-1.
  • the first emitter 120-1 may be incorporated into a modular peripheral device, that is, a device that is provided using a hardware development kit.
  • An example of a hardware development kit includes a set of plans that players can print on a three-dimensional (3D) printer using a template in the kit.
  • the first emitter 120-1 may take the form of a weapon used in augmented reality electronic gameplay.
  • the first emitter 120-1 may be a gun, a bow, a sword, a wand, a grenade, or other weapon.
  • the first emitter 120-1 may have an interaction recognition mechanism that recognizes interactions with the first emitter 120-1 and/or instructs the transmitter of the first emitter 120-1 to emit the emitter signal.
  • the interaction recognition mechanism may have a variety of forms.
  • the interaction recognition mechanism may comprise: a shoot mechanism corresponding to a trigger on a gun, a motion recognition mechanism that recognizes body movements that correspond to motions taken by a user of a bow, a sword, a wand, grenade, etc.
  • the interaction recognition mechanism may appear as a finger-based trigger. When the finger-based trigger is activated, the first emitter 120-1 may emit the emitter signal.
  • the interaction recognition mechanism may appear as a grenade clip that instructs emission of the emitter signal after expiration of a specified time.
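  • as an illustration of these two interaction styles, the sketch below models a finger-based trigger that emits immediately and a grenade clip that schedules emission after a fuse time; the class and function names are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the two interaction-recognition styles above.
import threading

class Emitter:
    def __init__(self, emitter_id: str):
        self.emitter_id = emitter_id

    def emit(self) -> None:
        print(f"emitter {self.emitter_id}: emitting signal")

def on_trigger_pull(emitter: Emitter) -> None:
    # Trigger-style mechanism: emit as soon as the trigger is activated.
    emitter.emit()

def on_grenade_clip_release(emitter: Emitter, fuse_seconds: float) -> None:
    # Grenade-style mechanism: emit after the specified time expires.
    threading.Timer(fuse_seconds, emitter.emit).start()

laser = Emitter("120-1")
on_trigger_pull(laser)               # immediate emission
on_grenade_clip_release(laser, 3.0)  # emission after a 3-second fuse
```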
  • the first emitter 120-1 need not have an interaction recognition mechanism, and may emit the emitter signal upon occurrence of any number of specified events. It is further noted that, in various embodiments, the first emitter 120-1 need not take the form of a weapon, and may instead take some other form. For instance, in some embodiments, the first emitter 120-1 may take the form of a search device used in scavenger-hunting gameplay. In various embodiments, the first emitter 120-1 may be wearable. For example, the first emitter 120-1 may be integrated into a piece of clothing to be worn by a player.
  • the first emitter 120-1 may include hardware, software, and/or firmware to trigger data export to the first communications device 130-1 at various times, including: when the first emitter 120-1 is initially coupled to the first communications device 130-1, when a player has taken an action on the first emitter 120-1, and when the first emitter 120-1 is decoupled from the first communications device 130-1.
  • the first emitter 120-1 may have some or all of the components of the emitter 120, shown in FIG. 2.
  • the first receiver 125-1 may comprise a digital device configured to receive an emitter signal.
  • the first receiver 125-1 may receive the emitter signal from an emitter associated with another player (e.g., the Nth emitter 120-N). If the emitter signal is encoded with the identity of an emitter, the first receiver 125-1 may decode the emitter signal.
  • the first receiver 125-1 may provide to the first communications device 130-1 a receiver signal corresponding to the received emitter signal.
  • the first receiver 125-1 may be controlled by the first communications device 130-1.
  • the first receiver 125-1 may be incorporated into a modular peripheral device.
  • the first receiver 125-1 may have a form compatible with augmented reality electronic gameplay. More specifically, the first receiver 125-1 may be configured to register in-game actions, such as shots, hits, outcomes of spells, etc. In gameplay where the first emitter 120-1 is configured as a gun, for instance, the first receiver 125-1 may be configured to receive a beam from the emitter 120-1. In gameplay where the first emitter 120-1 is configured as a sword, the first receiver 125-1 may be configured as a tunic or other wearable item configured to receive a touch by the first emitter 120-1, in one example.
  • the first receiver 125-1 may be configured to receive emitter signals from an approximate point source corresponding to the location of the first emitter 120-1.
  • the first receiver 125-1 may include an identifier, such as a Quick Response (QR) Code that facilitates access to items in gameplay.
  • the first receiver 125-1 may provide such an identifier.
  • the first receiver 125-1 may comprise a disk, puck, biscuit, etc. that is placed within the geographical region and receives information related to position from the first emitter 120-1.
  • the first receiver 125-1 may include Bluetooth Low Energy (BLE) or Wi-Fi hardware that allows distance to the first emitter 120-1 to be determined with a specified degree of accuracy.
  • the first receiver 125-1 may trigger data export to the first communications device 130-1 at various times, including: when the first receiver 125-1 is initially coupled to the first communications device 130-1, when the first receiver 125-1 has indicated some action (e.g., a valid hit) has been taken on the first receiver 125-1, and when the first receiver 125-1 is decoupled from the first communications device 130-1.
  • the first receiver 125-1 may have some or all of the components of the receiver 125, shown in FIG. 3.
  • the first communications device 130-1 may comprise a digital device configured to control the first emitter 120-1, the first receiver 125-1, and/or the first wearable optical device 135-1.
  • the first communications device 130-1 may be one or more of: a mobile phone, a tablet computing device, a desktop computer, a laptop computer, or other digital device.
  • the first communications device 130-1 may have some or all of the components of the communications device 130, shown in FIG. 4.
  • the first communications device 130-1 supports augmented reality electronic gameplay using the first emitter 120-1, the first receiver 125-1, the gameplay system 115, and/or the first wearable optical device 135-1.
  • the first communications device 130-1 may receive emitter signals from the first emitter 120-1.
  • the first communications device 130-1 may further receive the receiver signal from the first receiver 125-1.
  • the first communications device 130-1 may provide the first player with an application that presents augmented reality electronic gameplay.
  • the application may include data, services, and other information obtained from the gameplay system 115.
  • the application may have been downloaded from an application store or installed using other methodologies.
  • the application may support in-game purchases and/or in-game advertising.
  • the application may give any venue (retail stores, restaurants, stadiums, movie theaters, etc.) the ability to run promotions, drive advertisement revenue, and encourage the social sharing of their brand to the player's game app on their phone.
  • although FIG. 1A shows the first communications device 130-1 associated with a first player, the first communications device 130-1 need not be associated with a human being. Rather, in various embodiments, the first communications device 130-1 may be associated with and/or controlled by a digital device.
  • the first communications device 130-1 may be controlled by an inanimate entity that, in turn, receives instructions from the gameplay system 115.
  • the first emitter 120-1 and/or the first receiver 125-1 may be associated with the inanimate entity.
  • the first receiver 125-1 may correspond to an inanimate object that is to be discovered as an object of gameplay.
  • the first communications device 130-1 may not have access or may have only limited access to the network 110 while gameplay is underway.
  • the first communications device 130-1 may not have access to a cellular or Wi-Fi network during augmented reality electronic gameplay.
  • the first communications device 130-1 may cache or otherwise store data associated with the augmented reality electronic gameplay and provide the data to the gameplay system 115 when there is connectivity or sufficient connectivity to the network 110.
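  • a minimal sketch of such offline caching, assuming a simple first-in, first-out queue and a placeholder post_to_gameplay_system transport (neither is specified by the patent), might look as follows.

```python
# Minimal sketch: queue gameplay events while offline, flush on reconnect.
import json
import queue

class GameplayCache:
    def __init__(self):
        self._pending = queue.Queue()

    def record(self, event: dict) -> None:
        # Always enqueue locally first so nothing is lost while offline.
        self._pending.put(json.dumps(event))

    def flush(self, post_to_gameplay_system) -> int:
        """Send all cached events; returns how many were delivered."""
        sent = 0
        while not self._pending.empty():
            post_to_gameplay_system(self._pending.get())
            sent += 1
        return sent

cache = GameplayCache()
cache.record({"type": "hit", "emitter": "120-1", "receiver": "125-N"})
# Later, when connectivity to the network 110 is detected:
cache.flush(post_to_gameplay_system=print)
```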
  • the first wearable optical device 135-1 may comprise a digital device configured to display virtual objects to the first user.
  • a "virtual object,” as used herein, may refer to any object that is displayed on a display of a digital device and that is not part of the physical world. Virtual objects may include portions of a graphical user interface (GUI), such as menus, radio buttons, text fields, visible web and/or application components, or the like.
  • GUI graphical user interface
  • Virtual objects may, but need not, comprise virtual in-game objects, such as elements of an electronic game that change state in response to a user's inputs/interactions.
  • virtual in-game objects further include virtual characters, virtual items, virtual points, game levels, or the like that are part of gameplay of an electronic game.
  • the first wearable optical device 135-1 renders virtual objects onto a display.
  • the display may be transparent, translucent, opaque, etc.
  • the first wearable optical device 135-1 may superimpose virtual objects over a first perspective of the physical world.
  • the first wearable optical device 135-1 may include or be coupled to external cameras (e.g., depth-sensing cameras) or other hardware configured to provide the first user with the first perspective.
  • the display may superimpose virtual objects over representations (images, video, streaming video, etc.) of the physical world.
  • the first wearable optical device 135-1 provides one or more of augmented reality and virtual reality to the first user.
  • Example embodiments of the first wearable optical device 135-1 include an Optical Head Mounted Display (e.g., a heads-up display (HUD)), or an optical device mounted on or coupled to some portion of the first user's body or clothing.
  • the first wearable optical device 135-1 has some or all of the components of the wearable optical device 135, shown in FIG. 5.
  • although FIG. 1A and portions of the description herein may describe the first wearable optical device 135-1 as separate from the first emitter 120-1, the first receiver 125-1, and the first communications device 130-1, it is noted that in various implementations, the wearable optical device 135-1 may be part of or connected to the first emitter 120-1, the first receiver 125-1, or the first communications device 130-1.
  • the first wearable optical device 135-1 may include at least a portion of the display of the first communications device 130-1.
  • the first communications device 130-1 may be incorporated into (e.g., embedded in circuitry within) the first wearable optical device 135-1. It is noted that in various embodiments, the first wearable optical device 135-1 may also reside within one or more of the first emitter 120-1 and the first receiver 125-1.
  • the Nth player environment 105-N represents a set of devices associated with an Nth person or set of persons. It is noted the letter "N" represents an arbitrary number, and may correspond to any integer greater than 1.
  • the Nth player environment 105-N comprises an Nth emitter 120-N, an Nth receiver 125-N, an Nth communications device 130-N, and an Nth wearable optical device 135-N.
  • the Nth emitter 120-N may be similar to the first emitter 120-1, discussed herein.
  • the Nth receiver 125-N may be similar to the first receiver 125-1, discussed herein.
  • the Nth communications device 130-N may be similar to the first communications device 130-1.
  • the Nth wearable optical device 135-N may be similar to the first wearable optical device 135-1, discussed herein.
  • the devices in the Nth player environment 105-N engage in augmented reality electronic gameplay with the devices in the first player environment 105-1.
  • the network 110 may comprise a computer network.
  • the network 110 may include technologies such as Ethernet, 802.11x, worldwide interoperability for microwave access (WiMAX), 2G, 3G, 4G, CDMA, GSM, LTE, digital subscriber line (DSL), and/or the like.
  • the network 110 may further include networking protocols such as multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), User Datagram Protocol (UDP), hypertext transport protocol (HTTP), simple mail transfer protocol (SMTP), file transfer protocol (FTP), and/or the like.
  • the data exchanged over the network 110 can be represented using technologies and/or formats including hypertext markup language (HTML) and extensible markup language (XML).
  • the network 110 may be coupled to the first communications device 130-1, to the Nth communications device 130-N, and to the gameplay system 115. In various embodiments, though not shown in FIG. 1, the network 110 may be coupled to one or more of the first emitter 120-1, the first receiver 125-1, the Nth emitter 120-N, and the Nth receiver 125-N.
  • communications over the network 110 may be encrypted using conventional technologies such as secure sockets layer (SSL), transport layer security (TLS), and/or Internet Protocol security (IPsec).
  • the gameplay system 115 may comprise one or more digital devices configured to support processes, applications, etc. on the communication devices 130.
  • the gameplay system 115 may include dedicated, shared, or distributed servers.
  • the gameplay system 115 supports augmented reality electronic gameplay by the communications devices 130.
  • the gameplay system 115 may facilitate creation of new games, and/or may manage player accounts.
  • the gameplay system 115 may also allow for the management of aspects of existing electronic games. For instance, in some embodiments, the gameplay system 115 may track successful or unsuccessful actions by emitters associated with players.
  • the gameplay system 115 may provide to communications devices whether an action by a particular emitter successfully registered at a particular receiver.
  • the gameplay system 115 may further provide instructions to the wearable optical devices 135 to display virtual objects.
  • the gameplay system 115 may have some or all of the components of the gameplay system 115, shown in FIG. 6.
  • FIG. 1A depicts a first emitter 120-1 through an Nth emitter 120-N, a first receiver 125-1 through an Nth receiver 125-N, a first communications device 130-1 through an Nth communications device 130-N, and a first wearable optical device 135-1 through an Nth wearable optical device 135-N, in order to illustrate various implications of multiple players of sensor-based mobile gameplay.
  • portions of the discussion herein refer to an "emitter 120" or "emitters 120," a "receiver 125" or "receivers 125," a "communications device 130" or "communication devices 130," and a "wearable optical device 135" or "wearable optical devices 135."
  • FIG. IB depicts an example of an augmented reality gaming environment 100B, according to some embodiments.
  • the augmented reality gaming environment 100B may include the first player environment 105-1 (having therein the first emitter 120-1, the first receiver 125-1, the first communications device 130-1, and the first wearable optical device 135-1) and the Nth player environment 105-N (having therein the Nth emitter 120-N, the Nth receiver 125-N, the Nth communications device 130-N, and the Nth wearable optical device 135-N).
  • the augmented reality gaming environment 100B may further include a virtual in-game object 140 that is displayed on the first wearable optical device 135-1 and the Nth wearable optical device 135-N but is not present in the physical world.
  • the virtual in-game object 140 may be seen by the first wearable optical device 135-1 at a first perspective 145-1, and may be seen by the Nth wearable optical device 135-N at an Nth perspective 145-N.
  • the augmented reality gaming environment 100B may be defined by geo-fences 150. Each of the geo-fences 150 may limit the areas in which the augmented reality electronic game can be played.
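  • one plausible way to enforce such a geo-fence (the patent does not prescribe one) is a ray-casting point-in-polygon test over the fence vertices, as sketched below.

```python
# Illustrative geo-fence check: ray-casting point-in-polygon test.
def inside_geofence(point, polygon) -> bool:
    """polygon is a list of (x, y) vertices; point is (x, y)."""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count edge crossings of a horizontal ray cast from the point.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

fence = [(0, 0), (100, 0), (100, 100), (0, 100)]
print(inside_geofence((50, 50), fence))   # True: inside the play area
print(inside_geofence((150, 50), fence))  # False: gameplay not allowed here
```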
  • the virtual in-game object 140 is depicted as a ball, it will be appreciated that the virtual in-game object 140 may be any creature (e.g., alien, human, animal, dragon, or the like), animated object, or inanimate object. There may be any number of virtual in-game objects 140 in the augmented reality gaming environment 100B.
  • FIG. 1C depicts an example of an interior view of a wearable optical device 135, according to some embodiments.
  • the interior view in FIG. 1C includes a virtual inventory 155 of virtual items and a menu 160 for selecting actions.
  • Each of the virtual inventory 155 and the menu 160 may be formed from virtual objects for an augmented reality electronic game.
  • FIG. 1D depicts an example of an interior view of a wearable optical device 135, according to some embodiments.
  • the interior view in FIG. 1D includes a virtual health monitor 165, a virtual map 170, and a notification object 175.
  • the virtual health monitor 165 may depict the health of a game player in an augmented reality electronic game;
  • the virtual map 170 may depict a map of a virtual world in the augmented reality electronic game;
  • the notification object 175 may provide notifications related to the augmented reality electronic game.
  • Each of the virtual health monitor 165, the virtual map 170, and the notification object 175 may be formed from virtual objects for an augmented reality electronic game.
  • FIG. 1E depicts an example of an interior view of a wearable optical device 135, according to some embodiments.
  • the interior view in FIG. 1E includes a representation of the virtual in-game object 140 and the virtual inventory 155.
  • the representation of the virtual in-game object 140 and the virtual inventory 155 may be formed from virtual objects for an augmented reality electronic game.
  • the augmented reality gaming system 100A allows one or more game players to play augmented reality electronic games that are supported by the data available over the network 110 (e.g., over the Internet).
  • the augmented reality electronic games may comprise forms of alternate reality gaming in which aspects of the physical world are incorporated into mobile gameplay, and/or in which the physical world is augmented with virtual in-game objects 140 from the electronic game.
  • the gaming experience provided by the augmented reality gaming system 100A may provide new dimensions to outdoor games by leveraging smartphone technologies and the Internet, and bridging conventional gaming divides between the real world and digital worlds by combining physical participation, geolocational data, social networking data, and elements of games (such as action and/or role-playing games).
  • the gameplay system 115 may also provide messaging and/or social media capabilities for players to communicate with each other.
  • the augmented reality electronic game may be developed using a Game Development Kit (GDK).
  • Augmented reality electronic games supported by the augmented reality gaming system 100A and/or the augmented reality gaming environment 100B may include actions game players take against each other as well as actions game players take against virtual in-game objects 140 rendered in wearable optical device(s) 135.
  • the augmented reality electronic games may allow game players to use emitter(s) 120 to register hits against receiver(s) 125 (e.g., combat or adventure genres that allow players to simulate battles with one another).
  • players may use emitters to attempt in-game actions, and receivers to register successful in-game actions.
  • the first emitter 120-1 may emit an emitter signal toward the Nth receiver 125-N each time the first player attempts to attack the Nth player.
  • the in-game actions may correspond to a gun being shot, a sword being swung, or a grenade being launched.
  • Emitter signals from the first emitter 120-1 may be encoded with the identity of the first emitter 120-1.
  • the first emitter 120-1 may provide the first communications device 130-1 with information about in-game action attempts.
  • the augmented reality gaming system 100A may allow players to verify the actions of other players. Players need not wonder whether, for instance, the first emitter 120-1 accurately took an action with respect to the Nth receiver 125-N. More specifically, the augmented reality gaming system 100A may allow users to use technologies such as geolocational technologies, infrared technologies, and data available over the network 110 to provide real-time feedback of gameplay between players.
  • the Nth receiver 125-N may register successful in-game actions each time the emitter signal successfully contacts the Nth receiver 125-N. For each successful in-game action, the Nth receiver 125-N may decode received emitter signals as needed. The Nth receiver 125-N may further provide information about successful in-game actions to the Nth communications device 130-N, which in turn may provide this information to the gameplay system 115. In these embodiments, the gameplay system 115 may provide information about the in-game actions, whether successful or not, to the first communications device 130-1 and the Nth communications device 130-N. The first communications device 130-1 and the Nth communications device 130-N may update user interface elements thereon accordingly.
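  • the sketch below traces this registration flow end to end; all class and method names are invented for illustration, since the patent describes the flow but not an API. The receiver reports a decoded hit to its communications device, which forwards it to the gameplay system; the gameplay system then notifies every registered device so each can update its user interface.

```python
# Hedged sketch of the hit-registration flow (invented names throughout).
class GameplaySystem:
    def __init__(self):
        self.devices = []   # all registered communications devices

    def report_action(self, emitter_id: str, receiver_id: str, success: bool):
        # Fan the result out to every player's communications device.
        for device in self.devices:
            device.update_ui(emitter_id, receiver_id, success)

class CommunicationsDevice:
    def __init__(self, name: str, system: GameplaySystem):
        self.name = name
        self.system = system
        system.devices.append(self)

    def on_receiver_signal(self, emitter_id: str, receiver_id: str):
        # Forward the decoded hit from the paired receiver to the system.
        self.system.report_action(emitter_id, receiver_id, success=True)

    def update_ui(self, emitter_id: str, receiver_id: str, success: bool):
        print(f"{self.name}: {emitter_id} -> {receiver_id} "
              f"({'hit' if success else 'miss'})")

system = GameplaySystem()
first = CommunicationsDevice("device 130-1", system)
nth = CommunicationsDevice("device 130-N", system)
nth.on_receiver_signal("emitter 120-1", "receiver 125-N")
```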
  • the augmented reality electronic games supported by the augmented reality gaming system 100A and/or the augmented reality gaming environment 100B may render the virtual in-game objects 140 in game players' wearable optical device(s) 135 and may allow game players to take actions against the virtual in-game objects 140. More specifically, the gameplay system 115 may determine the location of a game player using one or more location determination techniques.
  • examples of location determination techniques include obtaining the game player's location through Global Positioning System (GPS) technologies.
  • Another example of location determination techniques includes placing physical sensors in one or more of the receiver(s) 125, and identifying locations of emitter(s) 120 within a geo-fenced region around those physical sensors (e.g., the region within the geo-fences 150). In various embodiments, the physical sensors may determine attributes such as altitude, distance, angular orientation, etc. of the emitter(s) 120 within the geo-fenced region.
  • Yet another example of location determination techniques includes placing beacons (e.g., BLE beacons) within a geo-fenced region and using proximity of emitter(s) 120 to beacons to determine locations of game players. It is noted that some combination of these techniques may be employed in various implementations.
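  • as an illustration of beacon-based location, the sketch below estimates distance from BLE received signal strength (RSSI) with a log-distance path-loss model and assigns the player the position of the nearest beacon; the measured-power and path-loss constants are typical assumed values, not values from the patent.

```python
# Illustrative beacon-proximity location (assumed constants).
def rssi_to_distance(rssi_dbm: float, measured_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Estimated distance in meters from a single RSSI sample."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def locate(beacon_readings: dict, beacon_positions: dict):
    """beacon_readings: {beacon_id: rssi}; returns the nearest beacon's position."""
    nearest = min(beacon_readings,
                  key=lambda b: rssi_to_distance(beacon_readings[b]))
    return beacon_positions[nearest]

positions = {"b1": (0.0, 0.0), "b2": (10.0, 0.0)}
# -62 dBm is a stronger (closer) reading than -75 dBm.
print(locate({"b1": -62.0, "b2": -75.0}, positions))  # (0.0, 0.0)
```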
  • the gameplay system 115 may select virtual in-game objects 140 to render in wearable optical device(s) 135.
  • the selection of virtual in-game objects 140 may depend on a variety of factors, such as a state of gameplay and the location of a game player in the physical world.
  • the gameplay system 115 may select virtual in-game objects 140 based on a level a game player is encountering in an augmented reality electronic game, the status of a game player within the level, the health of the game player, the number of points or virtual items the game player has earned, the status, levels, etc. of another player in the augmented reality electronic game, etc.
  • the gameplay system 115 may select virtual items such as graphical elements that represent a game player's health, points, and virtual goods if these virtual items are associated with a gameplay status of the game player at a given time and/or physical location.
  • the gameplay system 115 may select a virtual in-game object 140 corresponding to a three-dimensional representation of a dragon if game players in an augmented reality electronic game are to fight a dragon as part of gameplay.
  • the gameplay system 115 renders the selected virtual in-game objects 140 in wearable optical device(s) 135.
  • the rendering of virtual in-game objects 140 may depend on a variety of factors, such as a state of gameplay and a perspective of a game player viewing the virtual in-game object 140 through a wearable optical device associated with the game player.
  • the gameplay system 115 may render perspectives of virtual in-game objects 140 based on a level a game player is encountering in an augmented reality electronic game, the status of a game player within the level, the health of the game player, the number of points or virtual items the game player has earned, the status, levels, etc. of another player in the augmented reality electronic game, etc.
  • the gameplay system 115 may render portions of a three-dimensional representation of a dragon that game players are expected to see based on an estimated perspective(s) of the game players.
  • the gameplay system 115 may render multiple perspectives of the dragon; each of the multiple perspectives may depend on angles, distances, etc. between game players and the coordinates of the dragon in augmented reality.
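  • a minimal sketch of this per-player perspective selection, assuming player poses and the dragon's coordinates share one world frame: the distance and bearing from each player to the object determine which side of the model that player's wearable optical device should render.

```python
# Illustrative perspective computation in a shared 2D world frame.
import math

def perspective(player_pos, player_heading_deg, object_pos):
    """Return (distance, bearing relative to the player's view direction)."""
    dx = object_pos[0] - player_pos[0]
    dy = object_pos[1] - player_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx)) - player_heading_deg
    # Normalize to (-180, 180]: negative means the object is left of the view axis.
    bearing = (bearing + 180.0) % 360.0 - 180.0
    return distance, bearing

# Two players viewing the same dragon at (10, 10) from opposite sides see
# the same distance but face different sides of the model.
print(perspective((0, 0), 45.0, (10, 10)))    # (~14.1, 0.0): dead ahead
print(perspective((20, 0), 135.0, (10, 10)))  # (~14.1, 0.0): from the far side
```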
  • the gameplay system 115 accesses Computer Aided Design (CAD) files (e.g., Unity® files) related to in-game objects for rendering into wearable optical device(s) 135.
  • CAD Computer Aided Design
  • the gameplay system 115 may allow game players to interact with the virtual in-game object 140 by taking one or more actions against the virtual object. More specifically, in some embodiments, the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using the emitter interaction mechanism on the emitter 120.
  • Examples of such actions may correspond to shooting of a gun, swinging of a sword, making a motion corresponding to casting a spell using a wand, throwing a grenade, and picking up an item during a scavenger hunt.
  • the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using gestures or other user input on the communications device 130. Examples of such actions include switching weapons or reloading a weapon using radio buttons on the graphical user interface of the communications device 130.
  • the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using the wearable optical device 135.
  • Examples of such actions include voice commands, touch gestures on hardware on the wearable optical device 135, eye movements that are tracked by the wearable optical device 135, and motions detected by the wearable optical device 135 (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.).
  • the gameplay system 115 may register the game player's actions against the virtual in-game object 140 by recording the actions against the virtual object. The gameplay system 115 may further modify the state of the virtual in-game object 140 based on the actions. To continue the foregoing examples, in augmented reality electronic games involving virtual in-game objects 140 corresponding to representations of dragons, successful "hits" by the emitter 120 may be registered as injuries to the dragon. In response to such hits, the gameplay system 115 may render the dragon in a diminished capacity. As yet another example, if a game player could not defend against an attack by the dragon, the gameplay system 115 may reduce a virtual representation of the game player's in-game health.
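  • the state changes just described might be modeled as follows; the health values, damage amount, and capacity thresholds are illustrative assumptions, not figures from the patent.

```python
# Hypothetical state model for a virtual in-game object (dragon example).
class VirtualObject:
    def __init__(self, name: str, health: int = 100):
        self.name = name
        self.health = health
        self.action_log = []   # every registered action is recorded

    def register_hit(self, emitter_id: str, damage: int = 10) -> None:
        # Record the action, then modify the object's state.
        self.action_log.append((emitter_id, damage))
        self.health = max(0, self.health - damage)

    @property
    def capacity(self) -> str:
        # Rendered capacity diminishes as health falls (assumed thresholds).
        if self.health == 0:
            return "defeated"
        return "diminished" if self.health < 50 else "full"

dragon = VirtualObject("dragon")
for _ in range(6):
    dragon.register_hit("emitter 120-1")
print(dragon.health, dragon.capacity)  # 40 diminished
```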
  • Example Emitter 120
  • FIG. 2 depicts an example of an emitter 120, according to some embodiments.
  • the emitter 120 may include a communications interface module 205, an emitter interaction mechanism 210, a speaker 215, a short-range infrared transmitter 220, a long-range infrared transmitter 225, a beam encoder module 230, and a controller 235.
  • the emitter 120 may include sensors and/or components not identified explicitly in FIG. 2.
  • the communications interface module 205 may facilitate communications between the emitter 120 and the communications device 130. In an embodiment, the communications interface module 205 facilitates pairing between the emitter 120 and the communications device 130. In various embodiments, the communications interface module 205 may be configured as a Bluetooth® pairing module that allows the emitter 120 to be wirelessly coupled to the communications device 130. The communications interface module 205 may also include any wireless or wired network hardware and/or software in various embodiments. The communications interface module 205 may receive instructions from the controller 235.
  • the emitter interaction mechanism 210 may allow a player to initiate an action.
  • the emitter interaction mechanism 210 may correspond to a trigger of a gun.
  • the emitter interaction mechanism 210 may also correspond to a portion (e.g., a blade portion) of a sword or a grenade, depending on a type of weapon the emitter 120 is intended to model.
  • the emitter interaction mechanism 210 may also correspond to a portion of a metal detector for a scavenger-hunt game.
  • the emitter interaction mechanism 210 may provide a signal to the controller 235 when an action has been initiated.
  • the speaker 215 may provide an audible sound.
  • the speaker 215 may provide sounds related to sensor-based mobile gameplay when the emitter interaction mechanism 210 has been activated. The sound may correspond to the nature of the action initiated. For instance, the speaker 215 may provide sounds similar to the shooting of a gun, the clash of a sword on armor, or the explosion of a grenade.
  • the speaker 215 may provide in-game information such as in-game sounds, story narration, clues, and/or other information to enhance gameplay experiences.
  • the speaker 215 may receive instructions from the controller 235.
  • the short-range infrared transmitter 220 and the long-range infrared transmitter 225 may each emit an infrared signal corresponding to an emitter signal.
  • the short-range infrared transmitter 220 and the long-range infrared transmitter 225 may have different ranges, or may have partially overlapping ranges.
  • the short-range infrared transmitter 220 and the long-range infrared transmitter 225 may provide infrared signals in response to the emitter interaction mechanism 210.
  • the short-range infrared transmitter 220 and the long-range infrared transmitter 225 may receive instructions from the controller 235.
  • the beam encoder module 230 may encode emitter signals with an identifier corresponding to the identity of the emitter 120.
  • the beam encoder module 230 may receive a unique identifier of the emitter 120 from the controller 235.
  • the beam encoder module 230 may further encode emitter signals with the unique identifier. Encoding may involve frequency selection, frequency modulation of the emitter signal, or encoding particular sequences of data into the emitter signal from the emitter 120.
  • the beam encoder module 230 may provide the code to the short-range infrared transmitter 220 and/or the long-range infrared transmitter 225.
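  • the patent leaves the encoding scheme open (frequency selection, frequency modulation, or particular data sequences); as one illustrative data-sequence scheme, the sketch below frames a 16-bit emitter identifier with a start marker and an even-parity bit so that a receiver's beam decoder can validate the beam and recover the identifier.

```python
# Illustrative identifier framing for the beam (assumed scheme, not the patent's).
def encode_emitter_id(emitter_id: int) -> list:
    assert 0 <= emitter_id < 2 ** 16
    bits = [(emitter_id >> i) & 1 for i in range(15, -1, -1)]  # MSB first
    parity = sum(bits) % 2                  # even parity over the ID bits
    return [1, 1, 0] + bits + [parity]      # start marker, payload, parity

def decode_emitter_id(frame: list):
    """Returns the emitter ID, or None if the frame fails validation."""
    if len(frame) != 20 or frame[:3] != [1, 1, 0]:
        return None
    bits, parity = frame[3:19], frame[19]
    if sum(bits) % 2 != parity:
        return None                         # corrupted beam: reject the hit
    return sum(b << (15 - i) for i, b in enumerate(bits))

frame = encode_emitter_id(0xBEEF)
print(hex(decode_emitter_id(frame)))        # 0xbeef
```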
  • the controller 235 may control other components of the emitter 120.
  • the controller 235 may provide instructions to one or more of the communications interface module 205, the emitter interaction mechanism 210, the speaker 215, the short-range infrared transmitter 220, the long-range infrared transmitter 225, and the beam encoder module 230.
  • the controller 235 may include a processor and memory.
  • the controller 235 may include a mobile device processor and static or dynamic memory.
  • FIG. 3 depicts an example of a receiver 125, according to some embodiments.
  • the receiver 125 may include a communications interface module 305, an infrared receiver 310, a beam decoder 315, a vibrator 320, a speaker 325, Light Emitting Diodes (LEDs) 330, and a controller 335.
  • the receiver 125 may include sensors and/or components not identified explicitly in FIG. 3.
  • the communications interface module 305 may facilitate communications between the receiver 125 and the communications device 130. In an embodiment, the communications interface module 305 facilitates pairing between the receiver 125 and the communications device 130. In various embodiments, the communications interface module 305 may be configured as a Bluetooth® pairing module that allows the receiver 125 to be wirelessly coupled to the communications device 130. The communications interface module 305 may also include any wireless or wired network hardware and/or software in various embodiments. The communications interface module 305 may receive instructions from the controller 335.
  • the infrared receiver 310 may receive infrared signals.
  • the infrared receiver 310 may be implemented as an electromagnetic receiver that filters out frequencies other than infrared signals. It is noted the infrared receiver 310 may be replaced or augmented by non-infrared technologies, such as other wireless technologies and/or NFC technologies, without departing from the scope and substance of the inventive concepts herein.
  • the infrared receiver 310 may provide received infrared signals to the beam decoder 315 and/or other modules of the receiver 125.
  • the beam decoder 315 may decode received emitter signals. More specifically, the beam decoder 315 may identify an emitter identifier encoded in emitter signals received by the infrared receiver 310. In various embodiments, the beam decoder 315 may receive instructions from the controller 335.
  • the vibrator 320 may cause the receiver 125 to physically move.
  • the speaker 325 may make an audible noise.
  • the LEDs 330 may cause all or a part of the receiver 125 to appear to light up.
  • the vibrator 320, the speaker 325, and the LEDs 330 may receive instructions from the controller 335 to be activated when the infrared receiver 310 has received an emitter signal that indicates a gameplay action by an emitter.
  • the controller 335 may control other components of the receiver 125.
  • the controller 335 may provide instructions to one or more of the communications interface module 305, the infrared receiver 310, the beam decoder 315, the vibrator 320, the speaker 325, and the Light Emitting Diodes (LEDs) 330.
  • the controller 335 may include a processor (e.g., a mobile device processor) and memory (e.g., static or dynamic memory).
  • FIG. 4 depicts an example of a communications device 130, according to some embodiments.
  • the communications device 130 may include a pairing management module 405, a user interface module 410, an emitter interface module 415, a receiver interface module 420, a gameplay cloud interface module 425, a gameplay memory datastore 430, a wearable optical device interface module 435, a communication device interaction recognition module 440, and a local environment determination module 445.
  • One or more of the pairing management module 405, the user interface module 410, the emitter interface module 415, the receiver interface module 420, the gameplay cloud interface module 425, the gameplay memory datastore 430, the wearable optical device interface module 435, the communication device interaction recognition module 440, and the local environment determination module 445 may include hardware and/or software, in various embodiments.
  • One or more of the pairing management module 405, the user interface module 410, the emitter interface module 415, the receiver interface module 420, the gameplay cloud interface module 425, the gameplay memory datastore 430, the wearable optical device interface module 435, the communication device interaction recognition module 440, and the local environment determination module 445 may be coupled to one another or to components external to the communications device 130.
  • the pairing management module 405 may configure the communications device 130 to be paired with other devices.
  • the pairing management module 405 may include a Bluetooth® pairing module that facilitates wireless pairing with other devices.
  • the pairing management module 405 may also perform other types of pairing to couple the communications device 130 to other devices without departing from the scope and the substance of the inventive concepts herein.
  • the pairing management module 405 may facilitate pairing with one or more of the emitter 120, the receiver 125, and the wearable optical device 135.
  • the user interface module 410 may facilitate user interaction with the communications device 130. The user interface module 410 may configure a display of the communications device 130 to provide one or more user interface elements with which a player can interact.
  • the user interface module 410 may further provide scenes, views, perspectives, and other attributes of gameplay to a user.
  • the user interface module 410 may also facilitate user input to the communications device 130.
  • the user interface module 410 may include video processing hardware and/or software, in various embodiments.
  • the emitter interface module 415 may facilitate interfacing with the emitter 120. In various embodiments, the emitter interface module 415 may receive and/or provide data to the emitter 120.
  • the receiver interface module 420 may facilitate interfacing with the receiver 125. In various embodiments, the receiver interface module 420 may receive and/or provide data to the receiver 125.
  • the gameplay cloud interface module 425 may facilitate coupling the communications device 130 to the gameplay system 115.
  • the gameplay cloud interface module 425 may receive and/or provide data to the gameplay system 115.
  • the gameplay cloud interface module 425 may, in various embodiments, provide player information (e.g., player information related to the emitter 120) to the gameplay system 115.
  • the gameplay cloud interface module 425 may incorporate network interface hardware and/or software to facilitate interfacing with the network 110.
  • the wearable optical device interface module 435 may facilitate interfacing with the wearable optical device 135. In various embodiments, the wearable optical device interface module 435 may receive and/or provide data to the wearable optical device 135.
  • the communication device interaction recognition module 440 may receive user interactions. In some embodiments, the communication device interaction recognition module 440 receives and/or identifies gestures or other user input to the communications device 130. As examples, the communication device interaction recognition module 440 may recognize actions such as switching weapons or reloading a weapon via radio buttons on the graphical user interface of the communications device 130.
  • the local environment determination module 445 may provide data (such as a location of the communications device 130) that is used to recognize parameters of the physical world around the communications device 130.
  • the local environment determination module 445 includes a GPS receiver that identifies GPS coordinates of the communications device 130.
  • the local environment determination module 445 may include hardware and/or software that interfaces with physical sensors on receiver(s) 125 and allows determination of location based on proximity and/or other physical relationships to the receiver(s) 125.
  • the local environment determination module 445 includes BLE hardware and/or software that provides a location of the communications device 130 based on proximity to locational beacons.
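A common way to turn beacon signal strength into proximity is a log-distance path-loss model. The sketch below is not from the disclosure; the beacon names and calibration constants are invented. It illustrates how the local environment determination module 445 might rank locational beacons by estimated distance.

```python
import math

def estimate_distance_m(rssi_dbm: float, tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss estimate of beacon range.

    tx_power_dbm is the calibrated RSSI at 1 m; the exponent models the
    environment (2.0 for free space, higher indoors).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def nearest_beacon(readings: dict[str, float]) -> str:
    """Pick the locational beacon with the smallest estimated distance."""
    return min(readings, key=lambda b: estimate_distance_m(readings[b]))

# Example: RSSI readings (dBm) from three hypothetical locational beacons.
readings = {"beacon_lobby": -72.0, "beacon_arena": -55.0, "beacon_exit": -80.0}
print(nearest_beacon(readings))  # -> "beacon_arena"
```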
  • FIG. 5 depicts an example of a wearable optical device 135, according to some embodiments.
  • the wearable optical device 135 may include a communications interface module 505, a display rendering module 510, an eye movement recognition module 515, a touch input recognition module 520, a voice input recognition module 525, an emitter interaction recognition module 530, a motion detection module 535, and a controller 540.
  • One or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, the motion detection module 535, and the controller 540 may include hardware and/or software, in various embodiments.
  • One or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, the motion detection module 535, and the controller 540 may be coupled to one another or to components external to the wearable optical device 135.
  • the communications interface module 505 may facilitate communications between the wearable optical device 135 and the communications device 130.
  • the communications interface module 505 facilitates pairing between the wearable optical device 135 and the communications device 130.
  • the communications interface module 505 may be configured as a Bluetooth® pairing module that allows the wearable optical device 135 to be wirelessly coupled to the communications device 130.
  • the communications interface module 505 may also include any wireless or wired network hardware and/or software in various embodiments.
  • the communications interface module 505 may receive instructions from the controller 540.
  • the display rendering module 510 may render virtual objects onto a display of the wearable optical device 135.
  • the display rendering module 510 addresses pixels and/or other portions of a display of the wearable optical device 135 to show virtual objects.
  • the eye movement recognition module 515 may track eye movements of a user of the wearable optical device 135. In some implementations, the eye movement recognition module 515 recognizes commands, actions, etc. based on eye movements.
  • the touch input recognition module 520 may recognize touch input by a user of the wearable optical device 135. In various implementations, the touch input recognition module 520 recognizes commands, actions, etc. based on touches (e.g., touches to various external surfaces of the wearable optical device 135).
  • the voice input recognition module 525 may recognize voice input by a user of the wearable optical device 135. In various implementations, the voice input recognition module 525 recognizes commands, actions, etc. based on natural language commands provided by the user of the wearable optical device 135.
  • the emitter interaction recognition module 530 may recognize actions based on touches, motions, etc. of the emitter 120. In some implementations, the emitter interaction recognition module 530 recognizes commands, actions, etc. based on touches, motions, etc. of the emitter 120.
  • the motion detection module 535 may recognize motion (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.) of the wearable optical device 135. In various implementations, the motion detection module 535 recognizes commands, actions, etc. based on how the user of the wearable optical device 135 moves the wearable optical device 135.
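As an illustration only: pitch and roll can be recovered from a quasi-static accelerometer reading with the standard gravity-vector formulas, while yaw needs a gyroscope or magnetometer. The gesture mapping below is a hypothetical example of how the motion detection module 535 might turn a head tilt into a command; the threshold and command names are assumptions.

```python
import math

def pitch_roll_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Derive pitch and roll (degrees) from a static accelerometer reading.

    Assumes the device is roughly stationary so gravity dominates the
    measured acceleration vector.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def classify_gesture(pitch: float, threshold_deg: float = 30.0) -> str:
    """Map a head tilt to a hypothetical in-game command."""
    if pitch > threshold_deg:
        return "look_up_menu"
    if pitch < -threshold_deg:
        return "look_down_dismiss"
    return "none"

pitch, roll = pitch_roll_from_accel(0.0, 0.5, 0.87)  # gravity split across y/z
print(classify_gesture(pitch))  # -> "none"
```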
  • the controller 540 may control other components of the wearable optical device 135.
  • the controller 540 may provide instructions to one or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, and the motion detection module 535.
  • the controller 540 may include a processor (e.g., a mobile device processor) and memory (e.g., static or dynamic memory).
  • FIG. 6 shows an example of a gameplay system 115, according to some embodiments.
  • the gameplay system 115 may include a mobile device interface module 605, an account management module 610, a new game creation module 615, a game code distribution module 620, a gameplay display management module 625, an account datastore 630, a device datastore 635, and a game datastore 640.
  • One or more of the mobile device interface module 605, the account management module 610, the new game creation module 615, the game code distribution module 620, the gameplay display management module 625, the account datastore 630, the device datastore 635, and the game datastore 640 may include hardware and/or software.
  • One or more of the mobile device interface module 605, the account management module 610, the new game creation module 615, the game code distribution module 620, the gameplay display management module 625, the account datastore 630, the device datastore 635, and the game datastore 640 may be coupled to one another or to components external to the gameplay system 115.
  • the mobile device interface module 605 may facilitate coupling the gameplay system 115 to the communications device 130.
  • the mobile device interface module 605 may receive and/or provide data to the communications device 130.
  • the mobile device interface module 605 may incorporate network interface hardware and/or software to facilitate interfacing with the network 110.
  • the account management module 610 may manage accounts for players of sensor-based mobile gameplay.
  • the account management module 610 may manage information such as players' points, usernames, and levels.
  • the account management module 610 may also manage players' relationships with each other. For example, the account management module 610 may manage actions specific players have taken with respect to other players.
  • the account management module 610 may manage player accounts based on information about players stored in the account datastore 630.
  • the account management module 610 may also manage player accounts based on information about devices stored in the device datastore 635.
  • the new game creation module 615 may facilitate creation of new games.
  • the new game creation module 615 may receive instructions to create a new game from a player.
  • the instructions may include identifiers of all players who are invited to play the game.
  • the new game creation module 615 may obtain a game instance from the game datastore 640, and place the game instance into memory of the gameplay system 115.
  • the new game creation module 615 may further associate the instance of the game with the identifiers of the players invited to play the game.
  • the new game creation module 615 may create a game code for the instance of the new game.
  • the new game creation module 615 may provide the game code to the game code distribution module 620.
  • the game code distribution module 620 may distribute the game code to all players who have been invited to play the instance of the new game.
  • the game code distribution module 620 may receive from the new game creation module 615 a game code for a new game.
  • the game code distribution module 620 may further obtain, from the account management module 610 or otherwise, contact information of each of the players who were invited to play the game.
  • the game code distribution module 620 may send the game code for a new game to the contact address of each of the players who were invited to play the game.
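A plausible, non-authoritative sketch of the new game creation module 615 and the game code distribution module 620 working together follows; the in-memory datastore, the join-code format, and the send_message transport are all assumptions made for illustration.

```python
import secrets

GAMES = {}  # game_code -> {"instance": ..., "players": [...]}

def create_game(game_template: str, invited_players: list[str]) -> str:
    """Instantiate a game from the datastore and mint a join code for it."""
    game_code = secrets.token_hex(3).upper()  # e.g. "4F9A1C"
    GAMES[game_code] = {"instance": {"template": game_template, "state": "new"},
                        "players": invited_players}
    return game_code

def send_message(address: str, body: str) -> None:
    print(f"-> {address}: {body}")  # stand-in for email/SMS/push delivery

def distribute_game_code(game_code: str, contacts: dict[str, str]) -> None:
    """Send the join code to each invited player's contact address."""
    for player in GAMES[game_code]["players"]:
        send_message(contacts[player], f"Join code: {game_code}")

code = create_game("capture_the_flag", ["alice", "bob"])
distribute_game_code(code, {"alice": "alice@example.com",
                            "bob": "bob@example.com"})
```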
  • the gameplay display management module 625 may manage aspects of gameplay related to a new or existing augmented reality electronic game.
  • the gameplay display management module 625 may identify actions one player has taken with respect to another player. For example, the gameplay display management module 625 may identify whether a receiver of a second player has registered an in-game action from an emitter of a first player.
  • the gameplay display management module 625 may also identify movements or evasive actions on the part of the second player.
  • the gameplay display management module 625 may associate points with specific actions by players of the game.
  • the gameplay display management module 625 may also manage lives and levels, and coordinate group gameplay between players of the game.
  • the gameplay display management module 625 may manage a storyline underlying the gameplay.
  • the gameplay display management module 625 may manage a storyline associated with players entering into combat with one another.
  • the gameplay display management module 625 may support messaging between players.
  • the gameplay display management module 625 may further render scenes, views, perspectives, and other attributes of gameplay on the user interface module 410, shown in FIG. 4.
  • the gameplay display management module 625 manages display of virtual objects in the wearable optical device 135 as part of augmented reality electronic gaming. To this end, the gameplay display management module 625 may select virtual objects for a game player based on one or more factors (a state of gameplay, the location of a game player in the physical world, etc.). The gameplay display management module 625 may identify one or more perspectives a game player is likely to have with respect to a virtual object, and may render those perspectives of the virtual object on the wearable optical device 135 associated with that game player. The gameplay display management module 625 may further receive interactions from the game player with respect to the virtual object.
  • Examples of interactions may include actions using the emitter 120 (activity related to the emitter interaction mechanism 210, etc.), actions using the communications device 130 (gestures or other input to its graphical user interface, etc.), and actions using the wearable optical device 135 (eye movements, touch inputs, voice inputs, movement(s), etc.).
  • the gameplay display management module 625 registers actions against virtual objects by modifying the state of the virtual objects.
  • FIG. 7 shows the gameplay display management module 625 in greater detail.
  • the account datastore 630 may store information related to player accounts.
  • the account datastore 630 may store information such as players' points, usernames, and players' relationships with each other, actions specific players have taken with respect to other players, and other information.
  • the device datastore 635 may store information about devices that have participated in gameplay.
  • the game datastore 640 may store game instances. In various embodiments, game instances are implemented as data structures in the game datastore 640 that can be instantiated and placed into memory by the new game creation module 615.
  • FIG. 7 depicts an example of a gameplay display management module 625, according to some embodiments.
  • the gameplay display management module 625 may include a gameplay state management module 705, a user location determination module 710, a user perspective selection module 715, a virtual object management module 720, a virtual object perspective module 725, a virtual object rendering module 730, an interaction management module 735, a virtual space mapping module 740, a gameplay state datastore 745, a physical environment mapping datastore 750, a virtual object datastore 755, and a virtual space mapping datastore 760.
  • One or more of the gameplay state management module 705, the user location determination module 710, the user perspective selection module 715, the virtual object management module 720, the virtual object perspective module 725, the virtual object rendering module 730, the interaction management module 735, the virtual space mapping module 740, the gameplay state datastore 745, the physical environment mapping datastore 750, the virtual object datastore 755, and the virtual space mapping datastore 760 may include hardware and/or software.
  • One or more of the gameplay state management module 705, the user location determination module 710, the user perspective selection module 715, the virtual object management module 720, the virtual object perspective module 725, the virtual object rendering module 730, the interaction management module 735, the virtual space mapping module 740, the gameplay state datastore 745, the physical environment mapping datastore 750, the virtual object datastore 755, and the virtual space mapping datastore 760 may be coupled to one another or to components external to the gameplay display management module 625.
  • the gameplay state management module 705 may manage state(s) of augmented reality electronic gameplay.
  • the gameplay state management module 705 retrieves, modifies, updates, etc. state(s) of augmented reality electronic games in the gameplay state datastore 745.
  • the gameplay state management module 705 may receive instructions from the virtual object rendering module 730 to modify gameplay state(s) based on virtual objects, and/or the interaction management module 735 to modify gameplay state(s) based on interactions with the emitter 120, the communications device 130, and the wearable optical device 135.
  • the user location determination module 710 may identify locations of game players. In some embodiments, the user location determination module 710 gathers GPS coordinates of game players from GPS devices on emitter(s) 120, receiver(s) 125, and/or communication device(s) 130. In various embodiments, the user location determination module 710 may determine the locations of game players based on the orientations of emitter(s) 120 and/or communication device(s) 130 in relation to receiver(s) 125 in a geo-fenced region (e.g., by determining the proximity of an emitter 120 or a communication device 130 to a receiver 125 in a geo-fenced region). In some embodiments, the user location determination module 710 receives information from beacons (e.g., BLE beacons) on emitter(s) 120 and/or communication device(s) 130.
  • the user location determination module 710 may determine the locations of game players using some combination of the techniques described herein, or using techniques not described explicitly herein.
  • the user perspective selection module 715 may select one or more perspectives game players may have of the physical world. In various embodiments, the user perspective selection module 715 gathers information about the physical world from the physical environment mapping datastore 750. The user perspective selection module 715 may further identify a game player's distances, orientations, etc. with respect to obstacles, contours, etc. in the game player's physical environment. In various embodiments, the user perspective selection module 715 may provide information about game players' perspectives regarding a physical environment to other modules.
  • the virtual object management module 720 may select virtual objects to be displayed on the communications device(s) 130 and/or the wearable optical device(s) 135.
  • the virtual object management module 720 gathers relevant virtual objects from the virtual object datastore 755 based on gameplay state(s) and/or physical location(s) of game players.
  • the virtual object management module 720 may gather specific virtual objects for game players who have reached specific game levels, accrued specific amounts of game points, and/or confronted specific virtual characters or virtual items.
  • the virtual object management module 720 may gather a virtual object containing a representation of a dragon or other mythical creature in an augmented reality electronic fantasy game in which a game player has passed a certain game level.
  • the virtual object management module 720 may gather virtual objects related to specific physical locations or environments of game players. For instance, in an augmented reality electronic game in which game players are in the desert, the virtual object management module 720 may select clay targets to display on the wearable optical device(s) 135 of game players.
  • the virtual object perspective module 725 may select perspectives of virtual objects for rendering. In some embodiments, the selection of perspective may depend on the angles, distances, and orientations of game player(s) from a projection of a virtual object. As an example of operation, the virtual object perspective module 725 may determine that the virtual object management module 720 selected a virtual object that projects an image of a fifty foot dragon approximately twenty feet in the air above two game players. To continue this example, the wearable optical device 135 of the first game player may need to view the right side of the dragon, while the wearable optical device 135 of the second game player may need to view the front of the dragon. The virtual object perspective module 725 may identify, based on properties of the CAD file corresponding to the virtual object, a first perspective of the right side of the dragon for the first game player and a second perspective of the front of the dragon for the second game player, as sketched below.
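One possible realization, sketched here purely for illustration, is to classify the bearing from the virtual object to each player against the direction the object faces; the quadrant thresholds and the east-facing dragon are assumptions, not details from the disclosure.

```python
import math

def viewing_perspective(player_xy: tuple[float, float],
                        object_xy: tuple[float, float],
                        object_facing_deg: float) -> str:
    """Classify which side of a virtual object a player would see.

    The bearing from the object to the player is compared with the
    direction the object faces; quadrants map to front/left/back/right.
    """
    dx = player_xy[0] - object_xy[0]
    dy = player_xy[1] - object_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))
    relative = (bearing - object_facing_deg + 360) % 360
    if relative < 45 or relative >= 315:
        return "front"
    if relative < 135:
        return "left"
    if relative < 225:
        return "back"
    return "right"

# Two players around a dragon at the origin facing east (0 degrees):
# the player to the south sees its right side, the player to the east
# sees its front.
print(viewing_perspective((0, -20), (0, 0), 0))  # -> "right"
print(viewing_perspective((20, 0), (0, 0), 0))   # -> "front"
```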
  • the virtual object rendering module 730 may render virtual objects in the communication device(s) 130 and/or the wearable optical device(s) 135.
  • the virtual object rendering module 730 may receive a virtual object from the virtual object management module 720, and receive a perspective of that virtual object from the virtual object perspective module 725.
  • the virtual object rendering module 730 may instruct relevant displays on the communication device(s) 130 and/or the wearable optical device(s) 135 to display the virtual object from the selected perspective.
  • the interaction management module 735 may detect interactions by game players. In some embodiments, the interaction management module 735 may monitor the emitter interaction mechanism 210 on the emitter 120 for actions taken in response to a virtual object. Examples of such actions may correspond to shooting of a gun, swinging of a sword, making a motion corresponding to casting a spell using a wand, throwing a grenade, and picking up an item during a scavenger hunt. In various embodiments, the interaction management module 735 may monitor gestures or other input on the communications device 130. Examples of such actions include switching weapons or reloading a weapon using radio buttons on the graphical user interface of the communications device 130. Further, in some embodiments, the interaction management module 735 may monitor actions on the wearable optical device 135.
  • Examples of such actions include voice commands, touch gestures on hardware on the wearable optical device 135, eye movements that are tracked by the wearable optical device 135, and motions detected by the wearable optical device 135 (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.).
  • the interaction management module 735 may provide information related to detected interactions to other modules, such as the gameplay state management module 705.
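For illustration, the interaction management module 735 can be modeled as a dispatcher that routes events from each monitored source to a handler and forwards the recognized action to the gameplay state management module 705. All names in this sketch are hypothetical stand-ins, not the patented implementation.

```python
from typing import Callable

class InteractionManagementModule:
    """Routes device events to handlers, then notifies the state manager."""

    def __init__(self, on_interaction: Callable[[str, dict], None]):
        self.on_interaction = on_interaction  # e.g. gameplay state management
        self.handlers: dict[str, Callable[[dict], str]] = {}

    def register(self, source: str, handler: Callable[[dict], str]) -> None:
        self.handlers[source] = handler

    def dispatch(self, source: str, event: dict) -> None:
        action = self.handlers[source](event)
        self.on_interaction(action, event)

def emitter_handler(event: dict) -> str:
    return "shoot" if event.get("trigger") else "idle"

def notify_state_manager(action: str, event: dict) -> None:
    print(f"state update: {action} ({event})")

interactions = InteractionManagementModule(notify_state_manager)
interactions.register("emitter", emitter_handler)
interactions.dispatch("emitter", {"trigger": True, "aim_deg": 12.5})
```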
  • the virtual space mapping module 740 may map models of user interactions and virtual objects into a virtual space.
  • the virtual space may be indexed by a relevant coordinate system (e.g., a Cartesian coordinate system) that specifies the distance and direction at which models of user interactions and/or virtual objects are projected away from a game player.
  • the maps of virtual spaces may be gathered from the virtual space mapping datastore 760.
  • the virtual space mapping module 740 may identify one or more areas in a virtual space that corresponds to models of user interactions and/or virtual objects.
  • the virtual space mapping module 740 may further determine whether one area in a virtual space overlaps with another area in the virtual space.
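The disclosure leaves the overlap test unspecified. A simple choice, assumed here only for illustration, is to model each area as a bounding sphere in the virtual space and compare center distance against the sum of radii; the example coordinates echo the dragon described later for the virtual space mapping datastore 760.

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    """A spherical area in the virtual space, in player-relative feet."""
    x: float
    y: float
    z: float
    radius: float

def overlaps(a: Sphere, b: Sphere) -> bool:
    """Two spherical areas overlap when center distance < sum of radii."""
    d2 = (a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2
    return d2 < (a.radius + b.radius) ** 2

# A dragon projected ~50 ft east of the player at a height of ~50 ft,
# versus the area swept by a shot fired from the emitter.
dragon = Sphere(x=50.0, y=0.0, z=50.0, radius=25.0)
shot = Sphere(x=48.0, y=2.0, z=45.0, radius=1.0)
print(overlaps(dragon, shot))  # True -> register a hit on the dragon
```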
  • the gameplay state datastore 745 may store the various states of one or more augmented reality electronic games.
  • the gameplay state datastore 745 stores sequences of actions, levels, triggers, conditions, etc. that may form the basis of the states of augmented reality electronic games.
  • the states of augmented reality electronic games may be updated, modified, etc. as the game players progress through the augmented reality electronic games.
  • the states of augmented reality electronic games may change as the gameplay state management module 705 receives information about user actions with virtual objects, as discussed further herein.
  • the physical environment mapping datastore 750 may store files that have information related to one or more physical environments.
  • the files provide information about what the physical world around game players looks like.
  • the files may provide information about open areas, obstacles, and contours of physical items within a particular physical environment.
  • the physical environment mapping datastore 750 gathers geographical information about game players' environments from meshes, such as predetermined meshes that provide information about open areas, obstacles, and contours of physical items within a particular physical environment as well as meshes generated using cameras on wearable optical device(s) 135.
  • the virtual object datastore 755 may store files that represent virtual objects.
  • the virtual object datastore 755 stores libraries of CAD files (e.g., Unity® files) that represent virtual objects.
  • the CAD files may further specify how virtual objects appear from various perspectives, including various angles, distances, and orientations.
  • the virtual object datastore 755 obtains the CAD files from external sources, such as third-party illustrators and/or publishers.
  • representations of virtual objects in the virtual object datastore 755 may relate to a particular augmented reality electronic game or genre of augmented reality electronic games (e.g., the virtual object datastore 755 may store representations of fantasy creatures for an augmented reality electronic game having fantasy themes, representations of combat vehicles for an augmented reality electronic game having a combat theme, representations of inanimate objects for an augmented reality electronic game implementing a scavenger hunt, etc.).
  • the virtual space mapping datastore 760 may store maps of the virtual spaces used to project models of user interactions and virtual objects.
  • the virtual spaces in the virtual space mapping datastore 760 are indexed by a relevant coordinate system (e.g., a Cartesian coordinate system) that specifies the distance and direction at which models of user interactions and/or virtual objects are projected away from a game player.
  • the virtual space mapping datastore 760 may store a map of a virtual space that represents all items within the field of view of a game player.
  • the map may contain a virtual object of a dragon that is represented about fifty feet directly East of the game player at a height of fifty feet.
  • the map may further contain objects of user interactions with the dragon, such as objects that represent a specified number of shots (and the directions of such shots) the game player has taken at the object using an emitter 120.
  • FIG. 8 depicts a flowchart of an example of a method 800 for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
  • the method 800 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 800 may be optional, and that the method 800 need not include all of the operations shown in FIG. 8.
  • the user location determination module 710 may determine a location of a game player of an augmented reality electronic game.
  • the user location determination module 710 gathers user location information from a GPS receiver on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or using other techniques described herein.
  • the virtual object management module 720 may identify a virtual in-game object to be rendered in a display used to display at least a portion of the augmented reality electronic game.
  • the virtual object management module 720 selects virtual in-game objects for the augmented reality electronic game based on one or more factors, such as locations of game players and game states of the augmented reality electronic game.
  • the virtual object management module 720 may select virtual in-game objects for game players based on specific locations of the game players in the physical world.
  • the virtual object management module 720 may select virtual in-game objects for game players based on levels/points/etc. the game players have achieved in the augmented reality electronic game.
  • the virtual object perspective module 725 may identify a game player perspective of a game player in relation to the virtual in-game object. In some implementations, the virtual object perspective module 725 gathers angles, orientations, distances, etc. from the game player to a projection of the virtual object. The virtual object perspective module 725 may further evaluate, based on parameters of the virtual object, how the virtual object would appear to the display of the game player if the virtual object were projected into the physical environment around the game player.
  • the virtual object rendering module 730 may render the virtual in-game object in the display in accordance with the game player perspective. More particularly, the virtual object rendering module 730 may instruct the display to display the virtual in-game object in accordance with the user perspective identified by the virtual object perspective module 725.
  • the interaction management module 735 may receive user interaction with the virtual in-game object in the augmented reality electronic game.
  • Interactions may include input to the emitter 120, the communication device 130, and/or the wearable optical device 135.
  • the interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified.
  • the virtual space mapping module 740 may identify a first area in a virtual space corresponding to the virtual in-game object.
  • the virtual space mapping module 740 may identify a second area in the virtual space corresponding to the user interaction with the virtual in-game object.
  • the virtual space mapping module 740 may determine whether the second area overlaps the first area.
  • the gameplay state management module 705 may modify a state of the virtual in-game object based on the user interaction.
  • the gameplay state management module 705 may provide instructions to modify the virtual in-game object to the other modules of the gameplay display management module 625.
  • the modified state of the virtual in-game object may be stored in the virtual object datastore 755.
  • the virtual object rendering module 730 may render a modified virtual in-game object on the display based on the modified state.
  • the virtual object rendering module 730 may also instruct the display to display the modified virtual in-game object in accordance with the modifications.
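Read end to end, method 800 amounts to a select-render-interact-update loop. The stubbed sketch below is one hypothetical arrangement of the FIG. 7 modules, with every helper a toy stand-in rather than the patented implementation.

```python
def determine_location(player_id):       # stub for module 710
    return (0.0, 0.0)

def select_virtual_object(location):     # stub for module 720
    return {"name": "dragon", "pos": (50.0, 0.0), "health": 100}

def identify_perspective(location, obj): # stub for module 725
    return "front"

def render(obj, perspective):            # stub for module 730
    print(f"render {obj['name']} ({obj['health']} hp) from the {perspective}")

def receive_interaction(player_id):      # stub for module 735
    return {"kind": "shot", "aim": (50.0, 0.5)}

def shot_hits(obj, interaction):         # stub for module 740: overlap test
    dx = obj["pos"][0] - interaction["aim"][0]
    dy = obj["pos"][1] - interaction["aim"][1]
    return dx * dx + dy * dy < 25.0

def run_gameplay_step(player_id):
    """One pass of method 800; each stub stands in for a FIG. 7 module."""
    location = determine_location(player_id)
    obj = select_virtual_object(location)
    perspective = identify_perspective(location, obj)
    render(obj, perspective)
    interaction = receive_interaction(player_id)
    if interaction and shot_hits(obj, interaction):
        obj["health"] -= 10        # module 705: modify the object's state
        render(obj, perspective)   # re-render the modified object

run_gameplay_step("player-1")
```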
  • FIG. 9 depicts a flowchart of an example of a method 900 for rendering a virtual object in an augmented reality electronic game, according to some embodiments.
  • the method 900 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 900 may be optional, and that the method 900 need not include all of the operations shown in FIG. 9.
  • the user location determination module 710 may identify a physical location of a game player of an augmented reality electronic game.
  • the user location determination module 710 gathers user location information from a GPS receiver on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or using other techniques described herein.
  • the gameplay state management module 705 may identify a gameplay state of the augmented reality electronic game. More particularly, the gameplay state management module 705 may identify relevant gameplay levels, points, etc. associated with the gameplay state of the augmented reality electronic game.
  • the virtual object management module 720 may identify in the virtual object datastore 755 a virtual in-game object associated with the physical location or the gameplay state. More particularly, the virtual object management module 720 may select virtual in-game objects that gameplay rules indicate may be projected at the identified location and/or in response to the identified gameplay state of the augmented reality electronic game. At an operation 920, the virtual object management module 720 may gather the virtual in-game object from the virtual object datastore 755.
  • FIG. 10 depicts a flowchart of an example of a method 1000 for modifying a state of a virtual object in an augmented reality electronic game, according to some embodiments.
  • the method 1000 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 1000 may be optional, and that the method 1000 need not include all of the operations shown in FIG. 10.
  • the virtual object management module 720 may identify a virtual in-game object displayed in accordance with an augmented reality electronic game. More particularly, in some embodiments, the virtual object management module 720 may receive from the gameplay state management module 705 identifiers of virtual in-game objects that have been displayed in an augmented reality electronic game. For instance, the virtual object management module 720 may receive from the gameplay state management module 705 an identifier of a dragon or other virtual object displayed in the communications device 130 and/or the wearable optical device 135.
  • the interaction management module 735 may receive user interactions in the augmented reality electronic game. Interactions may include input to the emitter 120, the communication device 130, and/or the wearable optical device 135. The interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified.
  • the virtual object management module 720 may associate the user interactions with one or more parameters of the virtual in-game object. More particularly, the virtual object management module 720 may determine the extent these user interactions correspond to changes in the virtual in-game object. As an example, if a user uses an emitter 120 to "shoot" at a virtual in-game object that represents a dragon, the virtual object management module 720 may determine where the shots from the emitter 120 would project on the dragon.
  • the virtual object management module 720 may modify the one or more parameters of the virtual in-game object.
  • if a user uses an emitter 120 to "shoot" at a virtual in-game object that represents a dragon, the virtual object management module 720 may modify portions of an image that represents where the shot would have projected on the dragon.
  • the virtual object management module 720 may store the virtual in-game object with the modified parameters.
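As a toy illustration of modifying an object's parameters, the sketch below maps a normalized hit position on the object's image to a named region of the dragon and records it; the region boundaries and damage values are invented for this example.

```python
def region_of_hit(hit_uv: tuple[float, float]) -> str:
    """Map a normalized hit position on the object's image to a region."""
    u, v = hit_uv
    if v > 0.66:
        return "head"
    return "wing" if u < 0.5 else "body"

def apply_hit(dragon: dict, hit_uv: tuple[float, float], damage: int = 10) -> dict:
    """Record a hit by updating the parameters of the virtual in-game object."""
    region = region_of_hit(hit_uv)
    dragon["health"] -= damage
    dragon.setdefault("scorched_regions", []).append(region)
    return dragon

dragon = {"name": "dragon", "health": 100}
apply_hit(dragon, (0.3, 0.8))  # a shot that lands on the head
print(dragon)  # {'name': 'dragon', 'health': 90, 'scorched_regions': ['head']}
```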
  • FIG. 11 depicts a flowchart of an example of a method 1100 for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
  • the method 1100 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 1100 may be optional, and that the method 1100 need not include all of the operations shown in FIG. 11.
  • the user location determination module 710 may determine a location of a game player of an augmented reality electronic shooting game.
  • the user location determination module 710 gathers user location information from a GPS receiver on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or using other techniques described herein.
  • the virtual object management module 720 may identify a virtual in-game object corresponding to virtual shooting targets to be rendered in a display used to display at least a portion of the augmented reality electronic game. In various embodiments, the virtual object management module 720 selects virtual in-game objects of virtual shooting targets for the augmented reality electronic game based on one or more factors, such as locations of game players and game states of the augmented reality electronic game. As an example, the virtual object management module 720 may select virtual in-game objects for game players based on specific locations of the game players in the physical world. As another example, the virtual object management module 720 may select virtual in-game objects for game players based on levels/points/etc. the game players have achieved in the augmented reality electronic game.
  • the virtual object perspective module 725 may identify a game player perspective of a game player in relation to the virtual shooting targets. In some implementations, the virtual object perspective module 725 gathers angles, orientations, distances, etc. from the game player to a projection of the virtual object. The virtual object perspective module 725 may further evaluate, based on parameters of the virtual object, how the virtual object would appear to the display of the game player if the virtual object were projected into the physical environment around the game player.
  • the virtual object rendering module 730 may render the virtual shooting targets in the display in accordance with the user perspective. More particularly, the virtual object rendering module 730 may instruct the display to display the virtual in-game object in accordance with the user perspective identified by the virtual object perspective module 725.
  • the interaction management module 735 may receive, through the emitter 120, user interaction with the virtual shooting targets in the augmented reality electronic game. For instance, the interaction management module 735 may receive an indication that a game player squeezed a trigger of the emitter 120 to shoot at the virtual shooting targets. The interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified (e.g., so that the virtual shooting targets can register hits against them).
  • the gameplay state management module 705 may modify an appearance of the virtual shooting targets based on the user interaction.
  • the gameplay state management module 705 may provide instructions to modify the virtual in-game object to the other modules of the gameplay display management module 625.
  • the modified state of the virtual in-game object may be stored in the virtual object datastore 755.
  • the virtual object rendering module 730 may render a modified virtual shooting target on the display based on the modified appearance.
  • the virtual object rendering module 730 may also instruct the display to display the modified virtual in-game object in accordance with the modifications.
  • the virtual object rendering module 730 may render virtual shooting targets that have been hit or have exploded as a result of being shot by the game player in the augmented reality electronic game.
  • FIG. 12 depicts an example of a digital device 1200, according to some embodiments.
  • the digital device 1200 comprises a processor 1205, a memory system 1210, a storage system 1215, a communication network interface 1220, an Input/output (I/O) interface 1225, a display interface 1230, and a bus 1235.
  • the bus 1235 may be communicatively coupled to the processor 1205, the memory system 1210, the storage system 1215, the communication network interface 1220, the I/O interface 1225, and the display interface 1230.
  • the processor 1205 comprises circuitry or any processor capable of processing the executable instructions.
  • the memory system 1210 comprises any memory configured to store data. Some examples of the memory system 1210 are storage devices, such as RAM or ROM.
  • the memory system 1210 may comprise the RAM cache.
  • data is stored within the memory system 1210. The data within the memory system 1210 may be cleared or ultimately transferred to the storage system 1215.
  • the storage system 1215 comprises any storage configured to retrieve and store data. Some examples of the storage system 1215 are flash drives, hard drives, optical drives, and/or magnetic tape.
  • the digital device 1200 includes a memory system 1210 in the form of RAM and a storage system 1215 in the form of flash memory. Both the memory system 1210 and the storage system 1215 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 1205.
  • the communication network interface (com. network interface) 1220 may be coupled to a data network.
  • the communication network interface 1220 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example.
  • the communication network interface 1220 may also support wireless communication (e.g., 802.11 a/b/g/n, WiMAX, LTE, 3G, 2G). It will be apparent to those skilled in the art that the communication network interface 1220 may support many wired and wireless standards.
  • the optional input/output (I/O) interface 1225 is any device that receives input from the user and outputs data.
  • the display interface 1230 is any device that may be configured to output graphics and data to a display. In one example, the display interface 1230 is a graphics adapter.
  • the hardware elements of the digital device 1200 are not limited to those depicted in FIG. 12. A digital device 1200 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein. In one example, encoding and/or decoding may be performed by the processor 1205 and/or a coprocessor located on a GPU.
  • FIG. 13A depicts an example of an augmented reality gaming system 1300, according to some embodiments.
  • the augmented reality gaming system 1300 may include a peripheral system 1305, a communications device 1310, and a gameplay system 1315.
  • the peripheral system 1305 may include any peripheral system, such as a receiver or an emitter, as discussed herein.
  • the peripheral system 1305 may correspond to one or more of the emitter 120 and/or the receiver 125, shown in FIG. 1.
  • the peripheral system 1305 may include a transmitter, a receiver, a lens, and other hardware to facilitate sensor-based mobile gaming.
  • the peripheral system 1305 may be paired to the communications device 1310, as discussed herein.
  • the peripheral system 1305 may be coupled to the communications device 1310.
  • the peripheral system 1305 is coupled to the communications device 1310 using a Bluetooth connection or other wireless connection.
  • the communications device 1310 may include any digital device, an example of which is the digital device 1200 shown in FIG. 12.
  • the communications device 1310 may correspond to the communications device 130, shown in FIG. 1.
  • the communications device 1310 may include a game application 1320, a peripheral API 1325, a platform API 1330, an API support layer 1335, and a mobile operating system 1340.
  • the game application 1320 may allow a user to engage in sensor-based mobile gaming as discussed herein. More specifically, the game application 1320 may include gameplay modules to facilitate sensor-based mobile gaming. In various embodiments, the game application 1320 may include modules corresponding to one or more of the user interface module 410 and the gameplay memory datastore 430, shown in FIG. 4. The game application 1320 may be implemented in any convenient format, including, in various embodiments, an iOS® mobile application or an Android® mobile application.
  • the peripheral API 1325 may support coupling the communications device 1310 to the peripheral system 1305. In some embodiments, the peripheral API 1325 is implemented as a Bluetooth or other wireless interface to the peripheral system 1305.
  • the peripheral API 1325 may correspond to some or all of the emitter interface module 415 and the receiver interface module 420, shown in FIG. 4.
  • the platform API 1330 may support coupling the communications device 1310 to the gameplay system 1315.
  • the platform API 1330 may be implemented as a bus, a network interface, or other interface.
  • the platform API 1330 may correspond to some or all of the gameplay cloud interface module 425, shown in FIG. 4.
  • the API support layer 1335 may support function calls used by the game application 1320, the peripheral API 1325, and the platform API 1330. In some embodiments, the API support layer 1335 may facilitate receiving and processing user interface inputs, such as gestures, swipes, and clicks. In an implementation, the API support layer 1335 comprises a Cocoa Touch® layer. It is noted the API support layer 1335 may also comprise Android API support layer(s) or other support layer(s) without departing from the scope and substance of the inventive concepts described herein.
  • the mobile operating system 1340 may comprise an operating system of the communications device 1310. In various embodiments, the mobile operating system 1340 may comprise an iOS® operating system or Android® operating system. It is noted the mobile operating system 1340 may comprise other forms of operating systems in some embodiments.
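The FIG. 13A stack can be pictured, loosely and in Python rather than a mobile toolchain, as a game application composed over two API wrappers; the class and method names below are assumptions made for illustration, not part of the disclosure.

```python
class PeripheralAPI:
    """Wraps the wireless link to the peripheral system (FIG. 13A, 1325)."""
    def read_event(self) -> dict:
        return {"source": "emitter", "trigger": True}  # stubbed peripheral event

class PlatformAPI:
    """Wraps the network link to the gameplay system (FIG. 13A, 1330)."""
    def post_event(self, event: dict) -> None:
        print(f"POST /events {event}")  # stand-in for a real network call

class GameApplication:
    """The game application 1320 sits on top of both APIs."""
    def __init__(self, peripheral: PeripheralAPI, platform: PlatformAPI):
        self.peripheral = peripheral
        self.platform = platform

    def tick(self) -> None:
        event = self.peripheral.read_event()
        if event.get("trigger"):
            self.platform.post_event(event)

GameApplication(PeripheralAPI(), PlatformAPI()).tick()
```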
  • the gameplay system 1315 may support sensor-based gaming by a user of the communications device 1310, as discussed herein.
  • the gameplay system 1315 may be coupled to the communications device 1310 using a network connection, such as an Internet connection.
  • the network connection may comprise a wireless network connection.
  • the gameplay system 1315 may also be coupled to the communications device 1310 over other convenient connections as known in the art.
  • FIG. 13B depicts an example of an augmented reality gaming system 1300, according to some embodiments.
  • the augmented reality gaming system 1300 may include a communications device 1310, a gameplay system 1315, and a user 1370.
  • the communications device 1310 may be coupled to the gameplay system 1315.
  • the communications device 1310 may correspond to the communications device 1310 in FIG. 13A.
  • the gameplay system 1315 may correspond to the gameplay system 1315 in FIG. 13A.
  • the gameplay system 1315 may include a web service module 1345, a web UI module 1350, a Ruby on Rails support module 1355, a cloud-based Platform as a Service (PaaS) module 1360, and a cloud-based storage module 1365.
  • the web service module 1345 may be coupled to the communications device 1310.
  • the web service module 1345 may provide sensor-based mobile gaming services, as described herein, as a web service to the communications device 1310.
  • the web UI module 1350 may be coupled to the user 1370.
  • the web UI module 1350 may provide an online portal to access an account associated with the user 1370.
  • the Ruby on Rails support module 1355 may allow the web service module 1345 and the web UI module 1350 to access the cloud-based PaaS module 1360 and the cloud-based storage module 1365.
  • the cloud-based PaaS module 1360 may provide PaaS to other modules.
  • the cloud-based storage module 1365 may provide cloud-based storage to the other modules.
  • the user 1370 may be any player that utilizes the system.
  • the user 1370 may represent a player seeking to access a web portal associated with sensor-based mobile gaming, as discussed herein.
  • the user 1370 may correspond to the player of the first communications device 130-1 or the Nth communications device 130-N, shown in FIG. 1.
  • FIG. 14 depicts an example of an augmented reality gaming environment 1400, according to some embodiments.
  • the virtual in-game object 140 is projected into the wearable optical device 135 of a game player.
  • the game player uses an emitter 120 to interact with the virtual in-game object 140 (e.g., by taking hits at the virtual in-game object 140 using an emitter modeled as a gun).
  • a virtual space 1405 is used to compare projections of the virtual in-game object 140 with locations of in-game interactions.
  • the virtual space 1405 may be used to compare whether the game player is shooting the emitter in the right direction and holding it sufficiently steady to register a hit on the virtual in-game object 140.
  • FIG. 15 depicts an example of an emitter 120 and a communications device 130 in an augmented reality gaming environment 1500, according to some embodiments. In this example, game information about interactions with a virtual in-game object may be relayed to the communications device 130.
  • FIG. 16 depicts an example of a display 135 and a communications device 130 in an augmented reality gaming environment 1600, according to some embodiments. In this example both the display 135 and the communications device 130 may display the virtual in-game object.
  • FIG. 17 depicts an example screen 1700 of an augmented reality electronic game, according to some embodiments.
  • the screen 1700 may include a new operation button 1705 that allows game players to start a new game, a join operation button 1710 that allows game players to join an existing game (e.g., a game that the game players or other game players have previously created), a missions button 1715 that allows access to missions that have been undertaken, and an active ops button 1720 that allows access to active operations underway in the augmented reality electronic game.
  • FIG. 18 depicts an example screen 1800 of an augmented reality electronic game, according to some embodiments.
  • the screen 1800 includes one or more statuses, such as a first status 1805 that identifies whether an emitter is active/coupled, and a second status 1810 that identifies whether a receiver is active/coupled.
  • the screen 1800 further includes a first button 1815 that provides the ability to scan QR codes, and a second button 1820 that allows a user to reset the user's account.
  • FIG. 19 depicts an example screen 1900 of an augmented reality electronic game, according to some embodiments.
  • the screen 1900 may correspond to a new operations screen (e.g., a screen that is displayed when the new operation button 1705 has been selected).
  • the screen 1900 may include an operation name box 1905, an operation type box 1910, an operation length box 1915, and a create button 1920.
  • FIG. 20 depicts an example screen 2000 of an augmented reality electronic game, according to some embodiments.
  • the screen 2000 may correspond to a join operations screen (e.g., a screen that is displayed when the join operation button 1710 has been selected).
  • the screen 2000 may include an operation join code box 2005, a scan QR code button 2010, and a join game button 2015.
  • the above-described functions and components may be comprised of instructions that are stored on a storage medium such as a computer readable medium.
  • the instructions may be retrieved and executed by a processor.
  • Some examples of instructions are software, program code, and firmware.
  • Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers.
  • the instructions are operational when executed by the processor to direct the processor to operate in accord with some embodiments. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
  • As used herein, a module may be hardware, software, or a combination of both. A module may further include firmware.
  • the language used herein has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope, which is set forth in the following claims.

Abstract

A communications device may be configured to support gameplay. A display may be configured to display a view of a physical environment proximate to the display, and to display at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment. An emitter may be configured to provide an emitter interaction signal related to a first in-game interaction with the virtual in-game object. A gameplay server may be configured to determine whether a projection of the in-game interaction overlaps a projection of the virtual in-game object in a virtual space, and to change a state of the virtual in-game object if the projection of the in-game interaction overlaps the projection of the virtual in-game object in the virtual space.

Description

INTERACTIVE GAMING USING WEARABLE OPTICAL DEVICES
TECHNICAL FIELD
[0001] The technical field relates to systems and methods for interactive gaming on digital devices. More particularly, the technical field relates to systems and methods for sensor-based interactive gaming using optical wearable devices.
BACKGROUND
[0002] Electronic games have long entertained many people. Many electronic games are hosted on personal computers or dedicated game consoles that have a processing or control unit, a display device, and a joystick, a keyboard, a mouse, trackpad, or other input device. The electronic games themselves typically relate to one or more genres, such as adventure genres, first-person shooting genres, automotive or aviation genres, role-playing or fantasy genres, sports genres, and collaborative social genres. The electronic games typically utilize gameplay, in-game objectives and virtual in-game objects (such as virtual characters, virtual items, virtual points, and video game levels) to facilitate competition or collaboration between one or more game players and a computer, and/or between two or more game players.
[0003] Though potentially informative and entertaining, conventional electronic games often do not effectively interface with the physical world. Systems and methods that allow game players to play electronic games while interfacing with the physical world would be desirable.
SUMMARY
[0004] Most electronic games do not effectively interface with the physical world. People may receive greater enjoyment from electronic games that allow them to interact with the physical world, particularly electronic games that augment the physical world with virtual elements. The systems, methods, and non-transitory computer-readable media described herein allow people to play electronic games that augment reality using sensor-based gaming hardware, wireless computing devices, and/or wearable optical devices. Locations of game players may be determined using the sensor-based gaming hardware, the wireless computing devices, the wearable optical devices, or some combination thereof. Depending on game players' locations, state(s) of the electronic game, and/or other factors, interactive virtual objects may be selected to augment game players' fields of vision. The game players may further interact with these virtual objects using the sensor-based gaming hardware, the wireless computing devices, the wearable optical devices, or some combination thereof. In various implementations, these interactive inputs may be used to change the state(s) of the electronic game, state(s) of the virtual objects, etc.
[0005] A communications device may be configured to support gameplay. A display may be configured to display a view of a physical environment proximate to the display, and to display at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment. An emitter may be configured to provide an emitter interaction signal related to a first in-game interaction with the virtual in-game object. A gameplay server may be configured to determine whether a projection of the in-game interaction overlaps a projection of the virtual in-game object in a virtual space, and to change a state of the virtual in-game object if the projection of the in-game interaction overlaps the projection of the virtual in-game object in the virtual space.
[0006] A method may comprise: supporting gameplay on a communications device, a display, and an emitter; displaying a view of a physical environment proximate to the display; displaying at a first location a virtual in-game object associated with gameplay, the first location being over the view of the physical environment; providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object; projecting the first location onto a first area in a virtual space; identifying a second area in the virtual space, the second area associated with the first in-game interaction; determining whether the second area overlaps with the first area in the virtual space; and changing a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
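By way of illustration only, the overlap determination recited above may be sketched as a simple area-intersection test in a two-dimensional virtual space. The rectangle representation, the names used, and the "hit" state below are illustrative assumptions rather than a required implementation.

```python
# Minimal sketch of the overlap determination recited above, assuming the
# projections are axis-aligned rectangles in a shared 2D virtual space.
# All names ("Rect", "VirtualObject", the "hit" state) are illustrative.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge in virtual-space coordinates
    y: float  # bottom edge
    w: float  # width
    h: float  # height

def overlaps(a: Rect, b: Rect) -> bool:
    """Standard axis-aligned rectangle intersection test."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

@dataclass
class VirtualObject:
    projection: Rect      # projection of the object's display location
    state: str = "intact"

def register_interaction(obj: VirtualObject, interaction: Rect) -> None:
    # Change the object's state only if the interaction's projection
    # overlaps the object's projection in the virtual space.
    if overlaps(obj.projection, interaction):
        obj.state = "hit"

# Example: a shot whose projection lands on the object registers as a hit.
target = VirtualObject(projection=Rect(10.0, 10.0, 4.0, 4.0))
register_interaction(target, Rect(12.0, 11.0, 1.0, 1.0))
assert target.state == "hit"
```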
[0007] A system may comprise: a communications device configured to support gameplay; means for displaying a view of a physical environment proximate to the display, and to display at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment; means for providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object; and a gameplay server coupled to the communications device, the gameplay server configured to: project the first location onto a first area in a virtual space; identify a second area in the virtual space, the second area associated with the first in-game interaction based on the emitter interaction signal; determine whether the second area overlaps with the first area in the virtual space; and change a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
[0008] In some implementations, the emitter interaction signal is based on one or more of: a squeeze of a trigger of a toy gun incorporating the emitter, a specified motion of a toy sword incorporating the emitter, and a specified motion of a toy wand incorporating the emitter.
[0009] A communications device interaction signal may be provided from the communications device, the communications device interaction signal being related to a second in-game interaction with the virtual in-game object. The communications device interaction signal may be based on one or more gestures into a user interface of the communications device.
[0010] In some implementations, the display is incorporated into a wearable optical device, and the display is configured to provide a display interaction signal, the display interaction signal being related to a second in-game interaction with the virtual in-game object. The display interaction signal may be based on one or more of: eye motion of a user of the wearable optical device, touch interaction with the wearable optical device, and voice input into the wearable optical device. The display interaction signal may be based on a tilt or a translation of the wearable optical device.
[0011] In various implementations, the gameplay server is configured to select the virtual in-game object for the display based on a location of the display. The gameplay server may be configured to select the virtual in-game object for the display based on a gameplay state of the gameplay.
[0012] The display may be incorporated into a head mounted device (HMD) or a heads-up display (HUD). The display may be incorporated into the communications device.
[0013] Other features and embodiments are apparent from the accompanying drawings and from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] FIG. 1A depicts an example of an augmented reality gaming system, with one or more wearable optical devices, according to some embodiments.
[0015] FIG. 1B depicts an example of an augmented reality gaming environment, according to some embodiments.
[0016] FIG. 1C depicts an example of an interior view of a wearable optical device, according to some embodiments.
[0017] FIG. 1D depicts an example of an interior view of a wearable optical device, according to some embodiments.
[0018] FIG. 1E depicts an example of an interior view of a wearable optical device, according to some embodiments.
[0019] FIG. 2 depicts an example of an emitter, according to some embodiments.
[0020] FIG. 3 depicts an example of a receiver, according to some embodiments.
[0021] FIG. 4 depicts an example of a communications device, according to some embodiments.
[0022] FIG. 5 depicts an example of a wearable optical device, according to some embodiments.
[0023] FIG. 6 depicts an example of a gameplay system, according to some embodiments.
[0024] FIG. 7 depicts an example of a gameplay display management module, according to some embodiments.
[0025] FIG. 8 depicts a flowchart of an example of a method for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
[0026] FIG. 9 depicts a flowchart of an example of a method for rendering a virtual object in an augmented reality electronic game, according to some embodiments.
[0027] FIG. 10 depicts a flowchart of an example of a method for modifying a state of a virtual object in an augmented reality electronic game, according to some embodiments.
[0028] FIG. 11 depicts a flowchart of an example of a method for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments.
[0029] FIG. 12 depicts an example of a digital device, according to some embodiments.
[0030] FIG. 13A depicts an example of an augmented reality gaming system, according to some embodiments.
[0031] FIG. 13B depicts an example of an augmented reality gaming system, according to some embodiments.
[0032] FIG. 14 depicts an example of an augmented reality gaming environment, according to some embodiments.
[0033] FIG. 15 depicts an example of an emitter and a communications device in an augmented reality gaming environment, according to some embodiments.
[0034] FIG. 16 depicts an example of a display and a communications device in an augmented reality gaming environment, according to some embodiments.
[0035] FIG. 17 depicts an example screen of an augmented reality electronic game, according to some embodiments.
[0036] FIG. 18 depicts an example screen of an augmented reality electronic game, according to some embodiments.
[0037] FIG. 19 depicts an example screen of an augmented reality electronic game, according to some embodiments.
[0038] FIG. 20 depicts an example screen of an augmented reality electronic game, according to some embodiments.
DETAILED DESCRIPTION
[0039] Example Augmented Reality Gaming System 100A and Augmented Reality Gaming Environment 100B.
[0040] FIG. 1A depicts an example of an augmented reality gaming system 100A, with one or more wearable optical devices, according to some embodiments. The augmented reality gaming system 100A includes a plurality of player environments 105 (illustrated in FIG. 1A as a first player environment 105-1 through an Nth player environment 105-N), a network 110, and a gameplay system 115.
[0041] The first player environment 105-1 may comprise one or more devices associated with a first person or set of persons. The first player environment 105-1 may include a first emitter 120-1, a first receiver 125-1, a first communications device 130-1, and a first wearable optical device 135-1. The first emitter 120-1, the first receiver 125-1, and/or the first wearable optical device 135-1 may be coupled to the first communications device 130-1. The coupling may use any known or convenient format (a Bluetooth® connection (e.g., a Bluetooth Low Energy® connection), an 802.11 connection, a cellular connection, a bus, wire, or wires, etc.). As discussed further herein, the first emitter 120-1, the first receiver 125-1, and the first communications device 130-1 may be used by a first player to engage in augmented reality electronic gameplay.
[0042] The first emitter 120-1 may comprise a digital device having a transmitter that emits an emitter signal to a receiver. A digital device, as used herein, may comprise any device having a processor and a memory. A digital device may comprise some or all of the components of the digital device 1200, shown in FIG. 12. The emitter signal may comprise one or more of a variety of electromagnetic signals. In various embodiments, the emitter signal may include an infrared signal, a Near Field Communications (NFC) signal, etc. The emitter signal may comprise a beam that is directed at the receiver. The beam may be encoded with a unique identifier corresponding to the first emitter 120-1. The first emitter 120-1 may provide information related to the emitter signal to the first communications device 130-1. The first emitter 120-1 may be controlled by the first communications device 130-1. The first emitter 120-1 may be incorporated into a modular peripheral device, that is, a device that is provided using a hardware development kit. An example of a hardware development kit includes a set of plans that players can print on a three-dimensional (3D) printer using a template in the kit.
[0043] In some embodiments, the first emitter 120-1 may take the form of a weapon used in augmented reality electronic gameplay. For example, the first emitter 120-1 may be a gun, a bow, a sword, a wand, a grenade, or other weapon. The first emitter 120-1 may have an interaction recognition mechanism that recognizes interactions with the first emitter 120-1 and/or instructs the transmitter of the first emitter 120-1 to emit the emitter signal. The interaction recognition mechanism may have a variety of forms.
[0044] As various examples, the interaction recognition mechanism may comprise: a shoot mechanism corresponding to a trigger on a gun, or a motion recognition mechanism that recognizes body movements that correspond to motions taken by a user of a bow, a sword, a wand, a grenade, etc. As additional examples, in embodiments where the first emitter 120-1 comprises a gun, the interaction recognition mechanism may appear as a finger-based trigger. When the finger-based trigger is activated, the first emitter 120-1 may emit the emitter signal. As another example, in embodiments where the first emitter 120-1 comprises a grenade, the interaction recognition mechanism may appear as a grenade clip that instructs emission of the emitter signal after expiration of a specified time. It is noted the first emitter 120-1 need not have an interaction recognition mechanism, and may emit the emitter signal upon occurrence of any number of specified events. It is further noted, in various embodiments, the first emitter 120-1 need not take the form of a weapon, and may instead take some other form. For instance, in some embodiments, the first emitter 120-1 may take the form of a search device used in scavenger-hunting gameplay. In various embodiments, the first emitter 120-1 may be wearable. For example, the first emitter 120-1 may be integrated into a piece of clothing to be worn on a player.
[0045] Further, in some embodiments, the first emitter 120-1 may include hardware, software, and/or firmware to trigger data export to the first communications device 130-1 at various times, including: when the first emitter 120-1 is initially coupled to the first communications device 130-1, when a player has taken an action on the first emitter 120-1, and when the first emitter 120-1 is decoupled from the first communications device 130-1. In various embodiments, the first emitter 120-1 may have some or all of the components of the emitter 120, shown in FIG. 2.
[0046] The first receiver 125-1 may comprise a digital device configured to receive an emitter signal. The first receiver 125-1 may receive the emitter signal from an emitter associated with another player (e.g., the Nth emitter 120-N). If the emitter signal is encoded with the identity of an emitter, the first receiver 125-1 may decode the emitter signal. The first receiver 125-1 may provide to the first communications device 130-1 a receiver signal corresponding to the received emitter signal. The first receiver 125-1 may be controlled by the first communications device 130-1. The first receiver 125-1 may be incorporated into a modular peripheral device.
[0047] In particular embodiments, the first receiver 125-1 may have a form compatible with augmented reality electronic gameplay. More specifically, the first receiver 125-1 may be configured to register in-game actions, such as shots, hits, outcomes of spells, etc. In gameplay where the first emitter 120-1 is configured as a gun, for instance, the first receiver 125-1 may be configured to receive a beam from the first emitter 120-1. In gameplay where the first emitter 120-1 is configured as a sword, the first receiver 125-1 may be configured as a tunic or other wearable item configured to receive a touch by the first emitter 120-1, in one example. In gameplay where the first emitter 120-1 is configured as a grenade, the first receiver 125-1 may be configured to receive emitter signals from an approximate point source corresponding to the location of the first emitter 120-1. In gameplay where the first emitter 120-1 is configured for scavenger hunt games, the first receiver 125-1 may include an identifier, such as a Quick Response (QR) Code, that facilitates access to items in gameplay. In combat-based gameplay embodiments, the first receiver 125-1 may provide such an identifier. In gameplay defined by a geographical region (e.g., gameplay having geo-fenced boundaries), the first receiver 125-1 may comprise a disk, puck, biscuit, etc. that is placed within the geographical region and receives information related to position from the first emitter 120-1. As examples, the first receiver 125-1 may include BLE or Wi-Fi hardware that allows distance to the first emitter 120-1 to be determined with a specified degree of accuracy.
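By way of illustration only, the distance determination mentioned above can be sketched with a log-distance path-loss model applied to received signal strength; the reference power and path-loss exponent below are assumed calibration constants, not values specified herein.

```python
# Illustrative sketch of distance estimation from received signal strength
# using a log-distance path-loss model. The reference power at 1 m
# (tx_power_dbm) and the path-loss exponent (n) are assumed calibration
# constants; a real deployment would tune them per venue and per radio.
def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,
                        n: float = 2.0) -> float:
    """Return an approximate emitter-to-receiver distance in meters."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

# Example: a reading 20 dB below the 1 m reference suggests roughly 10 m.
print(round(estimate_distance_m(-79.0), 1))  # -> 10.0
```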
[0048] In various embodiments, the first receiver 125-1 may trigger data export to the first communications device 130-1 at various times, including: when the first receiver 125-1 is initially coupled to the first communications device 130-1, when the first receiver 125-1 has indicated some action (e.g., a valid hit) has been taken on the first receiver 125-1, and when the first receiver 125-1 is decoupled from the first communications device 130-1. In various embodiments, the first receiver 125-1 may have some or all of the components of the receiver 125, shown in FIG. 3.
[0049] The first communications device 130-1 may comprise a digital device configured to control the first emitter 120-1, the first receiver 125-1, and/or the first wearable optical device 135-1. In various embodiments, the first communications device 130-1 may be one or more of: a mobile phone, a tablet computing device, a desktop computer, a laptop computer, or other digital device. The first communications device 130-1 may have some or all of the components of the communications device 130, shown in FIG. 4.
[0050] In some embodiments, the first communications device 130-1 supports augmented reality electronic gameplay using the first emitter 120-1, the first receiver 125-1, the gameplay system 115, and/or the first wearable optical device 135-1. The first communications device 130-1 may receive emitter signals from the first emitter 120-1. The first communications device 130-1 may further receive the receiver signal from the first receiver 125-1. In various embodiments, the first communications device 130-1 may provide the first player with an application that presents augmented reality electronic gameplay. The application may include data, services, and other information obtained from the gameplay system 115. The application may have been downloaded from an application store or installed using other methodologies. The application may support in-game purchases and/or in-game advertising. In various embodiments, through the use of geo-fencing, the application may give any venue (retail stores, restaurants, stadiums, movie theaters, etc.) the ability to run promotions, drive advertisement revenue, and encourage social sharing of their brand through the player's game app on their phone.
[0051] Though FIG. 1A shows the first communications device 130-1 associated with a first player, it is noted, in various embodiments, the first communications device 130-1 need not be associated with a human being. Rather, in various embodiments, the first communications device 130-1 may be associated with and/or controlled by a digital device. The first communications device 130-1 may be controlled by an inanimate entity that, in turn, receives instructions from the gameplay system 115. In such embodiments, the first emitter 120-1 and/or the first receiver 125-1 may be associated with the inanimate entity. Taking the example of a scavenger hunt game, the first receiver 125-1 may correspond to an inanimate object that is to be discovered as an object of gameplay.
[0052] In some embodiments, the first communications device 130-1 may not have access or may have only limited access to the network 110 while gameplay is underway. For example, the first communications device 130-1 may not have access to a cellular or Wi-Fi network during augmented reality electronic gameplay. In these embodiments, the first communications device 130-1 may cache or otherwise store data associated with the augmented reality electronic gameplay and provide the data to the gameplay system 115 when there is connectivity or sufficient connectivity to the network 110.
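As an illustrative sketch of this cache-and-forward behavior, gameplay events can be queued locally and flushed when connectivity returns. The event format and the post_to_server callable below are hypothetical, not part of this description.

```python
# Sketch of the cache-and-forward behavior described above. Gameplay events
# are queued locally while the network is unavailable and flushed to the
# gameplay system once connectivity returns. The event format and the
# post_to_server callable are hypothetical.
import json
from collections import deque
from typing import Callable

class GameplayEventCache:
    def __init__(self, post_to_server: Callable[[str], bool]):
        self._queue = deque()     # oldest events first
        self._post = post_to_server

    def record(self, event: dict) -> None:
        # Always cache locally first; gameplay never blocks on the network.
        self._queue.append(json.dumps(event))

    def flush(self) -> int:
        """Try to upload cached events in order; stop at the first failure."""
        sent = 0
        while self._queue:
            if not self._post(self._queue[0]):
                break             # still offline; retry on the next flush
            self._queue.popleft()
            sent += 1
        return sent

# Example with a stub transport that is currently "offline".
cache = GameplayEventCache(post_to_server=lambda payload: False)
cache.record({"type": "hit", "emitter": "120-1", "receiver": "125-N"})
print(cache.flush())  # -> 0; the event stays cached until connectivity returns
```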
[0053] The first wearable optical device 135-1 may comprise a digital device configured to display virtual objects to the first user. A "virtual object," as used herein, may refer to any object that is displayed on a display of a digital device and that is not part of the physical world. Virtual objects may include portions of a graphical user interface (GUI), such as menus, radio buttons, text fields, visible web and/or application components, or the like. Virtual objects may, but need not, comprise virtual in-game objects, such as elements of an electronic game that change state in response to a user's inputs/interactions. Examples of virtual in-game objects further include virtual characters, virtual items, virtual points, game levels, or the like that are part of gameplay of an electronic game.
[0054] In some embodiments, the first wearable optical device 135-1 renders virtual objects onto a display. The display may be transparent, translucent, opaque, etc. In implementations where the display is transparent or translucent, the first wearable optical device 135-1 may superimpose virtual objects over a first perspective of the physical world. In implementations where the display is opaque, the first wearable optical device 135-1 may include or be coupled to external cameras (e.g., depth-sensing cameras) or other hardware configured to provide the first user with the first perspective. In such implementations, the display may superimpose virtual objects over representations (images, video, streaming video, etc.) of the physical world. In various embodiments, the first wearable optical device 135-1 provides one or more of augmented reality and virtual reality to the first user. Example embodiments of the first wearable optical device 135-1 include an Optical Head Mounted Display (e.g., a heads up display (HUD)), or an optical device mounted on or coupled to some portion of the first user's body or clothing. In various embodiments, the first wearable optical device 135-1 has some or all of the components of the wearable optical device 135, shown in FIG. 5.
[0055] Though FIG. 1A and portions of the description herein may describe the first wearable optical device 135-1 as separate from the first emitter 120-1, the first receiver 125-1, and the first communications device 130-1, it is noted that in various implementations, the wearable optical device 135-1 may be part of or connected to the first emitter 120-1, the first receiver 125-1, or the first communications device 130-1. For example, in some embodiments, the first wearable optical device 135-1 may include at least a portion of the display of the first communications device 130-1. As another example, in various embodiments, the functionalities of the first communications device 130-1 may be incorporated into (e.g., embedded in circuitry within) the first wearable optical device 135-1. It is noted that in various embodiments, the first wearable optical device 135-1 may also reside within one or more of the first emitter 120-1 and the first receiver 125-1.
[0056] The Nth player environment 105-N represents a set of devices associated with an Nth person or set of persons. It is noted the letter "N" represents an arbitrary number, and may correspond to any integer greater than 1. The Nth player environment 105-N comprises an Nth emitter 120-N, an Nth receiver 125-N, an Nth communications device 130-N, and an Nth wearable optical device 135-N. In various embodiments, the Nth emitter 120-N may be similar to the first emitter 120-1, discussed herein. The Nth receiver 125-N may be similar to the first receiver 125-1, discussed herein. The Nth communications device 130-N may be similar to the first communications device 130-1. The Nth wearable optical device 135-N may be similar to the first optical wearable device 135-1, discussed herein. In some embodiments, the devices in the Nth player environment 105-N engage in augmented reality electronic gameplay with the devices in the first player environment 105-1.
[0057] The network 110 may comprise a computer network. The network 110 may include technologies such as Ethernet, 802.11x, worldwide interoperability for microwave access (WiMAX), 2G, 3G, 4G, CDMA, GSM, LTE, digital subscriber line (DSL), and/or the like. The network 110 may further include networking protocols such as multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), User Datagram Protocol (UDP), hypertext transport protocol (HTTP), simple mail transfer protocol (SMTP), file transfer protocol (FTP), and/or the like. The data exchanged over the network 110 can be represented using technologies and/or formats including hypertext markup language (HTML) and extensible markup language (XML). In addition, all or some links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL), transport layer security (TLS), and Internet Protocol security (IPsec). The network 110 may be coupled to the first communications device 130-1, to the Nth communications device 130-N, and to the gameplay system 115. In various embodiments, though not shown in FIG. 1A, the network 110 may be coupled to one or more of the first emitter 120-1, the first receiver 125-1, the Nth emitter 120-N, and the Nth receiver 125-N.
[0058] The gameplay system 115 may comprise one or more digital devices configured to support processes, applications, etc. on the communication devices 130. The gameplay system 115 may include dedicated, shared, or distributed servers. In some implementations, the gameplay system 115 supports augmented reality electronic gameplay by the communications devices 130. In various embodiments, the gameplay system 115 may facilitate creation of new games, and/or may manage player accounts. The gameplay system 115 may also allow for the management of aspects of existing electronic games. For instance, in some embodiments, the gameplay system 115 may track successful or unsuccessful actions by emitters associated with players. The gameplay system 115 may provide to communications devices whether an action by a particular emitter successfully registered at a particular receiver. The gameplay system 115 may further provide instructions to the wearable optical devices 135 to display virtual objects. In some embodiments, the gameplay system 115 may have some or all of the components of the gameplay system 115 shown in FIG. 6.
[0059] FIG. 1A depicts a first emitter 120-1 through an Nth emitter 120-N, a first receiver 125-1 through an Nth receiver 125-N, a first communications device 130-1 through an Nth communications device 130-N, and a first wearable optical device 135-1 through an Nth wearable optical device 135-N, in order to illustrate various implications of multiple players of sensor-based mobile gameplay. However, it is noted portions of the discussion herein refer to an "emitter 120" or "emitters 120," a "receiver 125" or "receivers 125," a "communications device 130" or "communication devices 130," and a "wearable optical device 135" or "wearable optical devices 135" for simplicity.
[0060] FIG. 1B depicts an example of an augmented reality gaming environment 100B, according to some embodiments. The augmented reality gaming environment 100B may include the first player environment 105-1 (having therein the first emitter 120-1, the first receiver 125-1, the first communications device 130-1, and the first wearable optical device 135-1) and the Nth player environment 105-N (having therein the Nth emitter 120-N, the Nth receiver 125-N, the Nth communications device 130-N, and the Nth wearable optical device 135-N). The augmented reality gaming environment 100B may further include a virtual in-game object 140 that is displayed on the first wearable optical device 135-1 and the Nth wearable optical device 135-N but is not present in the physical world. The virtual in-game object 140 may be seen by the first wearable optical device 135-1 at a first perspective 145-1, and may be seen by the Nth wearable optical device 135-N at an Nth perspective 145-N. The augmented reality gaming environment 100B may be defined by geo-fences 150. Each of the geo-fences 150 may limit the areas in which the augmented reality electronic game can be played.
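By way of illustration only, the geo-fence containment test implied above can be sketched as a point-in-polygon check. Modeling the geo-fences 150 as a polygon of coordinates and treating latitude/longitude as planar values are simplifying assumptions suitable only for small play areas.

```python
# Illustrative geo-fence check: a play area is modeled as a polygon of
# (latitude, longitude) vertices and a player position is tested with a
# standard ray-casting point-in-polygon routine.
from typing import List, Tuple

Point = Tuple[float, float]

def inside_geofence(p: Point, fence: List[Point]) -> bool:
    x, y = p
    inside = False
    j = len(fence) - 1
    for i in range(len(fence)):
        xi, yi = fence[i]
        xj, yj = fence[j]
        # Count crossings of a ray extending to the right of the point;
        # an odd number of crossings means the point is inside.
        if (yi > y) != (yj > y):
            if x < (xj - xi) * (y - yi) / (yj - yi) + xi:
                inside = not inside
        j = i
    return inside

# Example: a square play area.
fence = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
print(inside_geofence((0.5, 0.5), fence))  # True  (in bounds)
print(inside_geofence((1.5, 0.5), fence))  # False (outside the geo-fence)
```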
[0061] Although the virtual in-game object 140 is depicted as a ball, it will be appreciated that the virtual in-game object 140 may be any creature (e.g., alien, human, animal, dragon, or the like), animated object, or inanimate object. There may be any number of virtual in-game objects 140 in the augmented reality gaming environment 100B.
[0062] FIG. 1C depicts an example of an interior view of a wearable optical device 135, according to some embodiments. The interior view in FIG. 1C includes a virtual inventory 155 of virtual items and a menu 160 for selecting actions. Each of the virtual inventory 155 and the menu 160 may be formed from virtual objects for an augmented reality electronic game.
[0063] FIG. 1D depicts an example of an interior view of a wearable optical device 135, according to some embodiments. The interior view in FIG. 1D includes a virtual health monitor 165, a virtual map 170, and a notification object 175. The virtual health monitor 165 may depict the health of a game player in an augmented reality electronic game; the virtual map 170 may depict a map of a virtual world in the augmented reality electronic game; and the notification object 175 may provide notifications related to the augmented reality electronic game. Each of the virtual health monitor 165, the virtual map 170, and the notification object 175 may be formed from virtual objects for an augmented reality electronic game.
[0064] FIG. 1E depicts an example of an interior view of a wearable optical device 135, according to some embodiments. The interior view in FIG. 1E includes a representation of the virtual in-game object 140 and the virtual inventory 155. The representation of the virtual in-game object 140 and the virtual inventory 155 may be formed from virtual objects for an augmented reality electronic game.
[0065] Example Operation of the Augmented Reality Gaming System 100A within the Augmented Reality Gaming Environment 100B.
[0066] In various embodiments, the augmented reality gaming system 100A allows one or more game players to play augmented reality electronic games that are supported by the data available over the network 110 (e.g., over the Internet). The augmented reality electronic games may comprise forms of alternate reality gaming in which aspects of the physical world are incorporated into mobile gameplay, and/or in which the physical world is augmented with virtual in-game objects 140 from the electronic game. The gaming experience provided by the augmented reality gaming system 100A may provide new dimensions to outdoor games by leveraging smartphone technologies and the Internet, and bridging conventional gaming divides between the real world and digital worlds by combining physical participation, geolocational data, social networking data, and elements of games (such as action and/or role-playing games). The gameplay system 115 may also provide messaging and/or social media capabilities for players to communicate with each other. The augmented reality electronic game may be developed using a Game Development Kit (GDK).
[0067] Augmented reality electronic games supported by the augmented reality gaming system 100A and/or the augmented reality gaming environment 100B may include actions game players take against each other as well as actions game players take against virtual in-game objects 140 rendered in wearable optical device(s) 135. In some implementations, the augmented reality electronic games may allow game players to use emitter(s) 120 to register hits against receiver(s) 125 (e.g., combat or adventure genres that allow players to simulate battles with one another). In a combat game, for instance, players may use emitters to attempt in-game actions, and receivers to register successful in-game actions. In such a game, the first emitter 120-1 may emit an emitter signal toward the Nth receiver 125-N each time the first player attempts to attack the Nth player. The in-game actions may correspond to a gun being shot, a sword being swung, or a grenade being launched. Emitter signals from the first emitter 120-1 may be encoded with the identity of the first emitter 120-1. The first emitter 120-1 may provide the first communications device 130-1 with information about in-game action attempts. By using emitters and receivers to register game actions, the augmented reality gaming system 100A may allow players to verify the actions of other players. Players need not wonder whether, for instance, the first emitter 120-1 accurately took an action with respect to the Nth receiver 125-N. More specifically, the augmented reality gaming system 100A may allow users to use technologies such as geolocational technologies, infrared technologies, and data available over the network 110 to provide real-time feedback of gameplay between players.
[0068] In some implementations, the Nth receiver 125-N may register successful in-game actions each time the emitter signal successfully contacts the Nth receiver 125-N. For each successful in-game action, the Nth receiver 125-N may decode received emitter signals as needed. The Nth receiver 125-N may further provide information about successful in-game actions to the Nth communications device 130-N, which in turn may provide this information to the gameplay system 115. In these embodiments, the gameplay system 115 may provide information about the in-game actions, whether successful or not, to the first communications device 130-1 and the Nth communications device 130-N. The first communications device 130-1 and the Nth communications device 130-N may update user interface elements thereon accordingly.
[0069] In some implementations, the augmented reality electronic games supported by the augmented reality gaming system 100A and/or the augmented reality gaming environment 100B may render the virtual in-game objects 140 in game players' wearable optical device(s) 135 and may allow game players to take actions against the virtual in-game objects 140. More specifically, the gameplay system 115 may determine the location of a game player using one or more location determination techniques. One example of location determination techniques that may be employed includes obtaining the game player's location through Global Positioning System (GPS) coordinates on a communications device 130. Another example of location determination techniques that may be employed includes placing physical sensors in one or more of the receiver(s) 125, and identifying locations of emitter(s) 120 within a geo-fenced region around those physical sensors (e.g., the region within the geo-fences 150). In various embodiments, the physical sensors may determine attributes such as altitude, distance, angular orientation, etc. of the emitter(s) 120 within the geo-fenced region. Yet another example of location determination techniques includes placing beacons (e.g., BLE beacons) within a geo-fenced region and using proximity of emitter(s) 120 to beacons to determine locations of game players. It is noted that some combination of these techniques may be employed in various implementations.
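As one non-limiting sketch of the beacon-based technique listed above, a player's position can be approximated as an inverse-distance weighted centroid of nearby beacon coordinates. The weighting scheme and the coordinates used below are illustrative assumptions; a practical system would typically fuse this estimate with GPS and sensor data.

```python
# Sketch of beacon-proximity localization: nearer beacons (smaller
# estimated distance) receive larger weights in a weighted centroid.
from typing import List, Tuple

def weighted_centroid(beacons: List[Tuple[float, float]],
                      distances_m: List[float]) -> Tuple[float, float]:
    # Clamp very small distances to avoid division by zero.
    weights = [1.0 / max(d, 0.1) for d in distances_m]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by) in zip(weights, beacons)) / total
    return (x, y)

# Example: three beacons, player closest to the first.
print(weighted_centroid([(0, 0), (10, 0), (0, 10)], [1.0, 8.0, 9.0]))
```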
[0070] Moreover, the gameplay system 115 may select virtual in-game objects 140 to render in wearable optical device(s) 135. The selection of virtual in-game objects 140 may depend on a variety of factors, such as a state of gameplay and the location of a game player in the physical world. As examples, the gameplay system 115 may select virtual in-game objects 140 based on a level a game player is encountering in an augmented reality electronic game, the status of the game player within the level, the health of the game player, the number of points or virtual items the game player has earned, the status, levels, etc. of another player in the augmented reality electronic game, etc. As yet another example, the gameplay system 115 may select virtual items such as graphical elements that represent a game player's health, points, and virtual goods if these virtual items are associated with a gameplay status of the game player at a given time and/or physical location. As another example, the gameplay system 115 may select a virtual in-game object 140 corresponding to a three-dimensional representation of a dragon if game players in an augmented reality electronic game are to fight a dragon as part of gameplay.
[0071] In some embodiments, the gameplay system 115 renders the selected virtual in-game objects 140 in wearable optical device(s) 135. The rendering of virtual in-game objects 140 may depend on a variety of factors, such as a state of gameplay and a perspective of a game player viewing the virtual in-game object 140 through a wearable optical device associated with the game player. To continue the foregoing examples, the gameplay system 115 may render perspectives of virtual in-game objects 140 based on a level a game player is encountering in an augmented reality electronic game, the status of a game player within the level, the health of the game player, the number of points or virtual items the game player has earned, the status, levels, etc. of another player in the augmented reality electronic game, etc. Further, the gameplay system 115 may render portions of a three-dimensional representation of a dragon that game players are expected to see based on an estimated perspective(s) of the game players. In a multi-player game, as a result, the gameplay system 115 may render multiple perspectives of the dragon; each of the multiple perspectives may depend on angles, distances, etc. between game players and the coordinates of the dragon in augmented reality. In various implementations, the gameplay system 115 accesses Computer Aided Design (CAD) files (e.g., Unity® files) related to in-game objects for rendering into wearable optical device(s) 135.
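By way of illustration only, per-player perspective selection can be sketched as computing, for each game player, the bearing and distance from that player's position and heading to the virtual object's world anchor; two players viewing the same anchor then receive different renderings. The flat-plane math and sign conventions below are assumptions.

```python
# Illustrative sketch of per-player perspective rendering: each player sees
# the same world-anchored virtual object, but its on-display placement is
# derived from that player's own position and heading.
import math

def object_placement(player_xy, player_heading_deg, object_xy):
    """Return (bearing relative to gaze in degrees, distance) for rendering.

    Positive relative bearing means the object is to the player's right.
    """
    dx = object_xy[0] - player_xy[0]
    dy = object_xy[1] - player_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy))           # 0 deg = +y axis
    relative = (bearing - player_heading_deg + 180) % 360 - 180
    return relative, distance

# Two players viewing the same dragon anchor see different perspectives.
print(object_placement((0, 0), 0.0, (0, 10)))      # dead ahead, 10 units away
print(object_placement((10, 10), 180.0, (0, 10)))  # 90 deg to the right, 10 units
```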
[0072] The gameplay system 115 may allow game players to interact with the virtual in-game object 140 by taking one or more actions against the virtual object. More specifically, in some embodiments, the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using the emitter interaction mechanism on the emitter 120. Examples of such actions may correspond to shooting of a gun, swinging of a sword, making a motion corresponding to casting a spell using a wand, throwing a grenade, and picking up an item during a scavenger hunt. In various embodiments, the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using gestures or other user input on the communications device 130. Examples of such actions include switching weapons or reloading a weapon using radio buttons on the graphical user interface of the communications device 130. In some embodiments, the gameplay system 115 allows game players to take actions against virtual in-game objects 140 using the wearable optical device 135. Examples of such actions include voice commands, touch gestures on hardware on the wearable optical device 135, eye movements that are tracked by the wearable optical device 135, and motions detected by the wearable optical device 135 (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.).
[0073] The gameplay system 115 may register the game player's actions against the virtual in-game object 140 by recording the actions against the virtual object. The gameplay system 115 may further modify the state of the virtual in-game object 140 based on the actions. To continue the foregoing examples, in augmented reality electronic games involving virtual in-game objects 140 corresponding to representations of dragons, successful "hits" by the emitter 120 may be registered as injuries to the dragon. In response to such hits, the gameplay system 115 may render the dragon in a diminished capacity. As yet another example, if a game player could not defend against an attack by the dragon, the gameplay system 115 may reduce a virtual representation of the game player's in-game health.
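A minimal sketch of such state modification follows, assuming hypothetical Dragon and Player classes; the health values and render states are illustrative, not prescribed by this description.

```python
# Minimal sketch of the state changes described above: successful emitter
# "hits" reduce a virtual dragon's health and drive its render state, and a
# missed defense reduces the game player's in-game health.
from dataclasses import dataclass

@dataclass
class Dragon:
    health: int = 100

    def register_hit(self, damage: int) -> str:
        self.health = max(0, self.health - damage)
        # The returned render state would drive the wearable display.
        if self.health == 0:
            return "defeated"
        return "diminished" if self.health < 50 else "full_strength"

@dataclass
class Player:
    health: int = 100

    def take_attack(self, damage: int, defended: bool) -> None:
        if not defended:
            self.health = max(0, self.health - damage)

dragon, player = Dragon(), Player()
print(dragon.register_hit(60))        # -> "diminished"
player.take_attack(25, defended=False)
print(player.health)                  # -> 75
```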
[0074] Example Emitter 120.
[0075] FIG. 2 depicts an example of an emitter 120, according to some embodiments. The emitter 120 may include a communications interface module 205, an emitter interaction mechanism 210, a speaker 215, a short-range infrared transmitter 220, a long-range infrared transmitter 225, a beam encoder module 230, and a controller 235. The emitter 120 may include sensors and/or components not identified explicitly in FIG. 2.
[0076] The communications interface module 205 may facilitate communications between the emitter 120 and the communications device 130. In an embodiment, the communications interface module 205 facilitates pairing between the emitter 120 and the communications device 130. In various embodiments, the communications interface module 205 may be configured as a Bluetooth® pairing module that allows the emitter 120 to be wirelessly coupled to the communications device 130. The communications interface module 205 may also include any wireless or wired network hardware and/or software in various embodiments. The communications interface module 205 may receive instructions from the controller 235.
[0077] The emitter interaction mechanism 210 may allow a player to initiate an action. In some embodiments, the emitter interaction mechanism 210 may correspond to a trigger of a gun. The emitter interaction mechanism 210 may also correspond to a portion (e.g., a blade portion) of a sword or a grenade, depending on a type of weapon the emitter 120 is intended to model. The emitter interaction mechanism 210 may also correspond to a portion of a metal detector for a scavenger-hunt game. The emitter interaction mechanism 210 may provide a signal to the controller 235 when an action has been initiated.
[0078] The speaker 215 may provide an audible sound. In various embodiments, the speaker 215 may provide sounds related to sensor-based mobile gameplay when the emitter interaction mechanism 210 has been activated. The sound may correspond to the nature of the action initiated. For instance, the speaker 215 may provide sounds similar to the shooting of a gun, the clash of a sword on armor, or the explosion of a grenade. In various embodiments, the speaker 215 may provide in-game information such as in-game sounds, story narration, clues, and/or other information to enhance gameplay experiences. The speaker 215 may receive instructions from the controller 235.
[0079] The short-range infrared transmitter 220 and the long-range infrared transmitter 225 may each emit an infrared signal corresponding to an emitter signal. The short-range infrared transmitter 220 and the long-range infrared transmitter 225 may have different ranges, or may have partially overlapping ranges. The short-range infrared transmitter 220 and the long-range infrared transmitter 225 may provide infrared signals in response to the emitter interaction mechanism 210. The short-range infrared transmitter 220 and the long-range infrared transmitter 225 may receive instructions from the controller 235. It is noted that one or more of the short-range infrared transmitter 220 and the long-range infrared transmitter 225 may be replaced or augmented by non-infrared technologies, such as other wireless technologies and/or NFC technologies, without departing from the scope and substance of the inventive concepts herein.
[0080] The beam encoder module 230 may encode emitter signals with an identifier corresponding to the identity of the emitter 120. In some embodiments, the beam encoder module 230 may receive a unique identifier of the emitter 120 from the controller 235. The beam encoder module 230 may further encode emitter signals with the unique identifier. Encoding may involve frequency selection, frequency modulation of the emitter signal, or encoding particular sequences of data into the emitter signal from the emitter 120. The beam encoder module 230 may provide the code to the short-range infrared transmitter 220 and/or the long-range infrared transmitter 225.
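As a non-limiting sketch of the "particular sequences of data" approach, the emitter's unique identifier can be serialized into a train of infrared mark/space pulse durations, a common technique for infrared links. The specific timings and the 16-bit frame layout below are assumptions, not a scheme specified herein.

```python
# Illustrative sketch of encoding an emitter's unique identifier into the
# beam as a sequence of (mark_us, space_us) pulse durations. The concrete
# timings are assumptions chosen to resemble common IR remote protocols.
HEADER = [(9000, 4500)]   # leader pulse announcing the start of a frame
BIT0 = (560, 560)         # short mark, short space -> bit value 0
BIT1 = (560, 1690)        # short mark, long space  -> bit value 1

def encode_emitter_id(emitter_id: int, bits: int = 16) -> list:
    """Return a mark/space pulse train carrying the emitter identifier."""
    frame = list(HEADER)
    for i in reversed(range(bits)):           # most significant bit first
        frame.append(BIT1 if (emitter_id >> i) & 1 else BIT0)
    frame.append((560, 0))                    # trailing mark terminates frame
    return frame

pulses = encode_emitter_id(0xA5A5)
print(len(pulses))   # -> 18 entries: header + 16 bits + stop mark
```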
[0081] The controller 235 may control other components of the emitter 120. The controller 235 may provide instructions to one or more of the communications interface module 205, the emitter interaction mechanism 210, the speaker 215, the short-range infrared transmitter 220, the long-range infrared transmitter 225, and the beam encoder module 230. In some embodiments, the controller 235 may include a processor and memory. The controller 235 may include a mobile device processor and static or dynamic memory.
[0082] Example Receiver 125.
[0083] FIG. 3 depicts an example of a receiver 125, according to some embodiments. The receiver 125 may include a communications interface module 305, an infrared receiver 310, a beam decoder 315, a vibrator 320, a speaker 325, Light Emitting Diodes (LEDs) 330, and a controller 335. The receiver 125 may include sensors and/or components not identified explicitly in FIG. 3.
[0084] The communications interface module 305 may facilitate communications between the receiver 125 and the communications device 130. In an embodiment, the communications interface module 305 facilitates pairing between the receiver 125 and the communications device 130. In various embodiments, the communications interface module 305 may be configured as a Bluetooth® pairing module that allows the receiver 125 to be wirelessly coupled to the communications device 130. The communications interface module 305 may also include any wireless or wired network hardware and/or software in various embodiments. The communications interface module 305 may receive instructions from the controller 335.
[0085] The infrared receiver 310 may receive infrared signals. In various embodiments, the infrared receiver 310 may be implemented as an electromagnetic receiver that filters out frequencies other than infrared signals. It is noted the infrared receiver 310 may be replaced or augmented by non-infrared technologies, such as other wireless technologies and/or NFC technologies, without departing from the scope and substance of the inventive concepts herein. The infrared receiver 310 may provide received infrared signals to the beam decoder 315 and/or other modules of the receiver 125.
[0086] The beam decoder 315 may decode received emitter signals. More specifically, the beam decoder 315 may identify an emitter identifier encoded in emitter signals received by the infrared receiver 310. In various embodiments, the beam decoder 315 may receive instructions from the controller 335.
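A matching decode-side sketch follows: the beam decoder's task of recovering the emitter identifier can be modeled as classifying each space duration in the received pulse train. The frame layout mirrors the encoder sketch above, and the timing threshold is illustrative; a real receiver would allow for timing jitter.

```python
# Illustrative decode-side sketch: recover the emitter identifier by
# classifying each space duration. The 1100 us threshold splits the assumed
# 560 us ("0") and 1690 us ("1") spaces from the encoder sketch above.
def decode_emitter_id(pulses: list, bits: int = 16) -> int:
    payload = pulses[1:1 + bits]          # skip the header; ignore stop mark
    emitter_id = 0
    for _mark_us, space_us in payload:
        bit = 1 if space_us > 1100 else 0
        emitter_id = (emitter_id << 1) | bit
    return emitter_id

# Self-contained example: header + 4 data bits (0b1010) + stop mark.
frame = ([(9000, 4500)] +
         [(560, 1690), (560, 560), (560, 1690), (560, 560)] +
         [(560, 0)])
print(bin(decode_emitter_id(frame, bits=4)))  # -> 0b1010
```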
[0087] The vibrator 320 may cause the receiver 125 to physically move. The speaker 325 may make an audible noise. The LEDs 330 may cause all or a part of the receiver 125 to appear to light up. In various embodiments, the vibrator 320, the speaker 325, and the LEDs 330 may receive instructions from the controller 335 to be activated when the infrared receiver 310 has received an emitter signal that indicates a gameplay action by an emitter.
[0088] The controller 335 may control other components of the receiver 125. The controller 335 may provide instructions to one or more of the communications interface module 305, the infrared receiver 310, the beam decoder 315, the vibrator 320, the speaker 325, and the Light Emitting Diodes (LEDs) 330. The controller 335 may include a processor (e.g., a mobile device processor) and memory (e.g., static or dynamic memory).
[0089] Example Communications Device 130.
[0090] FIG. 4 depicts an example of a communications device 130, according to some embodiments. The communications device 130 may include a pairing management module 405, a user interface module 410, an emitter interface module 415, a receiver interface module 420, a gameplay cloud interface module 425, a gameplay memory datastore 430, a wearable optical device interface module 435, a communication device interaction recognition module 440, and a local environment determination module 445. One or more of the pairing management module 405, the user interface module 410, the emitter interface module 415, the receiver interface module 420, the gameplay cloud interface module 425, the gameplay memory datastore 430, the wearable optical device interface module 435, the communication device interaction recognition module 440, and the local environment determination module 445 may include hardware and/or software, in various embodiments. One or more of the pairing management module 405, the user interface module 410, the emitter interface module 415, the receiver interface module 420, the gameplay cloud interface module 425, the gameplay memory datastore 430, the wearable optical device interface module 435, the communication device interaction recognition module 440, and the local environment determination module 445 may be coupled to one another or to components external to the communications device 130.
[0091] The pairing management module 405 may configure the communications device 130 to be paired with other devices. In various embodiments, the pairing management module 405 may include a Bluetooth® pairing module that facilitates wireless pairing with other devices. The pairing management module 405 may also perform other types of pairing to couple the communications device 130 to other devices without departing from the scope and the substance of the inventive concepts herein. In embodiments, the pairing management module 405 may facilitate pairing with one or more of the emitter 120, the receiver 125, and the wearable optical device 135.
[0092] The user interface module 410 may facilitate user interaction with the communications device 130. In some embodiments, the user interface module 410 may configure a display of the communications device 130 to provide one or more user interface elements with which a player can interact. The user interface module 410 may further provide scenes, views, perspectives, and other attributes of gameplay to a user. The user interface module 410 may also facilitate user input to the communications device 130. The user interface module 410 may include video processing hardware and/or software, in various embodiments.
[0093] The emitter interface module 415 may facilitate interfacing with the emitter 120. In various embodiments, the emitter interface module 415 may receive and/or provide data to the emitter 120. The receiver interface module 420 may facilitate interfacing with the receiver 125. In various embodiments, the receiver interface module 420 may receive and/or provide data to the receiver 125.
[0094] The gameplay cloud interface module 425 may facilitate coupling the communications device 130 to the gameplay system 115. In various embodiments, the gameplay cloud interface module 425 may receive and/or provide data to the gameplay system 115. The gameplay cloud interface module 425 may, in various embodiments, provide player information (e.g., player information related to the emitter 120) to the gameplay system 115. The gameplay cloud interface module 425 may incorporate network interface hardware and/or software to facilitate interfacing with the network 110.
[0095] The wearable optical device interface module 435 may facilitate interfacing with the wearable optical device 135. In various embodiments, the wearable optical device interface module 435 may receive and/or provide data to the wearable optical device 135.
[0096] The communication device interaction recognition module 440 may receive user interactions. In some embodiments, the communication device interaction recognition module 440 receives and/or identifies gestures or other user input to the communications device 130. As examples, the communication device interaction recognition module 440 may receive and/or identify inputs for switching weapons or reloading a weapon using radio buttons on the graphical user interface of the communications device 130.
[0097] The local environment determination module 445 may provide data (such as a location of the communications device 130) that is used to recognize parameters of the physical world around the communications device 130. In some embodiments, the local environment determination module 445 includes a GPS receiver that identifies GPS coordinates of the communications device 130. In embodiments, the local environment determination module 445 may include hardware and/or software that interfaces with physical sensors on receiver(s) 125 and allows determination of location based on proximity and/or other physical parameters relative to the receiver(s) 125. In some embodiments, the local environment determination module 445 includes BLE hardware and/or software that provides a location of the communications device 130 based on proximity to locational beacons.
[0098] Example Wearable Optical Device 135.
[0099] FIG. 5 depicts an example of a wearable optical device 135, according to some embodiments. The wearable optical device 135 may include a communications interface module 505, a display rendering module 510, an eye movement recognition module 515, a touch input recognition module 520, a voice input recognition module 525, an emitter interaction recognition module 530, a motion detection module 535, and a controller 540. One or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, the motion detection module 535, and the controller 540 may include hardware and/or software, in various embodiments. One or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, the motion detection module 535, and the controller 540 may be coupled to one another or to components external to the wearable optical device 135.
[0100] The communications interface module 505 may facilitate communications between the wearable optical device 135 and the communications device 130. In an embodiment, the communications interface module 505 facilitates pairing between the wearable optical device 135 and the communications device 130. In various embodiments, the communications interface module 505 may be configured as a Bluetooth® pairing module that allows the wearable optical device 135 to be wirelessly coupled to the communications device 130. The communications interface module 505 may also include any wireless or wired network hardware and/or software in various embodiments. The communications interface module 505 may receive instructions from the controller 540.
[0101] The display rendering module 510 may render virtual objects onto a display of the wearable optical device 135. In some implementations, the display rendering module 510 addresses pixels and/or other portions of a display of the wearable optical device 135 to show virtual objects.
[0102] The eye movement recognition module 515 may track eye movements of a user of the wearable optical device 135. In some implementations, the eye movement recognition module 515 recognizes commands, actions, etc. based on eye movements.
[0103] The touch input recognition module 520 may recognize touch input by a user of the wearable optical device 135. In various implementations, the touch input recognition module 520 recognizes commands, actions, etc. based on touches (e.g., touches to various external surfaces of the wearable optical device 135).
[0104] The voice input recognition module 525 may recognize voice input by a user of the wearable optical device 135. In various implementations, the voice input recognition module 525 recognizes commands, actions, etc. based on natural language commands provided by the user of the wearable optical device 135.
[0105] The emitter interaction recognition module 530 may recognize actions based on touches, motions, etc. of the emitter 120. In some implementations, the emitter interaction recognition module 530 maps such touches and motions to in-game commands and actions.
[0106] The motion detection module 535 may recognize motion (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.) of the wearable optical device 135. In various implementations, the motion detection module 535 recognizes commands, actions, etc. based on how the user of the wearable optical device 135 moves the wearable optical device 135.
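As an illustrative sketch only, pitch and roll of the wearable optical device 135 can be derived from a three-axis accelerometer sample and then thresholded into commands; the axis conventions, the gesture interpretation, and the threshold below are assumptions.

```python
# Illustrative sketch of deriving pitch/roll tilt of the wearable optical
# device from one 3-axis accelerometer reading (in g), the kind of input
# the motion detection module could map to commands.
import math

def tilt_from_accel(ax: float, ay: float, az: float):
    """Return (pitch_deg, roll_deg) from a single accelerometer sample."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def is_nod(pitch_deg: float, threshold_deg: float = 20.0) -> bool:
    # A sustained forward pitch past the threshold could be read as a
    # "confirm" gesture; the threshold is an illustrative choice.
    return pitch_deg < -threshold_deg

print(tilt_from_accel(0.0, 0.0, 1.0))  # level device -> (0.0, 0.0)
```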
[0107] The controller 540 may control other components of the wearable optical device 135. The controller 540 may provide instructions to one or more of the communications interface module 505, the display rendering module 510, the eye movement recognition module 515, the touch input recognition module 520, the voice input recognition module 525, the emitter interaction recognition module 530, and the motion detection module 535. The controller 540 may include a processor (e.g., a mobile device processor) and memory (e.g., static or dynamic memory).
[0108] Example Gameplay System 115.
[0109] FIG. 6 shows an example of a gameplay system 115, according to some embodiments. The gameplay system 115 may include a mobile device interface module 605, an account management module 610, a new game creation module 615, a game code distribution module 620, a gameplay display management module 625, an account datastore 630, a device datastore 635, and a game datastore 640. One or more of the mobile device interface module 605, the account management module 610, the new game creation module 615, the game code distribution module 620, the gameplay display management module 625, the account datastore 630, the device datastore 635, and the game datastore 640 may include hardware and/or software. One or more of the mobile device interface module 605, the account management module 610, the new game creation module 615, the game code distribution module 620, the gameplay display management module 625, the account datastore 630, the device datastore 635, and the game datastore 640 may be coupled to one another or to components external to the gameplay system 115.
[0110] The mobile device interface module 605 may facilitate coupling the gameplay system 115 to the communications device 130. In various embodiments, the mobile device interface module 605 may receive and/or provide data to the communications device 130. The mobile device interface module 605 may incorporate network interface hardware and/or software to facilitate interfacing with the network 110.
[0111] The account management module 610 may manage accounts for players of sensor- based mobile gameplay. The account management module 610 may manage information such as players' points, usernames, and levels. The account management module 610 may also manage players' relationships with each other. For example, the account management module 610 may manage actions specific players have taken with respect to other players. In various embodiments, the account management module 610 may manage player accounts based on information about players stored in the account datastore 630. The account management module 610 may also manage player accounts based on information about devices stored in the device datastore 635.
[0112] The new game creation module 615 may facilitate creation of new games. In various embodiments, the new game creation module 615 may receive instructions to create a new game from a player. The instructions may include identifiers of all players who are invited to play the game. In response to the instructions, the new game creation module 615 may obtain a game instance from the game datastore 640, and place the game instance into memory of the gameplay system 115. The new game creation module 615 may further associate the instance of the game with the identifiers of the players invited to play the game. In various embodiments, the new game creation module 615 may create a game code for the instance of the new game. The new game creation module 615 may provide the game code to the game code distribution module 620.
[0113] The game code distribution module 620 may distribute the game code to all players who have been invited to play the instance of the new game. The game code distribution module 620 may receive from the new game creation module 615 a game code for a new game. In various embodiments, the game code distribution module 620 may further obtain, from the account management module 610 or otherwise, contact information of each of the players who were invited to play the game. The game code distribution module 620 may provide the game code for a new game to the contact information of each of the players who were invited to play the game.
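As a non-limiting editorial illustration of paragraphs [0112] and [0113], the Python sketch below creates a game instance in memory, generates a game code, and provides the code to each invited player's contact information. Every name in it (GameplaySystem, create_game, distribute_code, the six-character code format) is a hypothetical assumption; the disclosure does not prescribe a code format or distribution transport.

    import secrets
    import string

    class GameplaySystem:
        def __init__(self):
            self.active_games = {}   # game instances held in memory, keyed by game code
            self.contacts = {}       # player identifier -> contact information

        def create_game(self, template, invited_players):
            """Instantiate a game from a datastore template; return its game code."""
            code = "".join(secrets.choice(string.ascii_uppercase + string.digits)
                           for _ in range(6))
            self.active_games[code] = {"template": template,
                                       "players": set(invited_players),
                                       "state": "created"}
            return code

        def distribute_code(self, code, send):
            """Send the game code to every player invited to this game instance."""
            for player in self.active_games[code]["players"]:
                send(self.contacts.get(player), code)

    system = GameplaySystem()
    system.contacts = {"alice": "alice@example.com", "bob": "bob@example.com"}
    code = system.create_game({"name": "capture"}, ["alice", "bob"])
    system.distribute_code(code, send=lambda address, c: print(f"to {address}: join code {c}"))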
[0114] The gameplay display management module 625 may manage aspects of gameplay related to a new or existing augmented reality electronic game. In various embodiments, the gameplay display management module 625 may identify actions one player has taken with respect to another player. For example, the gameplay display management module 625 may identify whether a receiver of a second player has registered an in-game action from an emitter of a first player. The gameplay display management module 625 may also identify movements or evasive actions on the part of the second player. In some embodiments, the gameplay display management module 625 may associate points with specific actions by players of the game. The gameplay display management module 625 may also manage lives and levels, and coordinate group gameplay among players of the game. In some embodiments, the gameplay display management module 625 may manage a storyline underlying the gameplay. For example, in a first-person shooting game, the gameplay display management module 625 may manage a storyline associated with players entering into combat with one another. In various embodiments, the gameplay display management module 625 may support messaging between players. In some embodiments, the gameplay display management module 625 may further render scenes, views, perspectives, and other attributes of gameplay on the user interface module 410, shown in FIG. 4.
[0115] In various embodiments, the gameplay display management module 625 manages display of virtual objects in the wearable optical device 135 as part of augmented reality electronic gaming. To this end, the gameplay display management module 625 may select virtual objects for a game player based on one or more factors (a state of gameplay, the location of a game player in the physical world, etc.). The gameplay display management module 625 may identify one or more perspectives a game player is likely to have with respect to a virtual object, and may render those perspectives of the virtual object on the wearable optical device 135 associated with that game player. The gameplay display management module 625 may further receive interactions from the game player with respect to the virtual object. Examples of interactions may include actions using the emitter 120 (activity related to the emitter interaction mechanism 210, etc.), actions using the communication device 130 (gestures or other input on the communications device 130, etc.), and actions using the wearable optical device 135 (eye movements, touch inputs, voice inputs, movement(s), etc.). In various implementations, the gameplay display management module 625 registers actions against virtual objects by modifying the state of the virtual objects. FIG. 7 shows the gameplay display management module 625 in greater detail.
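By way of a hedged illustration of this select-render-interact flow, the sketch below selects a virtual object from gameplay-state and location factors, renders it (a print statement stands in for the wearable display), and registers an interaction by modifying the object's state. All names and the level threshold are editorial assumptions, not the module's actual API.

    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        name: str
        state: str = "intact"

    @dataclass
    class Player:
        location: tuple   # position of the game player in the physical world
        level: int        # one gameplay-state factor

    def select_object(player):
        # Selection may key off gameplay state (here, level) and physical location.
        return VirtualObject("dragon" if player.level >= 5 else "clay_target")

    def run_step(player, interaction=None):
        obj = select_object(player)
        print(f"render {obj.name} for player at {player.location}")  # display stand-in
        if interaction == "hit":      # register the action against the virtual object
            obj.state = "damaged"     # by modifying the object's state
        return obj

    print(run_step(Player(location=(0.0, 0.0), level=7), interaction="hit").state)  # damaged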
[0116] The account datastore 630 may store information related to player accounts. The account datastore 630 may store information such as players' points, usernames, players' relationships with each other, actions specific players have taken with respect to other players, and other information. The device datastore 635 may store information about devices that have participated in gameplay. The game datastore 640 may store game instances. In various embodiments, game instances are implemented as data structures in the game datastore 640 that can be instantiated and placed into memory by the new game creation module 615.
[0117] Example Gameplay Display Management Module 625.
[0118] FIG. 7 depicts an example of a gameplay display management module 625, according to some embodiments. The gameplay display management module 625 may include a gameplay state management module 705, a user location determination module 710, a user perspective selection module 715, a virtual object management module 720, a virtual object perspective module 725, a virtual object rendering module 730, an interaction management module 735, a virtual space mapping module 740, a gameplay state datastore 745, a physical environment mapping datastore 750, a virtual object datastore 755, and a virtual space mapping datastore 760.
[0119] One or more of the gameplay state management module 705, the user location determination module 710, the user perspective selection module 715, the virtual object management module 720, the virtual object perspective module 725, the virtual object rendering module 730, the interaction management module 735, the virtual space mapping module 740, the gameplay state datastore 745, the physical environment mapping datastore 750, the virtual object datastore 755, and the virtual space mapping datastore 760 may include hardware and/or software. One or more of the gameplay state management module 705, the user location determination module 710, the user perspective selection module 715, the virtual object management module 720, the virtual object perspective module 725, the virtual object rendering module 730, the interaction management module 735, the virtual space mapping module 740, the gameplay state datastore 745, the physical environment mapping datastore 750, the virtual object datastore 755, and the virtual space mapping datastore 760 may be coupled to one another or to components external to the gameplay display management module 625.
[0120] The gameplay state management module 705 may manage state(s) of augmented reality electronic gameplay. In various implementations, the gameplay state management module 705 retrieves, modifies, updates, etc. state(s) of augmented reality electronic games in the gameplay state datastore 745. The gameplay state management module 705 may receive instructions from the virtual object rendering module 730 to modify gameplay state(s) based on virtual objects, and/or the interaction management module 735 to modify gameplay state(s) based on interactions with the emitter 120, the communications device 130, and the wearable optical device 135.
[0121] The user location determination module 710 may identify locations of game players. In some embodiments, the user location determination module 710 gathers GPS coordinates of game players from GPS devices on emitter(s) 120, receiver(s) 125, and/or communication device(s) 130. In various embodiments, the user location determination module 710 may determine the locations of game players based on the orientations of emitter(s) 120 and/or communication device(s) 130 in relation to receiver(s) 125 in a geo-fenced region (e.g., by determining the proximity of an emitter 120 or a communication device 130 to a receiver 125 in a geo-fenced region). In some embodiments, the user location determination module 710 receives information from beacons (e.g., BLE beacons) on emitter(s) 120 and/or communication device(s) 130 to determine locations of game players. It is noted the user location determination module 710 may determine the locations of game players using some combination of the techniques herein or using techniques not described explicitly herein.
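A minimal sketch of such source fusion, assuming a simple priority order (GPS, then geo-fence proximity, then strongest BLE beacon) that the disclosure does not itself specify:

    def determine_location(gps_fix=None, geofence_hit=None, beacon_fixes=()):
        if gps_fix is not None:
            return gps_fix                        # (lat, lon) from a GPS device
        if geofence_hit is not None:
            return geofence_hit["region_center"]  # proximity to a receiver in a geo-fenced region
        if beacon_fixes:
            # Fall back to the surveyed position of the strongest BLE beacon.
            return max(beacon_fixes, key=lambda b: b["rssi"])["position"]
        return None

    loc = determine_location(beacon_fixes=[{"rssi": -60, "position": (10.0, 4.0)},
                                           {"rssi": -75, "position": (2.0, 8.0)}])
    print(loc)  # (10.0, 4.0)

The dictionary keys and the fallback order are illustrative only; an implementation might instead weight and combine all available fixes.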
[0122] The user perspective selection module 715 may select one or more perspectives game players may have of the physical world. In various embodiments, the user perspective selection module 715 gathers information about the physical world from the physical environment mapping datastore 750. The user perspective selection module 715 may further identify a game player's distances, orientations, etc. with respect to obstacles, contours, etc. in the game player's physical environment. In various embodiments, the user perspective selection module 715 may provide information about game players' perspectives regarding a physical environment to other modules.

[0123] The virtual object management module 720 may select virtual objects to be displayed on the communications device(s) 130 and/or the wearable optical device(s) 135. In some embodiments, the virtual object management module 720 gathers relevant virtual objects from the virtual object datastore 755 based on gameplay state(s) and/or physical location(s) of game players. As an example of operation, the virtual object management module 720 may gather specific virtual objects for game players who have reached specific game levels, accrued specific amounts of game points, and/or confronted specific virtual characters or virtual items. For instance, the virtual object management module 720 may gather a virtual object containing a representation of a dragon or other mythical creature in an augmented reality electronic fantasy game in which a game player has passed a certain game level. As another example of operation, the virtual object management module 720 may gather virtual objects related to specific physical locations or environments of game players. For instance, in an augmented reality electronic game in which game players are in the desert, the virtual object management module 720 may select clay targets to display on the wearable optical device(s) 135 of game players.
[0124] The virtual object perspective module 725 may select perspectives of virtual objects for rendering. In some embodiments, the selection of perspective may depend on the angles, distances, and orientations of game player(s) from a projection of a virtual object. As an example of operation, the virtual object perspective module 725 may determine that the virtual object management module 720 selected a virtual object that projects an image of a fifty foot dragon approximately twenty feet in the air above two game players. To continue this example, the wearable optical device 135 of the first game player may need to view the right side of the dragon, while the wearable optical device 135 of the second game player may need to view the front of the dragon. The virtual object perspective module 725 may identify, based on properties of the CAD file corresponding to the virtual object, a first perspective
corresponding to the right side of the dragon, and a second perspective corresponding to the front of the dragon. These perspectives may form the basis of rendering, as discussed further herein.
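One way to make the dragon example concrete is to derive the viewed face from the horizontal bearing between the player and the projected object. The sketch below is an editorial assumption (90-degree sectors, counterclockwise-positive angles), not the module's actual geometry:

    import math

    def facing_sector(player_xy, object_xy, object_heading_deg):
        """Return which face of the projected object the player views."""
        dx = player_xy[0] - object_xy[0]
        dy = player_xy[1] - object_xy[1]
        to_player = math.degrees(math.atan2(dy, dx))         # bearing from object to player
        relative = (to_player - object_heading_deg) % 360.0  # angle off the object's nose
        return ["front", "left", "back", "right"][int(((relative + 45.0) % 360.0) // 90.0)]

    # A dragon projected at the origin, facing east (heading 0 degrees):
    print(facing_sector((10.0, 0.0), (0.0, 0.0), 0.0))   # front
    print(facing_sector((0.0, -10.0), (0.0, 0.0), 0.0))  # right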
[0125] The virtual object rendering module 730 may render virtual objects in the communication device(s) 130 and/or the wearable optical device(s) 135. In some
implementations, the virtual object rendering module 730 may receive a virtual object from the virtual object management module 720, and receive a perspective of that virtual object from the virtual object perspective module 725. The virtual object rendering module 730 may instruct relevant displays on the communication device(s) 130 and/or the wearable optical device(s) 135 to display the virtual object from the selected perspective.
[0126] The interaction management module 735 may detect interactions by game players. In some embodiments, the interaction management module 735 may monitor the emitter interaction mechanism 210 on the emitter 120 for actions taken in response to a virtual object. Examples of such actions may correspond to shooting of a gun, swinging of a sword, making a motion corresponding to casting a spell using a wand, throwing a grenade, and picking up an item during a scavenger hunt. In various embodiments, the interaction management module 735 may monitor gestures or other input on the communications device 130. Examples of such actions include switching weapons or reloading a weapon using radio buttons on the graphical user interface of the communications device 130. Further, in some embodiments, the interaction management module 735 may monitor actions on the wearable optical device 135. Examples of such actions include voice commands, touch gestures on hardware on the wearable optical device 135, eye movements that are tracked by the wearable optical device 135, and motions detected by the wearable optical device 135 (pitch, roll, yaw, tilts, translations, any motion observed by an accelerometer or gyroscope, etc.). The interaction management module 735 may provide information related to detected interactions to other modules, such as the gameplay state management module 705.
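The following sketch illustrates, under assumed event shapes, how inputs from the three monitored sources might be normalized into a single stream for the gameplay state management module 705; none of the names are from the disclosure:

    def route_interaction(event, on_interaction):
        handlers = {
            "emitter":  lambda e: on_interaction(("emitter", e["action"])),   # e.g., trigger pull
            "comm":     lambda e: on_interaction(("comm", e["gesture"])),     # e.g., reload button
            "wearable": lambda e: on_interaction(("wearable", e["input"])),   # voice/eye/motion
        }
        handler = handlers.get(event["source"])
        if handler:
            handler(event)

    route_interaction({"source": "emitter", "action": "trigger"},
                      on_interaction=print)  # prints ('emitter', 'trigger')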
[0127] The virtual space mapping module 740 may map models of user interactions and virtual objects into a virtual space. The virtual space may be indexed by a relevant coordinate system (e.g., a Cartesian coordinate system) that specifies the distance and direction models of user interactions and/or virtual objects are projected away from a game player. The maps of virtual spaces may be gathered from the virtual space mapping datastore 760. In some embodiments, the virtual space mapping module 740 may identify one or more areas in a virtual space that correspond to models of user interactions and/or virtual objects. The virtual space mapping module 740 may further determine whether one area in a virtual space overlaps with another area in the virtual space.
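The overlap determination can be illustrated with axis-aligned boxes in a Cartesian virtual space; the box representation is an assumption, since the disclosure requires only areas and an overlap check:

    def overlaps(a, b):
        """a, b: (xmin, ymin, zmin, xmax, ymax, zmax) in the virtual space."""
        return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

    dragon = (45.0, -5.0, 45.0, 55.0, 5.0, 55.0)   # area occupied by a virtual object
    shot   = (49.0, -1.0, 49.0, 51.0, 1.0, 51.0)   # area swept by a player's interaction
    print(overlaps(dragon, shot))                  # True -> the interaction registers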
[0128] The gameplay state datastore 745 may store the various states of one or more augmented reality electronic games. In some embodiments, the gameplay state datastore 745 stores sequences of actions, levels, triggers, conditions, etc. that may form the basis of the states of augmented reality electronic games. The states of augmented reality electronic games may be updated, modified, etc. as the game players progress through the augmented reality electronic games. As an example, the states of augmented reality electronic games may change as the gameplay state management module 705 receives information about user actions with virtual objects, as discussed further herein.
[0129] The physical environment mapping datastore 750 may store files that have information related to one or more physical environments. In some embodiments, the files provide information about what the physical world around game players looks like. As an example, the files may provide information about open areas, obstacles, and contours of physical items within a particular physical environment. In some embodiments, the physical environment mapping datastore 750 gathers relevant geographical information from
geographical databases, such as map databases, databases of building plans, etc. In various embodiments, the physical environment mapping datastore 750 gathers geographical information about game players' environments from meshes, such as predetermined meshes that provide information about open areas, obstacles, and contours of physical items within a particular physical environment as well as meshes generated using cameras on wearable optical device(s) 135.
[0130] The virtual object datastore 755 may store files that represent virtual objects. In various embodiments, the virtual object datastore 755 stores libraries of CAD files (e.g., Unity® files) that represent virtual objects. The CAD files may further specify how virtual objects appear from various perspectives, including various angles, distances, and orientations. In some embodiments, the virtual object datastore 755 obtains the CAD files from external sources, such as third-party illustrators and/or publishers. Further, representations of virtual objects in the virtual object datastore 755 may relate to a particular augmented reality electronic game or genre of augmented reality electronic games (e.g., the virtual object datastore 755 may store representations of fantasy creatures for an augmented reality electronic game having fantasy themes, representations of combat vehicles for an augmented reality electronic game having a combat theme, representations of inanimate objects for an augmented reality electronic game implementing a scavenger hunt, etc.).
[0131] The virtual space mapping datastore 760 may store maps of the virtual spaces used to project models of user interactions and virtual objects. In various implementations, the virtual spaces in the virtual space mapping datastore 760 are indexed by a relevant coordinate system (e.g., a Cartesian coordinate system) that specifies the distance and direction models of user interactions and/or virtual objects are projected away from a game player. As an example, the virtual space mapping datastore 760 may store a map of a virtual space that represents all items within the field of view of a game player. In this example, the map may contain a virtual object of a dragon that is represented about fifty feet directly East of the game player at a height of fifty feet. The map may further contain objects of user interactions with the dragon, such as objects that represent a specified number of shots (and the directions of such shots) the game player has taken at the object using an emitter 120.
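A hypothetical map entry matching the dragon example above might look as follows (x east, y up, distances in feet; the schema is an editorial assumption):

    virtual_space_map = {
        "objects": {
            "dragon-1": {"position": (50.0, 50.0, 0.0),   # about fifty feet east, fifty feet up
                         "extent": (50.0, 20.0, 20.0)},   # a roughly dragon-sized bounding box
        },
        "interactions": [
            {"type": "shot", "origin": (0.0, 5.0, 0.0),   # emitter held at chest height
             "direction": (1.0, 0.9, 0.0)},               # aimed east and upward
        ],
    }
    print(len(virtual_space_map["interactions"]), "shot(s) recorded")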
[0132] Example Flowcharts of Methods of Operation.
[0133] FIG. 8 depicts a flowchart of an example of a method 800 for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments. The method 800 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 800 may be optional, and that the method 800 need not include all of the operations shown in FIG. 8.
[0134] At an operation 805, the user location determination module 710 may determine a location of a game player of an augmented reality electronic game. In various
implementations, the user location determination module 710 gathers user location information from a GPS transmitter on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or other techniques described herein.
[0135] At an operation 810, the virtual object management module 720 may identify a virtual in-game object to be rendered in a display used to display at least a portion of the augmented reality electronic game. In various embodiments, the virtual object management module 720 selects virtual in-game objects for the augmented reality electronic game based on one or more factors, such as locations of game players and game states of the augmented reality electronic game. As an example, the virtual object management module 720 may select virtual in-game objects for game players based on specific locations of the game players in the physical world. As another example, the virtual object management module 720 may select virtual in-game objects for game players based on levels/points/etc. the game players have achieved in the augmented reality electronic game.
[0136] At an operation 815, the virtual object perspective module 725 may identify a game player perspective of a game player in relation to the virtual in-game object. In some implementations, the virtual object perspective module 725 gathers angles, orientations, distances, etc. from the game player to a projection of the virtual object. The virtual object perspective module 725 may further evaluate, based on parameters of the virtual object, how the virtual object would appear to the display of the game player if the virtual object were projected into the physical environment around the game player.
[0137] At an operation 820, the virtual object rendering module 730 may render the virtual in-game object in the display in accordance with the game player perspective. More particularly, the virtual object rendering module 730 may instruct the display to display the virtual in-game object in accordance with the user perspective identified by the virtual object perspective module 725.
[0138] At an operation 825, the interaction management module 735 may receive user interaction with the virtual in-game object in the augmented reality electronic game.
Interactions may include input to the emitter 120, the communication device 130, and/or the wearable optical device 135. The interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified.
[0139] At an operation 830, the virtual space mapping module 740 may identify a first area in a virtual space corresponding to the virtual in-game object. At an operation 835, the virtual space mapping module 740 may identify a second area in the virtual space corresponding to the user interaction. At an operation 840, the virtual space mapping module 740 may determine whether the second area overlaps the first area.
[0140] At an operation 845, the gameplay state management module 705 may modify a state of the virtual in-game object based on the user interaction. The gameplay state management module 705 may provide instructions to modify the virtual in-game object to the other modules of the gameplay display management module 625. The modified state of the virtual in-game object may be stored in the virtual object datastore 755.

[0141] At an operation 850, the virtual object rendering module 730 may render a modified virtual in-game object on the display based on the modified state. The virtual object rendering module 730 may also instruct the display to display the modified virtual in-game object in accordance with the modifications.
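Operations 830 through 850 can be summarized in a few lines; the sketch below passes the overlap determination in as a callable and uses placeholder state values, since the disclosure does not fix either detail:

    def register_hit(hit_test, obj, render):
        if hit_test():                # operations 830-840: do the areas overlap?
            obj["state"] = "hit"      # operation 845: modify the object's state
            render(obj)               # operation 850: re-render from the modified state
        return obj

    register_hit(hit_test=lambda: True,
                 obj={"name": "dragon-1", "state": "intact"},
                 render=lambda o: print("re-render", o["name"], "as", o["state"]))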
[0142] FIG. 9 depicts a flowchart of an example of a method 900 for rendering a virtual object in an augmented reality electronic game, according to some embodiments. The method 900 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 900 may be optional, and that the method 900 need not include all of the operations shown in FIG. 9.
[0143] At an operation 905, the user location determination module 710 may identify a physical location of a game player of an augmented reality electronic game. In various implementations, the user location determination module 710 gathers user location information from a GPS transmitter on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or other techniques described herein.
[0144] At an operation 910, the gameplay state management module 705 may identify a gameplay state of the augmented reality electronic game. More particularly, the gameplay state management module 705 may identify relevant gameplay levels, points, etc. associated with the gameplay state of the augmented reality electronic game.
[0145] At an operation 915, the virtual object management module 720 may identify in the virtual object datastore 755 a virtual in-game object associated with the physical location or the gameplay state. More particularly, the virtual object management module 720 may select virtual in-game objects that gameplay rules indicate may be projected at the identified location and/or in response to the identified gameplay state of the augmented reality electronic game. At an operation 920, the virtual object management module 720 may gather the virtual in-game object from the virtual object datastore 755.
[0146] FIG. 10 depicts a flowchart of an example of a method 1000 for modifying a state of a virtual object in an augmented reality electronic game, according to some embodiments. The method 1000 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 1000 may be optional, and that the method 1000 need not include all of the operations shown in FIG. 10.
[0147] At an operation 1005, the virtual object management module 720 may identify a virtual in-game object displayed in accordance with an augmented reality electronic game. More particularly, in some embodiments, the virtual object management module 720 may receive from the gameplay state management module 705 identifiers of virtual in-game objects that have been displayed in an augmented reality electronic game. For instance, the virtual object management module 720 may receive from the gameplay state management module 705 an identifier of a dragon or other virtual object displayed in the communications device 130 and/or the wearable optical device 135.
[0148] At an operation 1010, the interaction management module 735 may receive user interactions in the augmented reality electronic game. Interactions may include input to the emitter 120, the communication device 130, and/or the wearable optical device 135. The interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified.
[0149] At an operation 1015, the virtual object management module 720 may associate the user interactions with one or more parameters of the virtual in-game object. More particularly, the virtual object management module 720 may determine the extent to which these user interactions correspond to changes in the virtual in-game object. As an example, if a user uses an emitter 120 to "shoot" at a virtual in-game object that represents a dragon, the virtual object management module 720 may determine where the shots from the emitter 120 would project on the dragon.
[0150] At an operation 1020, the virtual object management module 720 may modify the one or more parameters of the virtual in-game object. To continue the foregoing example, if a user uses an emitter 120 to "shoot" at a virtual in-game object that represents a dragon, the virtual object management module 720 may modify portions of an image that represents where the shot would have projected on the dragon. At an operation 1025, the virtual object management module 720 may store the virtual in-game object with the modified parameters.
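A sketch of operations 1015 through 1025, under the assumption that the object's parameters are per-region damage counters and that a shot maps to the nearest region; the region grid and hit math are illustrative only:

    def apply_shot(obj, shot_point):
        # Operation 1015: associate the interaction with a region of the object.
        regions = obj["regions"]
        nearest = min(regions, key=lambda r: sum((a - b) ** 2 for a, b in
                                                 zip(regions[r]["center"], shot_point)))
        regions[nearest]["damage"] += 1   # operation 1020: modify the parameters
        return obj                        # operation 1025: store the modified object

    dragon = {"regions": {"head": {"center": (50.0, 60.0, 0.0), "damage": 0},
                          "body": {"center": (50.0, 45.0, 0.0), "damage": 0}}}
    print(apply_shot(dragon, (50.0, 58.0, 1.0))["regions"]["head"])
    # {'center': (50.0, 60.0, 0.0), 'damage': 1}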
[0152] FIG. 11 depicts a flowchart of an example of a method 1100 for facilitating an augmented reality electronic game having one or more virtual objects, according to some embodiments. The method 1100 is discussed in conjunction with the gameplay display management module 625, shown in FIG. 7, and discussed further herein. It is noted that at least some of the operations of the method 1100 may be optional, and that the method 1100 need not include all of the operations shown in FIG. 11.
[0153] At an operation 1105, the user location determination module 710 may determine a location of a game player of an augmented reality electronic shooting game. In various implementations, the user location determination module 710 gathers user location information from a GPS transmitter on the communication device 130, from proximity data between the emitter 120 and the receiver 125, from BLE beacons coupled to the emitter 120, the receiver 125, the communications device 130, and/or the wearable optical device 135, and/or other techniques described herein.
[0154] At an operation 1110, the virtual object management module 720 may identify a virtual in-game object corresponding to virtual shooting targets to be rendered in a display used to display at least a portion of the augmented reality electronic game. In various embodiments, the virtual object management module 720 selects virtual in-game objects of virtual shooting targets for the augmented reality electronic game based on one or more factors, such as locations of game players and game states of the augmented reality electronic game. As an example, the virtual object management module 720 may select virtual in-game objects for game players based on specific locations of the game players in the physical world. As another example, the virtual object management module 720 may select virtual in-game objects for game players based on levels/points/etc. the game players have achieved in the augmented reality electronic game.

[0155] At an operation 1115, the virtual object perspective module 725 may identify a game player perspective of a game player in relation to the virtual shooting targets. In some implementations, the virtual object perspective module 725 gathers angles, orientations, distances, etc. from the game player to a projection of the virtual object. The virtual object perspective module 725 may further evaluate, based on parameters of the virtual object, how the virtual object would appear to the display of the game player if the virtual object were projected into the physical environment around the game player.
[0156] At an operation 1120, the virtual object rendering module 730 may render the virtual shooting targets in the display in accordance with the user perspective. More particularly, the virtual object rendering module 730 may instruct the display to display the virtual in-game object in accordance with the user perspective identified by the virtual object perspective module 725.
[0157] At an operation 1125, the interaction management module 735 may receive through the emitter 120 user interaction with the virtual shooting targets in the augmented reality electronic game. For instance, the interaction management module 735 may receive an indication that a game player squeezed a trigger of the emitter 120 to shoot at the virtual shooting targets. The interaction management module 735 may provide this input to the gameplay state management module 705, so that the state of the augmented reality electronic game may be appropriately updated and/or modified (e.g., so that the virtual shooting targets can register hits against them).
[0158] At an operation 1130, the gameplay state management module 705 may modify an appearance of the virtual shooting targets based on the user interaction. The gameplay state management module 705 may provide instructions to modify the virtual in-game object to the other modules of the gameplay display management module 625. The modified state of the virtual in-game object may be stored in the virtual object datastore 755.
[0159] At an operation 1135, the virtual object rendering module 730 may render a modified virtual shooting target on the display based on the modified appearance. The virtual object rendering module 730 may also instruct the display to display the modified virtual in-game object in accordance with the modifications. As an example, the virtual object rendering module 730 may render virtual shooting targets that have been hit or have exploded as a result of being shot by the game player in the augmented reality electronic game.

[0160] FIG. 12 depicts an example of a digital device 1200, according to some
embodiments. The digital device 1200 comprises a processor 1205, a memory system 1210, a storage system 1215, a communication network interface 1220, an Input/output (I/O) interface 1225, a display interface 1230, and a bus 1235. The bus 1235 may be communicatively coupled to the processor 1205, the memory system 1210, the storage system 1215, the communication network interface 1220, the I/O interface 1225, and the display interface 1230.
[0161] In some embodiments, the processor 1205 comprises circuitry or any processor capable of processing the executable instructions. The memory system 1210 comprises any memory configured to store data. Some examples of the memory system 1210 are storage devices, such as RAM or ROM. The memory system 1210 may comprise the RAM cache. In various embodiments, data is stored within the memory system 1210. The data within the memory system 1210 may be cleared or ultimately transferred to the storage system 1215.
[0162] The storage system 1215 comprises any storage configured to retrieve and store data. Some examples of the storage system 1215 are flash drives, hard drives, optical drives, and/or magnetic tape. In some embodiments, the digital device 1200 includes a memory system 1210 in the form of RAM and a storage system 1215 in the form of flash memory. Both the memory system 1210 and the storage system 1215 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 1205.
[0163] The communication network interface (com. network interface) 1220 may be coupled to a data network. The communication network interface 1220 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example. The communication network interface 1220 may also support wireless communication (e.g., 802.11 a/b/g/n, WiMAX, LTE, 3G, 2G). It will be apparent to those skilled in the art that the communication network interface 1220 may support many wired and wireless standards.
[0164] The optional input/output (I/O) interface 1225 is any device that receives input from the user and outputs data. The display interface 1230 is any device that may be configured to output graphics and data to a display. In one example, the display interface 1230 is a graphics adapter.

[0165] It will be appreciated by those skilled in the art that the hardware elements of the digital device 1200 are not limited to those depicted in FIG. 12. A digital device 1200 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein. In one example, encoding and/or decoding may be performed by the processor 1205 and/or a coprocessor located on a GPU.
[0166] FIG. 13A depicts an example of an augmented reality gaming system 1300, according to some embodiments. The augmented reality gaming system 1300 may include a peripheral system 1305, a communications device 1310, and a gameplay system 1315.
[0167] The peripheral system 1305 may include any peripheral system, such as a receiver or an emitter, as discussed herein. In some embodiments, the peripheral system 1305 may correspond to one or more of the emitter 120 and/or the receiver 125, shown in FIG. 1. As such, the peripheral system 1305 may include a transmitter, a receiver, a lens, and other hardware to facilitate sensor-based mobile gaming. The peripheral system 1305 may be paired to the communications device 1310, as discussed herein. The peripheral system 1305 may be coupled to the communications device 1310. In some embodiments, the peripheral system 1305 is coupled to the communications device 1310 using a Bluetooth connection or other wireless connection.
[0168] The communications device 1310 may include any digital device, an example of which is the digital device 1200 shown in FIG. 12. In various embodiments, the
communications device 1310 may correspond to the communications device 130, shown in FIG. 1. In some embodiments, the communications device 1310 may include a game application 1320, a peripheral API 1325, a platform API 1330, an API support layer 1335, and a mobile operating system 1340.
[0169] In various embodiments, the game application 1320 may allow a user to engage in sensor-based mobile gaming as discussed herein. More specifically, the game application 1320 may include gameplay modules to facilitate sensor-based mobile gaming. In various embodiments, the game application 1320 may include modules corresponding to one or more of the user interface module 410 and the gameplay memory datastore 430, shown in FIG. 4. The game application 1320 may be implemented in any convenient format, including, in various embodiments, an iOS® mobile application or an Android® mobile application.

[0170] The peripheral API 1325 may support coupling the communications device 1310 to the peripheral system 1305. In some embodiments, the peripheral API 1325 is implemented as a Bluetooth or other wireless interface to the peripheral system 1305. In various embodiments, the peripheral API 1325 may correspond to some or all of the emitter interface module 415 and the receiver interface module 420, shown in FIG. 4. The platform API 1330 may support coupling the communications device 1310 to the gameplay system 1315. The platform API 1330 may be implemented as a bus, a network interface, or other interface. In various embodiments, the platform API 1330 may correspond to some or all of the gameplay cloud interface module 425, shown in FIG. 4.
[0171] The API support layer 1335 may support function calls used by the game application 1320, the peripheral API 1325, and the platform API 1330. In some embodiments, the API support layer 1335 may facilitate receiving and processing user interface inputs, such as gestures, swipes, and clicks. In an implementation, the API support layer 1335 comprises a Cocoa Touch® layer. It is noted the API support layer 1335 may also comprise Android API support layer(s) or other support layer(s) without departing from the scope and substance of the inventive concepts described herein. The mobile operating system 1340 may comprise an operating system of the communications device 1310. In various embodiments, the mobile operating system 1340 may comprise an iOS® operating system or Android® operating system. It is noted the mobile operating system 1340 may comprise other forms of operating systems in some embodiments.
[0172] The gameplay system 1315 may support sensor-based gaming by a user of the communications device 1310, as discussed herein. In some embodiments, the gameplay system 1315 may be coupled to the communications device 1310 using a network connection, such as an Internet connection. The network connection may comprise a wireless network connection. The gameplay system 1315 may also be coupled to the communications device 1310 over other convenient connections as known in the art.
[0173] FIG. 13B depicts an example of an augmented reality gaming system 1300, according to some embodiments. The augmented reality gaming system 1300 may include a communications device 1310, a gameplay system 1315, and a user 1370.

[0174] The communications device 1310 may be coupled to the gameplay system 1315. The communications device 1310 may correspond to the communications device 1310 in FIG. 13A.
[0175] The gameplay system 1315 may correspond to the gameplay system 1315 in FIG. 13A. The gameplay system 1315 may include a web service module 1345, a web UI module 1350, a Ruby on Rails support module 1355, a cloud-based Platform as a Service (PaaS) module 1360, and a cloud-based storage module 1365. In some embodiments, the web service module 1345 may be coupled to the communications device 1310. The web service module 1345 may provide sensor-based mobile gaming services, as described herein, as a web service to the communications device 1310. The web UI module 1350 may be coupled to the user 1370. The web UI module 1350 may provide an online portal to access an account associated with the user 1370. The Ruby on Rails support module 1355 may allow the web service module 1345 and the web UI module 1350 to access the cloud-based PaaS module 1360 and the cloud-based storage module 1365. The cloud-based PaaS module 1360 may provide PaaS to other modules. The cloud-based storage module 1365 may provide cloud-based storage to the other modules.
[0176] The user 1370 may be any player that utilizes the system. The user 1370 may represent a player seeking to access a web portal associated with sensor-based mobile gaming, as discussed herein. The user 1370 may correspond to the player of the first communications device 130-1 or the Nth communications device 130-N, shown in FIG. 1.
[0177] FIG. 14 depicts an example of an augmented reality gaming environment 1400, according to some embodiments. In the augmented reality gaming environment 1400, the virtual in-game object 140 is projected into the wearable optical device 135 of a game player. The game player uses an emitter 120 to interact with the virtual in-game object 140 (e.g., by taking shots at the virtual in-game object 140 using an emitter modeled as a gun). A virtual space 1405 is used to compare projections of the virtual in-game object 140 with locations of in-game interactions. As an example, the virtual space 1405 may be used to compare whether the game player is shooting the emitter in the right direction and sufficiently steadily to register a hit on the virtual in-game object 140. A downward stroke may be used as part of the emitter interaction mechanism in this example. The display 135 may show the virtual in-game object 140 as well as other notifications 1410 related to gameplay.

[0178] FIG. 15 depicts an example of an emitter 120 and a communications device 130 in an augmented reality gaming environment 1500, according to some embodiments. In this example, game information about interactions with a virtual in-game object may be relayed to the communications device 130. FIG. 16 depicts an example of a display 135 and a communications device 130 in an augmented reality gaming environment 1600, according to some embodiments. In this example, both the display 135 and the communications device 130 may display the virtual in-game object.
[0179] FIG. 17 depicts an example screen 1700 of an augmented reality electronic game, according to some embodiments. The screen 1700 may include a new operation button 1705 that allows game players to start a new game, a join operation button 1710 that allows game players to join an existing game (e.g., a game that the game players or other game players have previously created), a missions button 1715 that allows access to missions that have been undertaken, and an active ops button 1720 that allows access to active operations underway in the augmented reality electronic game.
[0180] FIG. 18 depicts an example screen 1800 of an augmented reality electronic game, according to some embodiments. The screen 1800 includes one or more statuses, such as a first status 1805 that identifies whether an emitter is active/coupled, and a second status 1810 that identifies whether a receiver is active/coupled. The screen 1800 further includes a first button 1815 that provides the ability to scan QR codes, and a second button 1820 that allows a user to reset the user's account.
[0181] FIG. 19 depicts an example screen 1900 of an augmented reality electronic game, according to some embodiments. The screen 1900 may correspond to a new operations screen (e.g., a screen that is displayed when the new operation button 1705 has been selected). The screen 1900 may include an operation name box 1905, an operation type box 1910, an operation length box 1915, and a create button 1920.
[0182] FIG. 20 depicts an example screen 2000 of an augmented reality electronic game, according to some embodiments. The screen 2000 may correspond to a join operations screen (e.g., a screen that is displayed when the join operation button 1710 has been selected). The screen 2000 may include an operation join code box 2005, a scan QR code button 2010, and a join game button 2015.

[0183] The above-described functions and components may be comprised of instructions that are stored on a storage medium such as a computer readable medium. The instructions may be retrieved and executed by a processor. Some examples of instructions are software, program code, and firmware. Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers. The instructions are operational when executed by the processor to direct the processor to operate in accord with some embodiments. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
[0184] For purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the description. It will be apparent, however, to one skilled in the art that embodiments of the disclosure can be practiced without these specific details. In some instances, modules, structures, processes, features, and devices are shown in block diagram form in order to avoid obscuring the description. In other instances, functional block diagrams and flow diagrams are shown to represent data and logic flows. The components of block diagrams and flow diagrams (e.g., modules, blocks, structures, devices, features, etc.) may be variously combined, separated, removed, reordered, and replaced in a manner other than as expressly described and depicted herein.
[0185] Reference in this specification to "one embodiment", "an embodiment", "some embodiments", "various embodiments", "certain embodiments", "other embodiments", "one series of embodiments", or the like means that a particular feature, design, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of, for example, the phrase "in one embodiment" or "in an embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, whether or not there is express reference to an "embodiment" or the like, various features are described, which may be variously combined and included in some embodiments, but also variously omitted in other embodiments.
Similarly, various features are described that may be preferences or requirements for some embodiments, but not other embodiments.
[0186] As used herein, module may be hardware, software, or a combination of both. As used herein, a module may further include firmware.

[0187] The language used herein has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope, which is set forth in the following claims.

Claims

CLAIMS

What is claimed is:
1. A system comprising:
a communications device configured to support gameplay;
a display coupled to the communications device, the display configured to display a view of a physical environment proximate to the display, and to display at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment;
an emitter coupled to the communications device, the emitter configured to provide an emitter interaction signal related to a first in-game interaction with the virtual in-game object; and
a gameplay server coupled to the communications device, the gameplay server configured to:
identify a first area in a virtual space corresponding to the first location;
identify a second area in the virtual space corresponding to the first in-game interaction;
determine whether the second area overlaps with the first area in the virtual space; and
change a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
2. The system of claim 1, wherein the emitter interaction signal is based on one or more of: a squeeze of a trigger of a toy gun incorporating the emitter, a specified motion of a toy sword incorporating the emitter, and a specified motion of a toy wand incorporating the emitter.
3. The system of claim 1, further comprising providing a communications device interaction signal from the communications device, the communications device interaction signal being related to a second in-game interaction with the virtual in-game object.
4. The system of claim 3, wherein the communications device interaction signal is based on one or more gestures into a user interface of the communications device.
5. The system of claim 1, wherein the display is incorporated into a wearable optical device, and the display is configured to provide a display interaction signal, the display interaction signal being related to a second in-game interaction with the virtual in-game object.
6. The system of claim 5, wherein the display interaction signal is based on one or more of: eye motion of a user of the wearable optical device, touch interaction with the wearable optical device, and voice input into the wearable optical device.
7. The system of claim 5, wherein the display interaction signal is based on a tilt or a translation of the wearable optical device.
8. The system of claim 1, wherein the gameplay server is configured to select the virtual in-game object for the display based on a location of the display.
9. The system of claim 1, wherein the gameplay server is configured to select the virtual in-game object for the display based on a gameplay state of the gameplay.
10. The system of claim 1, wherein the display is incorporated into a head mounted device (HMD) or a heads-up display (HUD).
11. The system of claim 1, wherein the display is incorporated into the communications device.
12. A method comprising:
supporting gameplay on a communications device, a display, and an emitter;
displaying a view of a physical environment proximate to the display;
displaying at a first location a virtual in-game object associated with gameplay, the first location being over the view of the physical environment;
providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object;
identifying a first area in a virtual space corresponding to the first location;
identifying a second area in the virtual space corresponding to the first in-game interaction;
determining whether the second area overlaps with the first area in the virtual space; and
changing a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
13. The method of claim 12, wherein the emitter interaction signal is based on one or more of: a squeeze of a trigger of a toy gun incorporating the emitter, a specified motion of a toy sword incorporating the emitter, and a specified motion of a toy wand incorporating the emitter.
14. The method of claim 12, further comprising providing a communications device interaction signal from the communications device, the communications device interaction signal being related to a second in-game interaction with the virtual in-game object.
15. The method of claim 14, wherein the communications device interaction signal is based on one or more gestures into a user interface of the communications device.
16. The method of claim 12, wherein the display is incorporated into a wearable optical device, and the display is configured to provide a display interaction signal, the display interaction signal being related to a second in-game interaction with the virtual in-game object.
17. The method of claim 16, wherein the display interaction signal is based on one or more of: eye motion of a user of the wearable optical device, touch interaction with the wearable optical device, and voice input into the wearable optical device.
18. The method of claim 16, wherein the display interaction signal is based on a tilt or a translation of the wearable optical device.
19. The method of claim 12, wherein the gameplay server is configured to select the virtual in-game object for the display based on a location of the display.
20. The method of claim 12, wherein the gameplay server is configured to select the virtual in-game object for the display based on a gameplay state of the gameplay.
21. The method of claim 12, wherein the display is incorporated into a head mounted device (HMD) or a heads-up display (HUD).
22. The method of claim 12, wherein the display is incorporated into the communications device.
23. A system comprising:
a communications device configured to support gameplay;
means for displaying a view of a physical environment proximate to the display, and for displaying at a first location a virtual in-game object associated with the gameplay, the first location being over the view of the physical environment;
means for providing an emitter interaction signal related to a first in-game interaction with the virtual in-game object; and
a gameplay server coupled to the communications device, the gameplay server configured to:
identify a first area in a virtual space corresponding to the first location;
identify a second area in the virtual space corresponding to the first in-game interaction based on the emitter interaction signal;
determine whether the second area overlaps with the first area in the virtual space; and
change a state of the virtual in-game object if the second area overlaps with the first area in the virtual space.
PCT/US2015/058672 2014-10-31 2015-11-02 Interactive gaming using wearable optical devices WO2016070192A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462073852P 2014-10-31 2014-10-31
US62/073,852 2014-10-31
US201562131121P 2015-03-10 2015-03-10
US62/131,121 2015-03-10

Publications (1)

Publication Number Publication Date
WO2016070192A1 true WO2016070192A1 (en) 2016-05-06

Family

ID=55851543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/058672 WO2016070192A1 (en) 2014-10-31 2015-11-02 Interactive gaming using wearable optical devices

Country Status (2)

Country Link
US (1) US20160121211A1 (en)
WO (1) WO2016070192A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
WO2015095507A1 (en) * 2013-12-18 2015-06-25 Joseph Schuman Location-based system for sharing augmented reality content
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US20180188923A1 (en) * 2016-12-30 2018-07-05 Cirque Corporation Arbitrary control mapping of input device
US10146300B2 (en) * 2017-01-25 2018-12-04 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Emitting a visual indicator from the position of an object in a simulated reality emulation
JP6878350B2 (en) * 2018-05-01 2021-05-26 グリー株式会社 Game processing program, game processing method, and game processing device


Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6042490A (en) * 1996-07-26 2000-03-28 Lenhart; Christopher W. Systems and methods of playing games in three dimensions
US6530841B2 (en) * 2001-06-26 2003-03-11 Cutlass, Inc. Electronic tag game
US20030224855A1 (en) * 2002-05-31 2003-12-04 Robert Cunningham Optimizing location-based mobile gaming applications
US20040058732A1 (en) * 2002-06-14 2004-03-25 Piccionelli Gregory A. Method, system and apparatus for location based gaming
US7991220B2 (en) * 2004-09-01 2011-08-02 Sony Computer Entertainment Inc. Augmented reality game system using identification information to display a virtual object in association with a position of a real object
US7435179B1 (en) * 2004-11-15 2008-10-14 Sprint Spectrum L.P. Location-based authorization of gaming action in wireless communication gaming devices
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
US9118428B2 (en) * 2009-11-04 2015-08-25 At&T Intellectual Property I, L.P. Geographic advertising using a scalable wireless geocast protocol
US8506377B2 (en) * 2010-09-30 2013-08-13 Disney Enterprises, Inc. Systems and methods to provide augmented reality for a board game
US20120122570A1 (en) * 2010-11-16 2012-05-17 David Michael Baronoff Augmented reality gaming experience
US8401343B2 (en) * 2011-03-27 2013-03-19 Edwin Braun System and method for defining an augmented reality character in computer generated virtual reality using coded stickers
US20130017884A1 (en) * 2011-07-13 2013-01-17 Igt Methods and apparatus for providing secure logon to a gaming machine using a mobile device
US9539498B1 (en) * 2012-07-31 2017-01-10 Niantic, Inc. Mapping real world actions to a virtual world associated with a location-based game
US20150097719A1 (en) * 2013-10-03 2015-04-09 Sulon Technologies Inc. System and method for active reference positioning in an augmented reality environment

Also Published As

Publication number Publication date
US20160121211A1 (en) 2016-05-05

Similar Documents

Publication    Title
US20160263477A1 (en) Systems and methods for interactive gaming with non-player engagement
US11948260B1 (en) Streaming mixed-reality environments between multiple devices
US20160121211A1 (en) Interactive gaming using wearable optical devices
US10740951B2 (en) Foveal adaptation of particles and simulation models in a foveated rendering system
US20210252398A1 (en) Method and system for directing user attention to a location based game play companion application
US10380798B2 (en) Projectile object rendering for a virtual reality spectator
US9654613B2 (en) Dual-mode communication devices and methods for arena gaming
WO2022237275A1 (en) Information processing method and apparatus and terminal device
US20180357794A1 (en) Optimized deferred lighting in a foveated rendering system
EP3634593B1 (en) Optimized deferred lighting and foveal adaptation of particles and simulation models in a foveated rendering system
US10916061B2 (en) Systems and methods to synchronize real-world motion of physical objects with presentation of virtual content
JP2023126292A (en) Information display method, device, instrument, and program
JP2023036743A (en) Method and system for directing user attention to a location based game play companion application
US20230298242A1 (en) Notification application for a computing device
US9656172B2 (en) Unlocking of virtual content through geo-location
KR20230042517A (en) Contact information display method, apparatus and electronic device, computer-readable storage medium, and computer program product
US20190038975A1 (en) Systems and methods for sensor-based mobile gaming
JP2023164687A (en) Virtual object control method and apparatus, and computer device and storage medium
JP6959267B2 (en) Generate challenges using a location-based gameplay companion application
TWI807732B (en) Non-transitory computer-readable storage medium for interactable augmented and virtual reality experience
US10839607B2 (en) Systems and methods to provide views of a virtual space

Legal Events

Date Code Title Description

121   Ep: the epo has been informed by wipo that ep was designated in this application
      Ref document number: 15854197; Country of ref document: EP; Kind code of ref document: A1

NENP  Non-entry into the national phase
      Ref country code: DE

122   Ep: pct application non-entry in european phase
      Ref document number: 15854197; Country of ref document: EP; Kind code of ref document: A1