US20070229397A1 - Virtual reality system - Google Patents

Virtual reality system

Info

Publication number
US20070229397A1
US20070229397A1 (Application No. US 11/753,910)
Authority
US
United States
Prior art keywords
user
images
image
set forth
playback system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/753,910
Inventor
Robert Sefton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volo LLC
Original Assignee
Volo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. https://patents.darts-ip.com/?family=34911888&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20070229397(A1) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Volo LLC filed Critical Volo LLC
Priority to US11/753,910 priority Critical patent/US20070229397A1/en
Assigned to VOLO, LLC reassignment VOLO, LLC CORRECTING RECORDATION FORM COVER SHEET WITH CORRECT SERIAL NUMBER R/F 019345/0359 Assignors: SEFTON, ROBERT T.
Publication of US20070229397A1 publication Critical patent/US20070229397A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2071/0644Displaying moving images of recorded environment, e.g. virtual environment with display speed of moving landscape controlled by the user's performance
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0658Position or arrangement of display
    • A63B2071/0661Position or arrangement of display arranged on the user
    • A63B2071/0666Position or arrangement of display arranged on the user worn on the head or face, e.g. combined with goggles or glasses
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B22/00Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
    • A63B22/06Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement
    • A63B22/0605Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement performing a circular movement, e.g. ergometers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/30Speed
    • A63B2220/34Angular speed
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50Wireless data transmission, e.g. by radio transmitters or telemetry

Definitions

  • the subject invention relates generally to virtual reality (VR) systems.
  • the invention relates specifically to VR systems coupled with an exercise apparatus.
  • the '146 patent discloses a VR system having an image playback system for storing a plurality of images having a 360-degree field-of-view.
  • the images are previously recorded using a plurality of video cameras and electronically “stitched” together to create the images with the 360-degree field-of-view.
  • the playback system is operatively connected to a display and a directional sensor.
  • the display and directional sensor are mounted to a helmet that is worn by a user.
  • the display shows a portion of each image based on the position of the helmet, as measured by the directional sensor.
  • the plurality of images are sequenced and displayed for the user at a predetermined rate.
  • the '987 patent discloses a VR system having an image playback system for storing a plurality of images.
  • the playback system is operatively connected to a display and a speed sensor.
  • the speed sensor is attached to an exercise apparatus for measuring a speed of a user operating the exercise apparatus.
  • the display presents the plurality of images to the user at a rate determined by the speed measured by the speed sensor.
  • the invention provides a virtual reality (VR) system including a playback system having a storage device.
  • the storage device maintains a plurality of images where each image has a field-of-view defining an X direction and a Y direction.
  • An image viewing device is in communication with the playback system for displaying a portion of the plurality of images to a user.
  • a directional sensor is in communication with the playback system for defining a viewing direction of the user as both an X direction coordinate and a Y direction coordinate.
  • the playback system includes a controller in communication with the storage device for coordinating the portion of the plurality of images displayed by the image viewing device with the X and Y direction coordinates of the directional sensor. The controller recognizes a command from the user based on positioning of said directional sensor.
  • the invention also includes a method of operating the VR system.
  • the method includes the steps of maintaining a plurality of images where each image has a 360-degree field-of-view defining an X direction and a Y direction. A viewing direction of a user is determined as both an X direction coordinate and a Y direction coordinate.
  • the method also includes the step of displaying a portion of the plurality of images to the user based on the viewing direction.
  • the method further includes the steps of recognizing a command from the user based on positioning of the directional sensor and changing an operating aspect of the VR system based on the command.
  • the invention provides a VR system with a “hands-free” user interface that allows operation of the system without keyboards, mice, or other hand or foot operated devices. This allows a user to continue to operate the VR system, such as when exercising, without interruption.
  • the VR system also includes a forward and rearward moving device for defining a Z direction.
  • a speed sensor is operatively connected to the moving device and operatively communicating with the playback system for providing a rate of change of the plurality of images in the Z direction.
  • the controller is in communication with the storage device for simultaneously coordinating the X and Y directions of the directional sensor and the Z direction of the speed sensor.
  • the VR system also includes an inclination feedback device operatively connected to the moving device and in communication with the storage device for changing the operation of the forward and rearward moving device based upon the inclination data.
  • this aspect of the invention provides a VR system with a more realistic VR experience which simulates uphill and downhill movement coordinated with the plurality of images viewed by the user.
  • FIG. 1 is a block diagram of a preferred embodiment of a virtual reality system in accordance with the subject invention;
  • FIG. 2 is a perspective view of the preferred embodiment of the virtual reality system where a stationary bicycle is implemented as an exercise apparatus;
  • FIG. 3 is a forward-looking segment defining a first image and a portion of the first image viewed by a user;
  • FIG. 4 is a backward-looking segment of the first image and another portion of the first image viewed by the user;
  • FIG. 5 is a forward-looking segment defining a second image and a portion of the second image viewed by the user;
  • FIG. 6 is a forward-looking segment of a third image and a portion of the third image viewed by the user;
  • FIG. 7 is a forward-looking segment of a fourth image and a portion of the fourth image viewed by the user;
  • FIG. 8 is a forward-looking segment of a fifth image and a portion of the fifth image viewed by the user;
  • FIG. 9 is a forward-looking portion of an image viewed by the user showing an operator interface bar and an arrow indicating the viewing direction of the user;
  • FIG. 10 is a video options menu viewed by the user; and
  • FIG. 11 is an audio options menu viewed by the user.
  • the VR system 20 broadly includes an image playback system 22 and an image viewing device 24 in communication with each other.
  • the image playback system 22 includes a storage device 26 and a controller 28 in communication with each other.
  • the storage device 26 may be a hard disk drive, a digital versatile disc (DVD), a compact disc (CD), a Blu-ray disc (BD), a semiconductor-based memory device, or other suitable data storage device known to those skilled in the art.
  • the storage device 26 maintains a plurality of images. Each image has a field-of-view defining an X direction and a Y direction. In a preferred embodiment, the X direction field-of-view is defined as 360 degrees and the Y direction field-of-view is defined as 180 degrees.
  • the field-of-view of the X direction may be less than 360 degrees and the field-of-view of the Y direction could be less than 180 degrees.
  • the 360 degrees of the X direction and the 180 degrees of the Y direction represent a completely spherical image.
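The relationship between the X/Y field-of-view and the portion displayed can be sketched as a mapping from a viewing direction to a pixel window in an equirectangular frame. This is an illustrative Python sketch, not the patent's implementation: the frame resolution (10 pixels per degree) is an assumed value, while the 140-by-90-degree portion follows the preferred embodiment described later.

```python
# Hypothetical mapping from a viewing direction (x, y in degrees) to a pixel
# window in an equirectangular frame covering 360 x 180 degrees.
FRAME_W, FRAME_H = 3600, 1800        # assumed: 10 pixels per degree
PORTION_X_DEG, PORTION_Y_DEG = 140, 90  # preferred portion size from the text

def crop_window(x_deg, y_deg):
    """Top-left corner and size, in pixels, of the portion centered on (x, y)."""
    px_per_deg_x = FRAME_W / 360.0
    px_per_deg_y = FRAME_H / 180.0
    w = int(PORTION_X_DEG * px_per_deg_x)
    h = int(PORTION_Y_DEG * px_per_deg_y)
    cx = (x_deg % 360) * px_per_deg_x                 # X wraps around 360 degrees
    cy = max(0.0, min(180.0, y_deg)) * px_per_deg_y   # Y is clamped to 0..180
    left = int(cx - w / 2) % FRAME_W                  # wrap horizontally
    top = int(max(0, min(FRAME_H - h, cy - h / 2)))   # clamp vertically
    return left, top, w, h
```

Looking straight ahead at the horizon (x = 0, y = 90) yields a window centered on the seam of the wrap-around image, illustrating why the X direction must wrap while the Y direction clamps.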
  • the images are preferably generated using a camera with a 360-degree field-of-view lens.
  • One suitable lens is the “ParaMax360” produced by Panosmart of Antibes, France.
  • the images may alternatively be produced by several standard lenses and then combined to create the 360-degree field-of-view.
  • a second alternative is for the images to be computer generated. Those skilled in the art will realize other suitable lenses and alternatives to generate the 360-degree field-of-view images.
  • the plurality of images is compressed before storage. This allows an increased number of images to be stored on the storage device 26 .
  • the images are then decompressed before being displayed.
  • Several acceptable compression/decompression algorithms (codecs) are known to those skilled in the art. However, it is preferred that the XviD codec is implemented.
  • the XviD codec is open-source software available via the Internet at www.xvid.org.
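The patent specifies the XviD video codec; as a minimal stand-in to illustrate the compress-before-storage / decompress-before-display round trip, the sketch below uses Python's general-purpose zlib compressor. This substitution is an assumption for illustration only; a real system would use a video codec such as XviD, not zlib.

```python
import zlib

def store_image(raw: bytes) -> bytes:
    """Compress an image's bytes before writing them to the storage device."""
    return zlib.compress(raw, level=9)

def load_image(stored: bytes) -> bytes:
    """Decompress an image's bytes before handing them to the display."""
    return zlib.decompress(stored)
```

The round trip is lossless here; video codecs such as XviD are lossy but achieve far higher compression ratios on image sequences.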
  • position, elevation, and/or inclination data is associated with each image and stored on the storage device 26 along with the images. This data may be collected simultaneously while the camera generates the images by using a global positioning system (GPS) receiver (not shown).
  • the GPS receiver receives position and elevation data from a plurality of GPS satellites. Inclination data can be calculated based on changes in elevation between the images.
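Deriving inclination from changes in elevation between images, as described above, can be sketched as a percent-grade calculation over consecutive GPS samples. The data shape (distance-along-course, elevation) and the function name are illustrative assumptions.

```python
# Compute percent grade between consecutive GPS samples recorded alongside
# the images. samples: list of (distance_m, elevation_m) pairs.
def inclinations(samples):
    """Return the percent grade between each consecutive pair of samples."""
    grades = []
    for (d0, e0), (d1, e1) in zip(samples, samples[1:]):
        run = d1 - d0
        grades.append(100.0 * (e1 - e0) / run if run else 0.0)
    return grades
```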
  • the image viewing device 24 communicates with the image playback system 22 .
  • the image viewing device 24 displays a portion 30 of the plurality of images to a user.
  • the image viewing device 24 is further defined as a pair of display glasses (not separately numbered) worn on the head of the user.
  • the portion 30 of the plurality of images displayed by the display glasses 24 is preferably 140 degrees in the X direction and 90 degrees in the Y direction.
  • display glasses 24 with alternate dimensional configurations are also possible.
  • An example of suitable display glasses 24 is the “i-glasses SVGA Pro” model manufactured by i-O Display systems, LLC, a division of Ilixco, Inc., both of Menlo Park, Calif.
  • a variety of suitable display glasses 24 exists and could be implemented in the VR system 20 .
  • the image viewing device 24 could be a flat or curved screen or monitor positioned in front of and/or about the user.
  • the VR system also preferably includes a forward and rearward moving device 34 .
  • the forward and rearward moving device 34 is an exercise apparatus for allowing the user to exercise.
  • the exercise apparatus is illustrated in FIG. 2 as a stationary bicycle.
  • a different type of exercise apparatus could be implemented, including, but not limited to, a treadmill, a stair climber, or an elliptical trainer.
  • Those skilled in the art will also realize that other types of moving devices 34 could be utilized in the subject invention without deviating from the scope of the subject invention.
  • the image playback system 22 , image viewing device 24 , and forward and rearward moving device 34 communicate with each other across one or more wireless interfaces (not shown).
  • the wireless interfaces operate using radio waves.
  • the wireless interfaces utilize Bluetooth® technology as described by the Bluetooth Special Interest Group headquartered in Overland Park, Kans.
  • Other radio wave interfaces, such as 802.11, PCS, etc. may also be implemented.
  • the wireless interfaces operate using frequencies in the optical band, such as the infrared standards developed by the Infrared Data Association (IRDA) of Walnut Creek, Calif.
  • the communication between the image playback system 22 and the other components of the VR system 20 is accomplished using a hardwired interface.
  • This hardwired interface may involve transmission of electrons over a conductive wire or pulses of light over a fiber-optic cable.
  • the VR system includes a directional sensor 38 .
  • the directional sensor 38 communicates with the image playback system 22 .
  • the directional sensor 38 defines a viewing direction of the user in both of the X and Y directions. That is, the directional sensor 38 provides an X direction coordinate and a Y direction coordinate corresponding to the viewing direction of the user.
  • the directional sensor 38 is attached to the display glasses 24 . This allows a portion of the plurality of images displayed by the display glasses 24 to change in the X and Y directions as the user moves the display glasses 24 by moving his or her head.
  • the directional sensor 38 preferably includes a wireless interface for communicating with the playback system 22 .
  • a suitable directional sensor 38 is the “InterTrax2” manufactured by InterSense of Burlington, Mass.
  • any suitable directional sensor 38 may be used, including sensors that monitor the viewing direction and movement of human eyes.
  • the directional sensor 38 could be mounted directly to the head and/or different areas of the user.
  • a speed sensor 40 is provided.
  • the speed sensor 40 is operatively connected to the forward and rearward moving device 34 such that the forward and rearward moving device 34 defines the Z direction.
  • the speed sensor 40 is also in communication with the image playback system 22 to provide a rate of change of the plurality of images in the Z direction.
  • the speed sensor 40 is operatively connected to a rotating wheel, pedal crank, or similar part of the stationary bicycle 36 .
  • the speed sensor 40 preferably includes a wireless interface for wirelessly communicating with the playback system 22 .
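A hypothetical conversion from the speed sensor's measurement on a rotating wheel or pedal crank to an image-advance rate in the Z direction is sketched below. The wheel circumference and the images-per-meter density are assumed values, not figures from the patent.

```python
WHEEL_CIRCUMFERENCE_M = 2.1   # assumed road-wheel circumference in meters
IMAGES_PER_METER = 0.5        # assumed spacing of the recorded images

def images_per_second(wheel_rpm: float) -> float:
    """Translate wheel revolutions per minute into images advanced per second."""
    meters_per_second = wheel_rpm / 60.0 * WHEEL_CIRCUMFERENCE_M
    return meters_per_second * IMAGES_PER_METER
```

At 60 rpm the user covers 2.1 m/s of simulated road, so the playback system advances roughly one image per second; when the user stops pedaling, the rate drops to zero and the image progression stops.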
  • FIGS. 3 and 4 one of the images of the plurality of 360-degree field-of-view images is illustrated and is defined as a first image 42 .
  • a forward-looking segment 44 of the first image 42 , defined by the X direction between 0 and 180 degrees, is shown in FIG. 3 .
  • a portion 30 of the image viewed by the image viewing device 24 is shown with a broken line. The portion 30 of the image is illustrated as a rectangle but could be of any suitable shape or size.
  • FIG. 4 illustrates a backward-looking segment 46 , defined by the X-direction between 180 and 360 degrees.
  • the controller 28 simultaneously coordinates the X and Y directions of the directional sensor 38 and the Z direction of the speed sensor 40 .
  • the viewing direction and the rate of change are interlaced to automatically change the portion 30 of the plurality of images displayed by the image viewing device 24 in the X, Y, and Z directions when the user moves the directional sensor 38 in at least one of the X and Y directions and simultaneously moves the moving device 34 in the Z direction.
  • the controller 28 automatically changes the portion 30 of the images displayed by the image viewing device 24 throughout the 360-degree field-of-view.
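The controller's simultaneous coordination of the directional sensor's X/Y coordinates with the speed sensor's Z rate can be sketched as a single update step. All names, and the fractional-index bookkeeping for smooth advancement, are illustrative assumptions rather than the patent's implementation.

```python
# One time slice of the controller's coordination: the speed sensor advances
# the image index in Z while the directional sensor's (x, y) coordinates
# choose which portion of the current image is displayed.
def step(state, x_deg, y_deg, z_rate_images_per_s, dt, num_images):
    """Advance the playback state by one time slice of dt seconds."""
    state["z"] = min(num_images - 1, state["z"] + z_rate_images_per_s * dt)
    image_index = int(state["z"])                      # which stored image to decode
    portion = (x_deg % 360, max(0, min(180, y_deg)))   # which window to crop
    return image_index, portion
```

If the user stops moving (a Z rate of zero), the index freezes while the portion continues to follow head movement, matching the behavior described for the stationary bicycle.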
  • the preferred embodiment of the image playback system 22 also includes a frame buffer (not shown).
  • the frame buffer receives the portion 30 of the plurality of images and retransmits the portion 30 at a constant frame rate. This retransmission at a constant frame rate prevents “slow motion” or blurry images from being received by the user.
  • the frame buffer may be implemented as software within the controller 28 or as a separate hardware module within the image playback system 22 .
  • the constant frame rate may be at 30 frames/second, which is the standard frame rate of the National Television Standards Committee (NTSC).
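A minimal frame-buffer sketch follows: however irregularly source frames arrive, the buffer re-emits the most recent frame at the constant output rate (e.g. 30 frames/second), repeating the last frame when no new one has arrived. The class shape is purely illustrative.

```python
class FrameBuffer:
    """Re-emit the latest frame at a constant rate to avoid slow-motion output."""

    def __init__(self):
        self.last_frame = None

    def push(self, frame):
        """Called whenever the playback system produces a new portion."""
        self.last_frame = frame

    def tick(self):
        """Called at the constant output rate; always returns a frame,
        repeating the previous one if no new frame has been pushed."""
        return self.last_frame
```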
  • FIGS. 5 through 8 illustrate the coordination between the X and Y directions of the directional sensor 38 and the Z direction of the speed sensor 40 .
  • the plurality of images are advanced in the Z direction. The first image 42 , shown in FIG. 3 , will advance to a second image 48 ( FIG. 5 ), a third image 50 ( FIG. 6 ), a fourth image 52 ( FIG. 7 ), and a fifth image 54 ( FIG. 8 ).
  • the user does not see the entirety of each of the images 42 , 48 , 50 , 52 , 54 .
  • the user views the portion 30 of each image 42 , 48 , 50 , 52 , 54 as shown by the broken line.
  • the directional sensor 38 sends a signal to the controller 28 which, in turn, changes the portion 30 that is viewed by the user.
  • the user can turn his or her head to the right or left, i.e. in the X direction, or up and down, i.e., in the Y direction, to view the objects in the X direction's 360-degree field-of-view.
  • the user looking relatively straight ahead in FIG. 5 can turn his or her head to the right to get a better view of the lake to the right of the road as shown in FIG. 6 .
  • the user can look up, as shown in FIG. 7 , or down, as shown in FIG. 8 , to focus on an approaching vehicle. If the user stops pedaling the stationary bicycle 36 , the progression of images stops. The user may then also turn his or her head to get a better look at any objects in the stationary 360-degree field-of-view of the X direction.
  • the VR system 20 preferably is capable of providing audio to the user as well.
  • the storage device 26 preferably includes a plurality of audio files. These audio files are preferably music. However, alternatively, these audio files may correspond to the video images being displayed to the user. For example, the audio files may be recorded during the recording of the video images such that environmental noise (e.g. passing cars, nature, etc.) may correspond to the video images.
  • the VR system 20 also includes at least one speaker 56 in communication with the playback system.
  • the at least one speaker 56 receives an audio signal from the storage device 26 and converts the audio signal into sound waves.
  • the at least one speaker 56 is implemented as a pair of stereo headphones (not separately numbered). Of course, those skilled in the art realize other techniques to implement the at least one speaker 56 .
  • a headset 57 supports the display glasses 24 , the directional sensor 38 , and the headphone speakers 56 .
  • This headset 57 is preferably in wireless communication with the playback system 22 such that data may be communicated back and forth between the headset 57 and the playback system 22 .
  • the controller 28 of the playback system 22 also recognizes and receives commands from the user. Preferably, the controller 28 recognizes these commands based on positioning of the directional sensor 38 . Specifically, in the preferred embodiment, the controller 28 recognizes a command from the user when the viewing direction of the user corresponds to a predetermined area of X and Y directional coordinates for a predetermined period of time. The controller 28 then changes an operating aspect of the VR system 20 based on the command. For example, FIG. 9 illustrates a portion 30 of an image displayed by the image viewing device 24 . The image viewing device also displays an operator interface bar 58 and an arrow 60 indicating the viewing direction of the user. Preferably, the operator interface bar 58 is hidden from the view of the user, but appears depending on the viewing direction of the user. Alternatively, the operator interface bar 58 may be continuously visible to the user.
  • the user can select to open a menu 62 by “holding” the arrow 60 on the “M” of the operator interface bar 58 for a short period of time (e.g., 1.5 seconds). Said another way, the user opens the menu 62 by moving his or her head such that the arrow 60 rests on the “M” for the short period of time.
  • the controller 28 may recognize the command based on a predetermined change in position, or motion, of the viewing direction of the user.
  • the predetermined change in position could be achieved by the user quickly nodding his or her head, thus quickly changing the viewing direction received by the directional sensor 38 .
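The dwell-based command recognition described above can be sketched as follows: a command fires when the viewing direction stays inside a predetermined area of X and Y coordinates for a predetermined period of time (e.g., 1.5 seconds). The class shape and reset-after-fire behavior are illustrative assumptions.

```python
class DwellCommand:
    """Recognize a command when gaze dwells in a target region long enough."""

    def __init__(self, area, hold_s=1.5):
        self.x0, self.x1, self.y0, self.y1 = area  # target region in degrees
        self.hold_s = hold_s
        self.elapsed = 0.0

    def update(self, x_deg, y_deg, dt):
        """Accumulate dwell time; return True once the command triggers."""
        if self.x0 <= x_deg <= self.x1 and self.y0 <= y_deg <= self.y1:
            self.elapsed += dt
        else:
            self.elapsed = 0.0          # gaze left the region; start over
        if self.elapsed >= self.hold_s:
            self.elapsed = 0.0          # reset so the command fires once
            return True
        return False
```

The same mechanism serves both the operator interface bar and the menu selections, which is what makes the interface fully hands-free.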
  • the menu 62 is preferably a hierarchical-type menu having a plurality of sub-menus.
  • FIG. 10 shows a video options menu 62 .
  • the video options menu 62 includes a selectable display of different courses (or chapters), a selector to change the brightness of the image viewing device 24 , a selector to remap the position of (i.e., center) the directional sensor 38 , and a variable selector to synchronize the speed sensor 40 to the number of video images displayed. Selection again occurs by the user simply moving the arrow 60 to a desired selection (by moving his or her head) and “holding” the arrow 60 there for a short period of time.
  • the audio options menu 62 allows the user to select the audio file or files to be played via the speaker 56 .
  • the volume of the audio may also be adjusted via the audio options menu 62 .
  • those skilled in the art will realize other menus and sub-menus that could be implemented to allow the user further control over the VR system 20 .
  • the technique of recognizing and receiving commands from the user as described above is ideal for the VR system 20 since the user does not need to use his or her hands to send a command. Instead, the VR system 20 of the subject invention provides a completely “hands-free” user interface. The user need not remove the display glasses 24 or otherwise interrupt their exercise routine in order to change video images, audio selection, or other features of the VR system 20 .
  • the VR system preferably includes an inclination feedback device 68 .
  • the inclination feedback device 68 is in communication with the storage device 26 and receives inclination data from the storage device 26 .
  • the inclination feedback device 68 is also operatively connected to the forward and rearward moving device 34 for changing the operation of the moving device 34 based on the inclination data. Said another way, the inclination feedback device 68 changes the “feel” of the moving device 34 based on the inclination data, providing a more realistic VR experience.
  • the inclination feedback device 68 may change the tension on the pedals that is felt by the user. When images depict going uphill, the tension increases; when images depict going downhill, the tension decreases.
  • the inclination feedback device 68 may change the inclination of the treadmill's moving belt. As such, the inclination of the belt may be adjusted to match the inclination of the current image as stored in the inclination data.
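The inclination feedback described above can be sketched as a mapping from the stored inclination data to a resistance setting: uphill grades raise the pedal tension, downhill grades lower it. The baseline, gain, and clamping limits are assumed illustrative values, not figures from the patent.

```python
def pedal_tension(grade_percent, baseline=50.0, gain=5.0,
                  lo=10.0, hi=100.0):
    """Return a resistance setting for the current image's percent grade,
    clamped so descents never remove all resistance."""
    return max(lo, min(hi, baseline + gain * grade_percent))
```

For a treadmill, the same lookup would drive the belt's inclination angle instead of a resistance value.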
  • the image playback system 22 is preferably in communication with a network 64 .
  • the network 64 may be a local area network (LAN), a wide area network (WAN), the Internet, or other network known to those skilled in the art.
  • a server 66 is also in communication with the network 64 , such that the image playback system 22 and the server 66 are also in communication with one another.
  • the server 66 includes various video images and files, audio files, and/or associated data. These images, files, and/or data may be transmitted from the server to the playback system 22 for use by the user. Therefore, the playback system 22 need not require the user to change DVDs or CDs in order to travel a different course or listen to a different selection of music.
  • the server 66 is in communication with a plurality of playback systems 22 .
  • the server 66 facilitates competitions, i.e., races, between the users of the various exercise apparatus 34 .
  • the server 66 receives speed sensor 40 data from each exercise apparatus 34 .
  • the server 66 can then determine a winner based on the distance traveled for a certain period of time. Alternatively, other criteria may be used to determine the winner, such as, but not limited to, the highest average speed.
  • the server 66 may provide feedback to each user based on the performance of the other users.
  • each image viewing device 24 may show a representation of a race course corresponding to the images shown to the user with a position of each user on the race course. Further, a lap time of each other user may be shown. Further, each image viewing device 24 may display a representation of each user superimposed on the plurality of images. This allows the user to see when he or she is passing another user or is being passed by another user.
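The server's distance-based ranking of competing users can be sketched as integrating each user's reported speed over the session. The data shape (user mapped to a list of speed and duration samples) is an assumption for illustration.

```python
# Rank competing users by total distance traveled, computed from the speed
# sensor data each exercise apparatus reports to the server.
def rank_users(speed_logs):
    """speed_logs: {user: [(speed_m_s, duration_s), ...]}.
    Returns (user, distance_m) pairs, farthest first."""
    totals = {user: sum(v * dt for v, dt in samples)
              for user, samples in speed_logs.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

Ranking by highest average speed instead would divide each total by the summed durations before sorting.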
  • the VR system 20 may also present various other content to the user, besides the plurality of images described above.
  • This other content includes, but is not limited to, Internet data such as webpages and email, television programs, radio stations, and DVD content such as movies.

Abstract

A virtual reality (VR) system includes an image playback system that sends images to an image viewing device, such as a pair of display glasses. Each image has a 360-degree field of view. A user views a portion of the images. The portion of the image viewed is determined by a directional sensor mounted to the display glasses. The images are advanced according to a speed sensor attached to a moving device, such as a stationary bicycle. The VR system simultaneously coordinates the portion of the images viewed by the user by coordinating signals from the directional sensor and the speed sensor. The user may also give commands to the playback system based on the positioning of the directional sensor.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This patent application is a continuation-in-part of co-pending patent application Ser. No. 10/792,593, filed on Mar. 3, 2004, now U.S. Pat. No. 7,224,326.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The subject invention relates generally to virtual reality (VR) systems. The invention relates specifically to VR systems coupled with an exercise apparatus.
  • 2. Description of the Related Art
  • Various VR systems are well known in the prior art in which a user views a plurality of images. Two such VR systems are disclosed in U.S. Pat. Nos. 5,499,146 (the '146 patent) to Donahe et al. and 6,244,987 (the '987 patent) to Ohsuga et al.
  • The '146 patent discloses a VR system having an image playback system for storing a plurality of images having a 360-degree field-of-view. The images are previously recorded using a plurality of video cameras and electronically “stitched” together to create the images with the 360-degree field-of-view. The playback system is operatively connected to a display and a directional sensor. The display and directional sensor are mounted to a helmet that is worn by a user. The display shows a portion of each image based on the position of the helmet, as measured by the directional sensor. The plurality of images are sequenced and displayed for the user at a predetermined rate.
  • The '987 patent discloses a VR system having an image playback system for storing a plurality of images. The playback system is operatively connected to a display and a speed sensor. The speed sensor is attached to an exercise apparatus for measuring a speed of a user operating the exercise apparatus. The display presents the plurality of images to the user at a rate determined by the speed measured by the speed sensor.
  • Although these systems may provide some advantages over other systems, there remains an opportunity for a VR system that provides a more realistic environment of 360-degree images that are dynamically viewed by a user. There also remains an opportunity for a VR system having a “hands-free” operator interface to change the parameters of the playback of the images.
  • BRIEF SUMMARY OF THE INVENTION AND ADVANTAGES
  • The invention provides a virtual reality (VR) system including a playback system having a storage device. The storage device maintains a plurality of images where each image has a field-of-view defining an X direction and a Y direction. An image viewing device is in communication with the playback system for displaying a portion of the plurality of images to a user. A directional sensor is in communication with the playback system for defining a viewing direction of the user as both an X direction coordinate and a Y direction coordinate. The playback system includes a controller in communication with the storage device for coordinating the portion of the plurality of images displayed by the image viewing device with the X and Y direction coordinates of the directional sensor. The controller recognizes a command from the user based on positioning of said directional sensor.
  • The invention also includes a method of operating the VR system. The method includes the steps of maintaining a plurality of images where each image has a 360-degree field-of-view defining an X direction and a Y direction. A viewing direction of a user is determined as both an X direction coordinate and a Y direction coordinate. The method also includes the step of displaying a portion of the plurality of images to the user based on the viewing direction. The method further includes the steps of recognizing a command from the user based on positioning of the directional sensor and changing an operating aspect of the VR system based on the command.
  • Accordingly, the invention provides a VR system with a “hands-free” user interface that allows operation of the system without keyboards, mice, or other hand or foot operated devices. This allows a user to continue to operate the VR system, such as when exercising, without interruption.
  • Another aspect of the invention provides the VR system described above wherein the storage device also maintains inclination data corresponding to the images. The VR system also includes a forward and rearward moving device for defining a Z direction. A speed sensor is operatively connected to the moving device and operatively communicating with the playback system for providing a rate of change of the plurality of images in the Z direction. The controller is in communication with the storage device for simultaneously coordinating the X and Y directions of the directional sensor and the Z direction of the speed sensor. The VR system also includes an inclination feedback device operatively connected to the moving device and in communication with the storage device for changing the operation of the forward and rearward moving device based upon the inclination data.
  • Accordingly, this aspect of the invention provides a VR system with a more realistic VR experience which simulates uphill and downhill movement coordinated with the plurality of images viewed by the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Other advantages of the present invention will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
  • FIG. 1 is a block diagram of a preferred embodiment of a virtual reality system in accordance with the subject invention;
  • FIG. 2 is a perspective view of the preferred embodiment of the virtual reality system where a stationary bicycle is implemented as an exercise apparatus;
  • FIG. 3 is a forward-looking segment defining a first image and a portion of the first image viewed by a user;
  • FIG. 4 is a backward-looking segment of the first image and another portion of the first image viewed by the user;
  • FIG. 5 is a forward-looking segment defining a second image and a portion of the second image viewed by the user;
  • FIG. 6 is a forward-looking segment of a third image and a portion of the third image viewed by the user;
  • FIG. 7 is a forward-looking segment of a fourth image and a portion of the fourth image viewed by the user;
  • FIG. 8 is a forward-looking segment of a fifth image and a portion of the fifth image viewed by the user;
  • FIG. 9 is a forward-looking portion of an image viewed by the user showing an operator interface bar and an arrow indicating the viewing direction of the user;
  • FIG. 10 is a video options menu viewed by the user; and
  • FIG. 11 is an audio options menu viewed by the user.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Referring to the Figures, wherein like numerals indicate like parts throughout the several views, a virtual reality (VR) system is shown at 20 in FIG. 1. The VR system 20 broadly includes an image playback system 22 and an image viewing device 24 in communication with each other.
  • The image playback system 22 includes a storage device 26 and a controller 28 in communication with each other. The storage device 26 may be a hard disk drive, a digital versatile disc (DVD), a compact disc (CD), a Blu-ray disc (BD), a semiconductor-based memory device, or other suitable data storage device known to those skilled in the art. The storage device 26 maintains a plurality of images. Each image has a field-of-view defining an X direction and a Y direction. In a preferred embodiment, the X direction field-of-view is defined as 360 degrees and the Y direction field-of-view is defined as 180 degrees. However, those skilled in the art appreciate the field-of-view of the X direction may be less than 360 degrees and the field-of-view of the Y direction could be less than 180 degrees. The 360 degrees of the X direction and the 180 degrees of the Y direction represent a completely spherical image.
  • The images are preferably generated using a camera with a 360-degree field-of-view lens. One suitable lens is the “ParaMax360” produced by Panosmart of Antibes, France. In a first alternative, the images may be produced by several standard lenses then combined to create the 360-degree field-of-view. A second alternative is for the images to be computer generated. Those skilled in the art realize other suitable lenses and alternatives to generate the 360-degree field-of-view images.
  • In the preferred embodiment, the plurality of images is compressed before storage. This allows an increased amount of images to be stored on the storage device 26. The images are then decompressed before being displayed. Several acceptable compression/decompression algorithms (Codecs) are known to those skilled in the art. However, it is preferred that the XviD codec is implemented. The XviD codec is open-source software available via the Internet at www.xvid.org.
  • In the preferred embodiment, position, elevation, and/or inclination data is associated with each image and stored on the storage device 26 along with the images. This data may be collected simultaneously while the camera generates the images by using a global positioning system (GPS) receiver (not shown). The GPS receiver receives position and elevation data from a plurality of GPS satellites. Inclination data can be calculated based on changes in elevation between the images.
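The inclination calculation described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the grade formula, the function names, and the treatment of the first image (which has no predecessor) are all assumptions.

```python
def inclination_percent(elev_prev_m, elev_curr_m, dist_m):
    """Grade (%) between two consecutive image-capture points."""
    if dist_m <= 0:
        raise ValueError("distance between GPS fixes must be positive")
    return 100.0 * (elev_curr_m - elev_prev_m) / dist_m


def tag_images_with_inclination(gps_track):
    """gps_track: one (elevation_m, distance_from_previous_fix_m) tuple
    per image. Returns one grade value per image; the first image, which
    has no predecessor, reuses the grade of the second."""
    grades = [
        inclination_percent(gps_track[i - 1][0], gps_track[i][0], gps_track[i][1])
        for i in range(1, len(gps_track))
    ]
    return ([grades[0]] + grades) if grades else [0.0]
```

The per-image grades produced this way would be stored on the storage device 26 alongside the images, ready for the inclination feedback device described later.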
  • As stated above, the image viewing device 24 communicates with the image playback system 22. The image viewing device 24 displays a portion 30 of the plurality of images to a user. In the preferred embodiment, the image viewing device 24 is further defined as a pair of display glasses (not separately numbered) worn on the head of the user. The portion 30 of the plurality of images displayed by the display glasses 24 is preferably 140 degrees in the X direction and 90 degrees in the Y direction. Those skilled in the art realize that display glasses 24 with alternate dimensional configurations are also possible. An example of suitable display glasses 24 is the “i-glasses SVGA Pro” model manufactured by i-O Display systems, LLC, a division of Ilixco, Inc., both of Menlo Park, Calif. However, a variety of suitable display glasses 24 exists and could be implemented in the VR system 20. Further, the image viewing device 24 could be a flat or curved screen or monitor positioned in front of and/or about the user.
  • The VR system also preferably includes a forward and rearward moving device 34. In the preferred embodiment, the forward and rearward moving device 34 is an exercise apparatus for allowing the user to exercise. The exercise apparatus is illustrated in FIG. 2 as a stationary bicycle. However, a different type of exercise apparatus could be implemented, including, but not limited to, a treadmill, a stair climber, or an elliptical trainer. Those skilled in the art will also realize that other types of moving devices 34 could be utilized in the subject invention without deviating from the scope of the subject invention.
  • Preferably, the image playback system 22, image viewing device 24, and forward and rearward moving device 34 communicate with each other across one or more wireless interfaces (not shown). There may be other wireless interfaces for communicating among other components in the VR system 20. In the preferred embodiment, the wireless interfaces operate using radio waves. Preferably, the wireless interfaces utilize Bluetooth® technology as described by the Bluetooth Special Interest Group headquartered in Overland Park, Kans. Other radio wave interfaces, such as 802.11, PCS, etc., may also be implemented. In a first alternative embodiment, the wireless interfaces operate using frequencies in the optical band, such as the infrared standards developed by the Infrared Data Association (IrDA) of Walnut Creek, Calif. In a second alternative embodiment, the communication between the image playback system 22 and the other components of the VR system 20 is accomplished using a hardwired interface. This hardwired interface may involve transmission of electrons over a conductive wire or pulses of light over a fiber-optic cable.
  • In order to specifically monitor the movement of the user in the X and Y directions, the VR system includes a directional sensor 38. The directional sensor 38 communicates with the image playback system 22. In particular, the directional sensor 38 defines a viewing direction of the user in both of the X and Y directions. That is, the directional sensor 38 provides an X direction coordinate and a Y direction coordinate corresponding to the viewing direction of the user. In the preferred embodiment, the directional sensor 38 is attached to the display glasses 24. This allows a portion of the plurality of images displayed by the display glasses 24 to change in the X and Y directions as the user moves the display glasses 24 by moving his or her head. The directional sensor 38 preferably includes a wireless interface for communicating with the playback system 22. An example of a suitable directional sensor 38 is the “InterTrax2” manufactured by InterSense of Burlington, Mass. As appreciated by those skilled in the art, any suitable directional sensor 38 may be used, including sensors that monitor the viewing direction and movement of human eyes. Furthermore, in the embodiment where the image viewing device 24 is a screen or monitor, the directional sensor 38 could be mounted directly to the head and/or different areas of the user.
  • In order to monitor the movement of the user in a Z direction, a speed sensor 40 is provided. The speed sensor 40 is operatively connected to the forward and rearward moving device 34 such that the forward and rearward moving device 34 defines the Z direction. The speed sensor 40 is also in communication with the image playback system 22 to provide a rate of change of the plurality of images in the Z direction. In the embodiment illustrated by FIG. 2, the speed sensor 40 is operatively connected to a rotating wheel, pedal crank, or similar part of the stationary bicycle 36. The speed sensor 40 preferably includes a wireless interface for wirelessly communicating with the playback system 22.
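The speed sensor's role can be summarized with a small sketch: converting a wheel-revolution reading into the rate at which the stored images advance in the Z direction. This is an illustrative assumption about the conversion; the patent does not specify a formula, and the per-image spacing parameter is hypothetical.

```python
def image_advance_rate(wheel_circumference_m, revs_per_s, meters_per_image):
    """Convert the speed sensor's wheel-revolution reading into the rate
    (images/second) at which the stored 360-degree images should advance
    in the Z direction, assuming the images were captured at a fixed
    spacing of meters_per_image along the route."""
    speed_m_per_s = wheel_circumference_m * revs_per_s
    return speed_m_per_s / meters_per_image
```

For instance, a 2 m wheel turning 3 times per second on images captured every 0.5 m would advance 12 images per second; when the user stops pedaling, the rate drops to zero and the image progression halts.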
  • Referring to FIGS. 3 and 4, one of the images of the plurality of 360-degree field-of-view images is illustrated and is defined as a first image 42. A forward-looking segment 44 of the first image 42, defined by the X-direction between 0 and 180 degrees, is shown in FIG. 3. A portion 30 of the image viewed by the image viewing device 24 is shown with a broken line. The portion 30 of the image is illustrated as a rectangle but could be of any suitable shape or size. FIG. 4 illustrates a backward-looking segment 46, defined by the X-direction between 180 and 360 degrees.
  • In the preferred embodiment, the controller 28 simultaneously coordinates the X and Y directions of the directional sensor 38 and the Z direction of the speed sensor 40. The viewing direction and the rate of change are interlaced to automatically change the portion 30 of the plurality of images displayed by the image viewing device 24 in the X, Y, and Z directions when the user moves the directional sensor 38 in at least one of the X and Y directions and simultaneously moves the moving device 34 in the Z direction. In particular, the controller 28 automatically changes the portion 30 of the images displayed by the image viewing device 24 throughout the 360-degree field-of-view.
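The mapping from the directional sensor's X and Y coordinates to a crop of the stored image can be sketched as below, assuming the 360-by-180-degree image is stored as an equirectangular frame and the viewport is the preferred 140-by-90-degree portion. The pixel-mapping details are assumptions for illustration, not the patent's implementation.

```python
def viewport_pixels(yaw_deg, pitch_deg, img_w, img_h,
                    fov_x_deg=140.0, fov_y_deg=90.0):
    """Map a viewing direction (yaw 0..360, pitch -90..90) onto a crop of
    an equirectangular frame spanning 360 degrees wide and 180 tall.

    Returns (x0, y0, x1, y1) pixel bounds; x wraps around the 360-degree
    seam, so the caller stitches two slices whenever x1 < x0."""
    px_per_deg_x = img_w / 360.0
    px_per_deg_y = img_h / 180.0
    x0 = int((yaw_deg - fov_x_deg / 2) % 360 * px_per_deg_x)
    x1 = int((yaw_deg + fov_x_deg / 2) % 360 * px_per_deg_x)
    # pitch 0 = horizon = vertical centre of the frame
    y_center = (90.0 - pitch_deg) * px_per_deg_y
    y0 = max(0, int(y_center - fov_y_deg / 2 * px_per_deg_y))
    y1 = min(img_h, int(y_center + fov_y_deg / 2 * px_per_deg_y))
    return x0, y0, x1, y1
```

As the directional sensor reports new yaw and pitch values, the controller would recompute this crop on every frame, while the speed sensor simultaneously selects which image in the Z-direction sequence the crop is taken from.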
  • The preferred embodiment of the image playback system 22 also includes a frame buffer (not shown). The frame buffer receives the portion 30 of the plurality of images and retransmits the portion 30 at a constant frame rate. This retransmission at a constant frame rate prevents “slow motion” or blurry images from being received by the user. The frame buffer may be implemented as software within the controller 28 or as a separate hardware module within the image playback system 22. For example, the constant frame rate may be 30 frames/second, the standard frame rate of the National Television System Committee (NTSC).
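The frame buffer's behavior can be sketched as a resampler that, at each tick of a fixed output clock, emits the most recently arrived source frame, repeating it when the source stalls. This is a minimal illustrative model of the "constant frame rate" idea, not the patent's implementation.

```python
def resample_to_constant_rate(frame_times, frames, out_fps, duration_s=None):
    """At each output tick (1/out_fps seconds apart), emit the latest
    source frame that has arrived by that time. Repeating the last frame
    when the source stalls keeps playback from dropping into 'slow
    motion'.

    frame_times: monotonically increasing arrival times in seconds,
    one per entry in frames."""
    if duration_s is None:
        duration_s = frame_times[-1]
    out = []
    idx = 0
    tick = 0
    while tick / out_fps <= duration_s:
        t = tick / out_fps
        # advance to the newest frame that has already arrived
        while idx + 1 < len(frames) and frame_times[idx + 1] <= t:
            idx += 1
        out.append(frames[idx])
        tick += 1
    return out
```

A real implementation would run this against a live clock rather than precomputed timestamps, but the selection logic is the same.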
  • FIGS. 5 through 8 illustrate the coordination between the X and Y directions of the directional sensor 38 and the Z direction of the speed sensor 40. For simplification of illustration, only the forward-looking portions of the images are shown. Several of the plurality of images that would be present between FIGS. 5 and 6, between FIGS. 6 and 7, etc., are omitted to further simplify the illustration.
  • As the user operates the stationary bicycle 36 of the preferred embodiment, the plurality of images are advanced in the Z direction. The first image 42, shown in FIG. 3, will advance to a second image 48, shown in FIG. 5, which will advance to a third image 50, shown in FIG. 6, to a fourth image 52, shown in FIG. 7, and finally to a fifth image 54, shown in FIG. 8. The user, however, does not see the entirety of each of the images 42, 48, 50, 52, 54. Instead, the user views the portion 30 of each image 42, 48, 50, 52, 54 as shown by the broken line. As the user moves his or her head, the directional sensor 38 sends a signal to the controller 28 which, in turn, changes the portion 30 that is viewed by the user.
  • As the images are advanced, the user can turn his or her head to the right or left, i.e., in the X direction, or up and down, i.e., in the Y direction, to view the objects in the X direction's 360-degree field-of-view. For example, the user looking relatively straight ahead in FIG. 5 can turn his or her head to the right to get a better view of the lake to the right of the road as shown in FIG. 6. Also, the user can look up, as shown in FIG. 7, or down, as shown in FIG. 8, to focus on an approaching vehicle. If the user stops pedaling the stationary bicycle 36, the progression of images stops. The user may then also turn his or her head to get a better look at any objects in the stationary 360-degree field-of-view of the X direction.
  • In addition to providing video images to the user, the VR system 20 preferably is capable of providing audio to the user as well. To this end, the storage device 26 preferably includes a plurality of audio files. These audio files are preferably music. However, alternatively, these audio files may correspond to the video images being displayed to the user. For example, the audio files may be recorded during the recording of the video images such that environmental noise (e.g. passing cars, nature, etc.) may correspond to the video images.
  • The VR system 20 also includes at least one speaker 56 in communication with the playback system. The at least one speaker 56 receives an audio signal from the storage device 26 and converts the audio signal into sound waves. In the preferred embodiment, the at least one speaker 56 is implemented as a pair of stereo headphones (not separately numbered). Of course, those skilled in the art realize other techniques to implement the at least one speaker 56.
  • In the preferred embodiment, a headset 57 supports the display glasses 24, the directional sensor 38, and the headphone speakers 56. This headset 57 is preferably in wireless communication with the playback system 22 such that data may be communicated back and forth between the headset 57 and the playback system 22.
  • The controller 28 of the playback system 22 also recognizes and receives commands from the user. Preferably, the controller 28 recognizes these commands based on positioning of the directional sensor 38. Specifically, in the preferred embodiment, the controller 28 recognizes a command from the user when the viewing direction of the user corresponds to a predetermined area of X and Y directional coordinates for a predetermined period of time. The controller 28 then changes an operating aspect of the VR system 20 based on the command. For example, FIG. 9 illustrates a portion 30 of an image displayed by the image viewing device 24. The image viewing device also displays an operator interface bar 58 and an arrow 60 indicating the viewing direction of the user. Preferably, the operator interface bar 58 is hidden from the view of the user, but appears depending on the viewing direction of the user. Alternatively, the operator interface bar 58 may be continuously visible to the user.
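The dwell-based command recognition just described — the viewing direction staying within a predetermined area of X and Y coordinates for a predetermined time — can be sketched as follows. The class name, region representation, and 1.5-second default are illustrative assumptions, not details from the patent.

```python
class DwellSelector:
    """Fires a command when the gaze (X, Y) stays inside a target region
    for hold_s seconds -- the 'hands-free' selection described above."""

    def __init__(self, region, hold_s=1.5):
        self.region = region          # (x_min, x_max, y_min, y_max) in degrees
        self.hold_s = hold_s
        self._entered_at = None

    def update(self, x_deg, y_deg, t_s):
        """Feed one directional-sensor sample; returns True once the gaze
        has dwelt in the region long enough to count as a command."""
        x_min, x_max, y_min, y_max = self.region
        inside = x_min <= x_deg <= x_max and y_min <= y_deg <= y_max
        if not inside:
            self._entered_at = None  # leaving the region resets the timer
            return False
        if self._entered_at is None:
            self._entered_at = t_s
        return t_s - self._entered_at >= self.hold_s
```

The controller 28 would run one such detector per selectable item on the operator interface bar, feeding it the sensor's X and Y coordinates each frame.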
  • Still referring to FIG. 9, the user can select to open a menu 62 by “holding” the arrow 60 on the “M” of the operator interface bar 58 for a short period of time (e.g., 1.5 seconds). Said another way, the user opens the menu 62 by moving his or her head such that the arrow 60 moves on the “M” for the short period of time.
  • In an alternative embodiment, the controller 28 may recognize the command based on a predetermined change in position, or motion, of the viewing direction of the user. For example, the predetermined change in position could be achieved by the user quickly nodding his or her head, thus quickly changing the viewing direction received by the directional sensor 38.
  • The menu 62, as best seen in FIGS. 10 and 11, is preferably a hierarchical-type menu having a plurality of sub-menus. FIG. 10 shows a video options menu 62. In the preferred embodiment, the video options menu 62 includes a selectable display of different courses (or chapters), a selector to change the brightness of the image viewing device 24, a selector to remap the position of (i.e., center) the directional sensor 38, and a variable selector to synchronize the speed sensor 40 to the number of video images displayed. Selection again occurs by the user simply moving the arrow 60 to a desired selection (by moving his or her head) and “holding” the arrow 60 there for a short period of time. FIG. 11 shows an audio options menu 62. The audio options menu 62 allows the user to select the audio file or files to be played via the speaker 56. The volume of the audio may also be adjusted via the audio options menu 62. Of course, those skilled in the art will realize other menus and sub-menus that could be implemented to allow the user further control over the VR system 20.
  • The technique of recognizing and receiving commands from the user as described above is ideal for the VR system 20 since the user does not need to use his or her hands to send a command. Instead, the VR system 20 of the subject invention provides a completely “hands-free” user interface. The user need not remove the display glasses 24 or otherwise interrupt their exercise routine in order to change video images, audio selection, or other features of the VR system 20.
  • Referring to FIGS. 1 and 2, the VR system preferably includes an inclination feedback device 68. The inclination feedback device 68 is in communication with the storage device 26 and receives inclination data from the storage device 26. The inclination feedback device 68 is also operatively connected to the forward and rearward moving device 34 for changing the operation of the moving device 34 based on the inclination data. Said another way, the inclination feedback device 68 changes the “feel” of the moving device 34 based on the inclination data, providing a more realistic VR experience.
  • For example, in an embodiment where the moving device 34 is implemented as part of a stationary bicycle, the inclination feedback device 68 may change the tension on the pedals that is felt by the user. When images depict going uphill, the tension increases; when images depict going downhill, the tension decreases. In another embodiment, where the moving device 34 is a treadmill, the inclination feedback device 68 may change the inclination of the treadmill's moving belt. As such, the inclination of the belt may be adjusted to match the inclination of the current image as stored in the inclination data.
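A simple way to realize the pedal-tension example is a linear map from the stored grade to a resistance value, clamped to the hardware's range. All constants and names below are illustrative assumptions; the patent does not specify the mapping.

```python
def pedal_resistance(base_n, grade_percent, gain_n_per_pct=2.0,
                     min_n=5.0, max_n=60.0):
    """Map the stored inclination (grade, %) for the current image to a
    pedal tension in newtons. Uphill (positive grade) raises tension,
    downhill lowers it; the result is clamped to the hardware range."""
    target = base_n + gain_n_per_pct * grade_percent
    return max(min_n, min(max_n, target))
```

The inclination feedback device 68 would look up the grade stored for the current image and apply the resulting tension as the images advance.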
  • Referring now to FIG. 1, the image playback system 22 is preferably in communication with a network 64. The network 64 may be a local area network (LAN), a wide area network (WAN), the Internet, or other network known to those skilled in the art. A server 66 is also in communication with the network 64, such that the image playback system 22 and the server 66 are also in communication with one another.
  • The server 66 includes various video images and files, audio files, and/or associated data. These images, files, and/or data may be transmitted from the server to the playback system 22 for use by the user. Therefore, the playback system 22 need not require the user to change DVDs or CDs in order to travel a different course or listen to a different selection of music.
  • Preferably, the server 66 is in communication with a plurality of playback systems 22. In addition to providing files and data to the playback systems 22, the server 66 facilitates competitions, i.e., races, between the users of the various exercise apparatus 34. As such, the server 66 receives speed sensor 40 data from each exercise apparatus 34. The server 66 can then determine a winner based on the distance traveled for a certain period of time. Alternatively, other criteria may be used to determine the winner, such as, but not limited to, the highest average speed. Furthermore, the server 66 may provide feedback to each user based on the performance of the other users. For example, each image viewing device 24 may show a representation of a race course corresponding to the images shown to the user with a position of each user on the race course. Further, a lap time of each other user may be shown. Further, each image viewing device 24 may display a representation of each user superimposed on the plurality of images. This allows the user to see when he or she is passing another user or is being passed by another user.
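The server's default winner-determination rule — greatest distance traveled within the fixed race window — can be sketched as follows; the function names and the dictionary representation of per-user distances are assumptions for illustration.

```python
def race_standings(distances_m):
    """Rank users by distance traveled within the race window, farthest
    first. distances_m maps user_id -> distance in meters, as integrated
    from each exercise apparatus's speed-sensor data."""
    return sorted(distances_m, key=distances_m.get, reverse=True)


def race_winner(distances_m):
    """The winner is the first entry in the standings."""
    return race_standings(distances_m)[0]
```

The full standings list is also what the server would broadcast back to each playback system to draw the other users' positions on the race-course overlay.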
  • The VR system 20 may also present various other content to the user, besides the plurality of images described above. This other content includes, but is not limited to, Internet data such as webpages and email, television programs, radio stations, and DVD content such as movies.
  • The present invention has been described herein in an illustrative manner, and it is to be understood that the terminology which has been used is intended to be in the nature of words of description rather than of limitation. Obviously, many modifications and variations of the invention are possible in light of the above teachings. The invention may be practiced otherwise than as specifically described within the scope of the appended claims.

Claims (20)

1. A virtual reality (VR) system comprising:
a playback system having a storage device for maintaining a plurality of images where each image has a field-of-view defining an X direction and a Y direction;
an image viewing device in communication with said playback system for displaying a portion of said plurality of images to a user;
a directional sensor in communication with said playback system for sensing a viewing direction of the user to define both an X direction coordinate and a Y direction coordinate;
said playback system having a controller in communication with said storage device for coordinating said portion of said plurality of images displayed by said image viewing device with said X and Y direction coordinates of said directional sensor; and
said controller recognizing a command from the user based on positioning of said X and Y direction coordinates of said directional sensor.
2. A VR system as set forth in claim 1 wherein said controller recognizing a command from the user based on positioning of said directional sensor is further defined as said controller recognizing a command from the user when said viewing direction of the user corresponds to a predetermined area of X and Y directional coordinates for a predetermined period of time.
3. A VR system as set forth in claim 1 wherein said image viewing device displays a menu image in response to said controller recognizing a menu command from the user.
4. A VR system as set forth in claim 1 wherein said storage device also maintains at least one audio file.
5. A VR system as set forth in claim 4 further comprising at least one speaker in communication with said playback system for receiving an audio signal from said storage device and converting said audio signal into sound waves corresponding to said at least one audio file.
6. A VR system as set forth in claim 1 further comprising:
a forward and rearward moving device for defining a Z direction; and
a speed sensor operatively connected to said moving device and in communication with said playback system for providing a rate of change of said plurality of images in said Z direction.
7. A VR system as set forth in claim 6 wherein said controller simultaneously coordinates said X and Y directions of said directional sensor and said Z direction of said speed sensor such that said viewing direction and said rate of change are interlaced to automatically change said portion of said plurality of images displayed by said image viewing device in said X, Y, and Z directions when the user moves the directional sensor in at least one of the X and Y directions and simultaneously moves the moving device in the Z direction.
8. A VR system as set forth in claim 1 wherein said controller automatically changes said portion of said images displayed by said image viewing device throughout said field-of-view.
9. A VR system as set forth in claim 1 wherein said image viewing device is further defined as display glasses adapted to be worn by the user.
10. A VR system as set forth in claim 9 wherein said directional sensor is attached to said display glasses such that said portion of said plurality of images automatically changes as said glasses move in said X and Y directions.
11. A VR system as set forth in claim 6 wherein said forward and rearward moving device is further defined as an exercise apparatus for allowing a user to exercise.
12. A VR system as set forth in claim 1 wherein said image playback system further includes a frame buffer operatively communicating with said image viewing device for displaying said portion of said plurality of images to the user at a constant frame rate.
13. A VR system as set forth in claim 6 wherein said storage device includes inclination data corresponding to said images.
14. A VR system as set forth in claim 13 further comprising an inclination feedback device operatively connected to said forward and rearward moving device and in communication with said storage device for changing the operation of said forward and rearward moving device based upon said inclination data.
15. A method of operating a virtual reality (VR) system comprising:
maintaining a plurality of images where each image has a 360-degree field-of-view defining an X direction and a Y direction;
determining an X direction coordinate and a Y direction coordinate corresponding to a viewing direction of a user;
displaying a portion of the plurality of images to the user based on the viewing direction;
recognizing a command from the user based on positioning of the X and Y direction coordinates; and
changing an operating aspect of the VR system based on the command.
16. A method as set forth in claim 15 further comprising the step of sensing a rate of change of the plurality of images moving in a Z direction.
17. A method as set forth in claim 16 further comprising the step of simultaneously coordinating the X and Y directions and the Z direction and interlacing the viewing direction and the rate of change for automatically changing the plurality of images in the X, Y, and Z directions as the user changes the viewing direction in at least one of the X and Y directions and simultaneously moves in the Z direction.
18. A method as set forth in claim 15 wherein the step of determining a viewing direction of a user is further defined as monitoring any movement of the user.
19. A method as set forth in claim 15 wherein the step of displaying a portion of the plurality of images to the user is further defined as displaying a portion of the plurality of images to the user at a constant frame rate.
20. A virtual reality (VR) system comprising:
a playback system having a storage device for maintaining a plurality of images where each image has a field-of-view defining an X direction and a Y direction and inclination data corresponding to said images;
an image viewing device in communication with said playback system for displaying a portion of said plurality of images to a user;
a directional sensor in communication with said playback system for defining a viewing direction of the user as both an X direction coordinate and a Y direction coordinate;
said playback system having a controller in communication with said storage device for coordinating said portion of said plurality of images displayed by said image viewing device with said X and Y direction coordinates of said directional sensor;
a forward and rearward moving device for defining a Z direction;
a speed sensor operatively connected to said moving device and operatively communicating with said playback system for providing a rate of change of said plurality of images in said Z direction;
said playback system having a controller in communication with said storage device for simultaneously coordinating said X and Y directions of said directional sensor and said Z direction of said speed sensor; and
an inclination feedback device operatively connected to said moving device and in communication with said storage device for changing the operation of said forward and rearward moving device based upon said inclination data.
US11/753,910 2004-03-03 2007-05-25 Virtual reality system Abandoned US20070229397A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/753,910 US20070229397A1 (en) 2004-03-03 2007-05-25 Virtual reality system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/792,593 US7224326B2 (en) 2004-03-03 2004-03-03 Virtual reality system
US11/753,910 US20070229397A1 (en) 2004-03-03 2007-05-25 Virtual reality system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/792,593 Continuation-In-Part US7224326B2 (en) 2004-03-03 2004-03-03 Virtual reality system

Publications (1)

Publication Number Publication Date
US20070229397A1 true US20070229397A1 (en) 2007-10-04

Family

ID=34911888

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/792,593 Expired - Fee Related US7224326B2 (en) 2004-03-03 2004-03-03 Virtual reality system
US11/753,910 Abandoned US20070229397A1 (en) 2004-03-03 2007-05-25 Virtual reality system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/792,593 Expired - Fee Related US7224326B2 (en) 2004-03-03 2004-03-03 Virtual reality system

Country Status (2)

Country Link
US (2) US7224326B2 (en)
WO (1) WO2005091798A2 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130007668A1 (en) * 2011-07-01 2013-01-03 James Chia-Ming Liu Multi-visor: managing applications in head mounted displays
US9180053B2 (en) 2013-01-29 2015-11-10 Xerox Corporation Central vision impairment compensation
WO2016086494A1 (en) * 2014-12-05 2016-06-09 钱晓炯 Video presentation method for smart mobile terminal
CN106880945A * (en) * 2017-01-24 2017-06-23 扬州大学 Indoor cycling fitness system based on virtual reality glasses
US9812096B2 (en) 2008-01-23 2017-11-07 Spy Eye, Llc Eye mounted displays and systems using eye mounted displays
WO2018000784A1 (en) * 2016-07-01 2018-01-04 歌尔科技有限公司 Bilateral wireless earphone system of vr device and bilateral wireless earphone of vr device
US9993335B2 (en) 2014-01-08 2018-06-12 Spy Eye, Llc Variable resolution eye mounted displays
US20180356884A1 (en) * 2016-01-22 2018-12-13 Samsung Electronics Co., Ltd. Hmd device and control method therefor
CN109271016A (en) * 2017-07-18 2019-01-25 中兴通讯股份有限公司 Virtual reality data processing method, apparatus, system, and storage medium
CN109891850A (en) * 2016-09-09 2019-06-14 Vid拓展公司 Method and apparatus for reducing latency of 360-degree viewport adaptive streaming
US10444827B2 (en) 2017-09-18 2019-10-15 Fujitsu Limited Platform for virtual reality movement
US10810800B2 (en) * 2017-11-10 2020-10-20 Korea Electronics Technology Institute Apparatus and method for providing virtual reality content of moving means
US11106037B2 (en) 2017-02-27 2021-08-31 Snap Inc. Processing a media content based on device movement
US20220161120A1 (en) * 2020-11-25 2022-05-26 Saga Holographic, Inc. Exercise apparatus with integrated holographic display
US11616942B2 (en) 2018-03-22 2023-03-28 Interdigital Madison Patent Holdings, Sas Viewport dependent video streaming events
WO2023118978A1 (en) * 2021-12-22 2023-06-29 Davit ARSENYAN Intelligent gait simulator
US11917127B2 (en) 2018-05-25 2024-02-27 Interdigital Madison Patent Holdings, Sas Monitoring of video streaming events

Families Citing this family (65)

Publication number Priority date Publication date Assignee Title
US7874957B2 (en) * 2006-07-06 2011-01-25 Artis, Llc Apparatus for measuring exercise performance
DE102007002614B4 (en) * 2007-01-12 2009-04-02 Jerichow, Ulrich, Dr. Device for monitoring, controlling and / or regulating a gas composition
US11265082B2 (en) 2007-05-24 2022-03-01 Federal Law Enforcement Development Services, Inc. LED light control assembly and system
US9100124B2 (en) 2007-05-24 2015-08-04 Federal Law Enforcement Development Services, Inc. LED Light Fixture
US9455783B2 (en) 2013-05-06 2016-09-27 Federal Law Enforcement Development Services, Inc. Network security and variable pulse wave form with continuous communication
US8643722B2 (en) * 2008-10-08 2014-02-04 Cerevellum Design, Llc Rear-view display system for a bicycle
IL194701A (en) * 2008-10-12 2012-08-30 Rafael Advanced Defense Sys Method and system for displaying a panoramic view to an operator
US8890773B1 (en) 2009-04-01 2014-11-18 Federal Law Enforcement Development Services, Inc. Visible light transceiver glasses
FR2944615B1 (en) * 2009-04-21 2013-11-22 Eric Belmon MAT ADAPTED FOR MOVEMENT IN A VIRTUAL REALITY
US20100302233A1 (en) * 2009-05-26 2010-12-02 Holland David Ames Virtual Diving System and Method
US8121618B2 (en) 2009-10-28 2012-02-21 Digimarc Corporation Intuitive computing methods and systems
US8175617B2 (en) 2009-10-28 2012-05-08 Digimarc Corporation Sensor-based mobile search, related methods and systems
WO2012048252A1 (en) 2010-10-07 2012-04-12 Aria Glassworks, Inc. System and method for transitioning between interface modes in virtual and augmented reality applications
WO2012071463A2 (en) * 2010-11-24 2012-05-31 Aria Glassworks, Inc. System and method for presenting virtual and augmented reality scenes to a user
US9017163B2 (en) 2010-11-24 2015-04-28 Aria Glassworks, Inc. System and method for acquiring virtual and augmented reality scenes by a user
US9041743B2 (en) 2010-11-24 2015-05-26 Aria Glassworks, Inc. System and method for presenting virtual and augmented reality scenes to a user
US8953022B2 (en) 2011-01-10 2015-02-10 Aria Glassworks, Inc. System and method for sharing virtual and augmented reality scenes between users and viewers
TW201231132A (en) * 2011-01-24 2012-08-01 Hon Hai Prec Ind Co Ltd Simulation system and method for body building equipment
US9118970B2 (en) * 2011-03-02 2015-08-25 Aria Glassworks, Inc. System and method for embedding and viewing media files within a virtual and augmented reality scene
US9339691B2 (en) 2012-01-05 2016-05-17 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US9277367B2 (en) * 2012-02-28 2016-03-01 Blackberry Limited Method and device for providing augmented reality output
US8620021B2 (en) 2012-03-29 2013-12-31 Digimarc Corporation Image-related methods and arrangements
US9626799B2 (en) 2012-10-02 2017-04-18 Aria Glassworks, Inc. System and method for dynamically displaying multiple virtual and augmented reality scenes on a single display
US10769852B2 (en) 2013-03-14 2020-09-08 Aria Glassworks, Inc. Method for simulating natural perception in virtual and augmented reality scenes
EP2969058B1 (en) 2013-03-14 2020-05-13 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
GB201310367D0 (en) 2013-06-11 2013-07-24 Sony Comp Entertainment Europe Head-mountable apparatus and systems
US9594983B2 (en) 2013-08-02 2017-03-14 Digimarc Corporation Learning systems and methods
EP3974036A1 (en) 2013-12-26 2022-03-30 iFIT Inc. Magnetic resistance mechanism in a cable machine
US20150198941A1 (en) 2014-01-15 2015-07-16 John C. Pederson Cyber Life Electronic Networking and Commerce Operating Exchange
US9832353B2 (en) 2014-01-31 2017-11-28 Digimarc Corporation Methods for encoding, decoding and interpreting auxiliary data in media signals
US10067341B1 (en) 2014-02-04 2018-09-04 Intelligent Technologies International, Inc. Enhanced heads-up display system
US9311639B2 (en) 2014-02-11 2016-04-12 Digimarc Corporation Methods, apparatus and arrangements for device to device communication
US10977864B2 (en) 2014-02-21 2021-04-13 Dropbox, Inc. Techniques for capturing and displaying partial motion in virtual or augmented reality scenes
US9699437B2 (en) * 2014-03-03 2017-07-04 Nextvr Inc. Methods and apparatus for streaming content
WO2015138339A1 (en) 2014-03-10 2015-09-17 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US10426989B2 (en) 2014-06-09 2019-10-01 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
WO2015195965A1 (en) 2014-06-20 2015-12-23 Icon Health & Fitness, Inc. Post workout massage device
GB2532465B (en) 2014-11-19 2021-08-11 Bae Systems Plc Interactive control station
GB2532464B (en) 2014-11-19 2020-09-02 Bae Systems Plc Apparatus and method for selectively displaying an operational environment
KR102178298B1 (en) 2014-11-21 2020-11-12 삼성전자주식회사 Method for controlling display and apparatus supplying the same
US20160179206A1 (en) * 2014-12-22 2016-06-23 Tim LaForest Wearable interactive display system
US10108832B2 (en) * 2014-12-30 2018-10-23 Hand Held Products, Inc. Augmented reality vision barcode scanning system and method
EP3262488B1 (en) 2015-02-25 2021-04-07 BAE Systems PLC Apparatus and method for effecting a control action in respect of system functions
GB2535729A (en) * 2015-02-25 2016-08-31 Bae Systems Plc Immersive vehicle simulator apparatus and method
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
AT516901B1 (en) * 2015-03-06 2018-07-15 Amst Systemtechnik Gmbh Flight simulator and flight simulation method
US20170046950A1 (en) 2015-08-11 2017-02-16 Federal Law Enforcement Development Services, Inc. Function disabler device and system
US10303988B1 (en) 2015-08-14 2019-05-28 Digimarc Corporation Visual search methods and systems
CN105334692B (en) * 2015-09-28 2018-06-29 联想(北京)有限公司 Information processing method and electronic equipment
CN105963914B * (en) * 2016-03-03 2017-12-01 河南汇亚电子科技有限公司 VR spinning bicycle
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
WO2017200279A1 (en) 2016-05-17 2017-11-23 Samsung Electronics Co., Ltd. Method and apparatus for facilitating interaction with virtual reality equipment
US10596446B2 (en) * 2016-06-24 2020-03-24 Hashplay Inc. System and method of fitness training in virtual environment
TWI661851B (en) * 2016-07-26 2019-06-11 光旴科技股份有限公司 Head-mounted target synchronous fitness system
CN106371601A (en) * 2016-09-12 2017-02-01 苏州金螳螂文化发展股份有限公司 Multi-person riding virtual reality system
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
KR102598082B1 (en) * 2016-10-28 2023-11-03 삼성전자주식회사 Image display apparatus, mobile device and operating method for the same
IT201600124198A1 (en) * 2016-12-07 2018-06-07 Michele Presicci "INTERACTIVE SYSTEM FOR APPLICATION TO GYM EQUIPMENT, BASED ON AUGMENTED REALITY"
CN106730613A * (en) * 2016-12-07 2017-05-31 浙江理工大学 Real-time synchronized heart rate and speed acquisition device for a spinning bicycle
JP6953865B2 (en) * 2017-07-28 2021-10-27 富士フイルムビジネスイノベーション株式会社 Information processing system
KR101865520B1 (en) 2017-11-30 2018-06-07 송병우 Virtual Augmented Reality Bicycles
US10712810B2 (en) * 2017-12-08 2020-07-14 Telefonaktiebolaget Lm Ericsson (Publ) System and method for interactive 360 video playback based on user location
WO2019165501A1 (en) * 2018-02-28 2019-09-06 Visospace Holdings Pty Ltd Virtual locomotion device

Citations (27)

Publication number Priority date Publication date Assignee Title
US4566763A (en) * 1983-02-08 1986-01-28 Budapesti Muszaki Egyetem Panoramic imaging block for three-dimensional space
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5466200A (en) * 1993-02-02 1995-11-14 Cybergear, Inc. Interactive exercise apparatus
US5499146A (en) * 1994-05-24 1996-03-12 Texas Instruments Incorporated Method and apparatus for recording images for a virtual reality system
US5562572A (en) * 1995-03-10 1996-10-08 Carmein; David E. E. Omni-directional treadmill
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5745305A (en) * 1995-04-28 1998-04-28 Lucent Technologies Inc. Panoramic viewing apparatus
US5785630A (en) * 1993-02-02 1998-07-28 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
US5890995A (en) * 1993-02-02 1999-04-06 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
US5960108A (en) * 1997-06-12 1999-09-28 Apple Computer, Inc. Method and system for creating an image-based virtual reality environment utilizing a fisheye lens
US6050822A (en) * 1997-10-01 2000-04-18 The United States Of America As Represented By The Secretary Of The Army Electromagnetic locomotion platform for translation and total immersion of humans into virtual environments
US20010001303A1 (en) * 1996-11-25 2001-05-17 Mieko Ohsuga Physical exercise system having a virtual reality environment controlled by a users movement
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US20020055422A1 (en) * 1995-05-18 2002-05-09 Matthew Airmet Stationary exercise apparatus adaptable for use with video games and including springed tilting features
US6429867B1 (en) * 1999-03-15 2002-08-06 Sun Microsystems, Inc. System and method for generating and playback of three-dimensional movies
US20020158815A1 (en) * 1995-11-28 2002-10-31 Zwern Arthur L. Multi axis motion and position controller for portable electronic displays
US6486908B1 (en) * 1998-05-27 2002-11-26 Industrial Technology Research Institute Image-based method and system for building spherical panoramas
US20030076280A1 (en) * 2000-03-07 2003-04-24 Turner James A Vehicle simulator having head-up display
US20040061663A1 (en) * 2002-09-27 2004-04-01 Cybereyes, Inc. Virtual reality display apparatus and associated display mounting system
US20040110602A1 (en) * 2002-12-04 2004-06-10 Feldman Philip G. Computer interactive isometric exercise system and method for operatively interconnecting the exercise system to a computer system for use as a peripheral
US20040214690A1 (en) * 2001-07-23 2004-10-28 Southwest Research Institute Virtual reality system locomotion interface utilizing a pressure-sensing mat attached to movable base structure
US20040260191A1 (en) * 1999-11-09 2004-12-23 Stubbs Jack B. Exercise monitoring system and methods
US20050148432A1 (en) * 2003-11-03 2005-07-07 Carmein David E.E. Combined omni-directional treadmill and electronic perception technology
US20050156817A1 (en) * 2002-08-30 2005-07-21 Olympus Corporation Head-mounted display system and method for processing images
US20050233861A1 (en) * 2001-10-19 2005-10-20 Hickman Paul L Mobile systems and methods for heath, exercise and competition
US20060057549A1 (en) * 2004-09-10 2006-03-16 United States of America as represented by the Administrator of the National Aeronautics and Method and apparatus for performance optimization through physical perturbation of task elements
US7043695B2 (en) * 2000-09-19 2006-05-09 Technion Research & Development Foundation Ltd. Object positioning and display in virtual environments

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP3054146B1 (en) 1999-01-06 2000-06-19 宜昇科技股份有限公司 Panoramic image device
CA2363775C (en) 2001-11-26 2010-09-14 Vr Interactive International, Inc. A symmetric, high vertical field of view 360 degree reflector using cubic transformations and method

Patent Citations (30)

Publication number Priority date Publication date Assignee Title
US4566763A (en) * 1983-02-08 1986-01-28 Budapesti Muszaki Egyetem Panoramic imaging block for three-dimensional space
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5466200A (en) * 1993-02-02 1995-11-14 Cybergear, Inc. Interactive exercise apparatus
US5785630A (en) * 1993-02-02 1998-07-28 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
US5890995A (en) * 1993-02-02 1999-04-06 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5499146A (en) * 1994-05-24 1996-03-12 Texas Instruments Incorporated Method and apparatus for recording images for a virtual reality system
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US5562572A (en) * 1995-03-10 1996-10-08 Carmein; David E. E. Omni-directional treadmill
US5745305A (en) * 1995-04-28 1998-04-28 Lucent Technologies Inc. Panoramic viewing apparatus
US20020055422A1 (en) * 1995-05-18 2002-05-09 Matthew Airmet Stationary exercise apparatus adaptable for use with video games and including springed tilting features
US20020158815A1 (en) * 1995-11-28 2002-10-31 Zwern Arthur L. Multi axis motion and position controller for portable electronic displays
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US20010001303A1 (en) * 1996-11-25 2001-05-17 Mieko Ohsuga Physical exercise system having a virtual reality environment controlled by a users movement
US5960108A (en) * 1997-06-12 1999-09-28 Apple Computer, Inc. Method and system for creating an image-based virtual reality environment utilizing a fisheye lens
US6050822A (en) * 1997-10-01 2000-04-18 The United States Of America As Represented By The Secretary Of The Army Electromagnetic locomotion platform for translation and total immersion of humans into virtual environments
US20030063089A1 (en) * 1998-05-27 2003-04-03 Ju-Wei Chen Image-based method and system for building spherical panoramas
US6486908B1 (en) * 1998-05-27 2002-11-26 Industrial Technology Research Institute Image-based method and system for building spherical panoramas
US20030063816A1 (en) * 1998-05-27 2003-04-03 Industrial Technology Research Institute, A Taiwanese Corporation Image-based method and system for building spherical panoramas
US6429867B1 (en) * 1999-03-15 2002-08-06 Sun Microsystems, Inc. System and method for generating and playback of three-dimensional movies
US20040260191A1 (en) * 1999-11-09 2004-12-23 Stubbs Jack B. Exercise monitoring system and methods
US20030076280A1 (en) * 2000-03-07 2003-04-24 Turner James A Vehicle simulator having head-up display
US7043695B2 (en) * 2000-09-19 2006-05-09 Technion Research & Development Foundation Ltd. Object positioning and display in virtual environments
US20040214690A1 (en) * 2001-07-23 2004-10-28 Southwest Research Institute Virtual reality system locomotion interface utilizing a pressure-sensing mat attached to movable base structure
US20050233861A1 (en) * 2001-10-19 2005-10-20 Hickman Paul L Mobile systems and methods for heath, exercise and competition
US20050156817A1 (en) * 2002-08-30 2005-07-21 Olympus Corporation Head-mounted display system and method for processing images
US20040061663A1 (en) * 2002-09-27 2004-04-01 Cybereyes, Inc. Virtual reality display apparatus and associated display mounting system
US20040110602A1 (en) * 2002-12-04 2004-06-10 Feldman Philip G. Computer interactive isometric exercise system and method for operatively interconnecting the exercise system to a computer system for use as a peripheral
US20050148432A1 (en) * 2003-11-03 2005-07-07 Carmein David E.E. Combined omni-directional treadmill and electronic perception technology
US20060057549A1 (en) * 2004-09-10 2006-03-16 United States of America as represented by the Administrator of the National Aeronautics and Method and apparatus for performance optimization through physical perturbation of task elements

Cited By (32)

Publication number Priority date Publication date Assignee Title
US9899005B2 (en) 2008-01-23 2018-02-20 Spy Eye, Llc Eye mounted displays and systems, with data transmission
US10089966B2 (en) 2008-01-23 2018-10-02 Spy Eye, Llc Eye mounted displays and systems
US9899006B2 (en) 2008-01-23 2018-02-20 Spy Eye, Llc Eye mounted displays and systems, with scaler using pseudo cone pixels
US10467992B2 (en) 2008-01-23 2019-11-05 Tectus Corporation Eye mounted intraocular displays and systems
US11393435B2 (en) 2008-01-23 2022-07-19 Tectus Corporation Eye mounted displays and eye tracking systems
US9812096B2 (en) 2008-01-23 2017-11-07 Spy Eye, Llc Eye mounted displays and systems using eye mounted displays
US9824668B2 (en) 2008-01-23 2017-11-21 Spy Eye, Llc Eye mounted displays and systems, with headpiece
US9837052B2 (en) 2008-01-23 2017-12-05 Spy Eye, Llc Eye mounted displays and systems, with variable resolution
US9858901B2 (en) 2008-01-23 2018-01-02 Spy Eye, Llc Eye mounted displays and systems, with eye tracker and head tracker
US9858900B2 (en) 2008-01-23 2018-01-02 Spy Eye, Llc Eye mounted displays and systems, with scaler
US20130007668A1 (en) * 2011-07-01 2013-01-03 James Chia-Ming Liu Multi-visor: managing applications in head mounted displays
US9727132B2 (en) * 2011-07-01 2017-08-08 Microsoft Technology Licensing, Llc Multi-visor: managing applications in augmented reality environments
US9180053B2 (en) 2013-01-29 2015-11-10 Xerox Corporation Central vision impairment compensation
US11284993B2 (en) 2014-01-08 2022-03-29 Tectus Corporation Variable resolution eye mounted displays
US9993335B2 (en) 2014-01-08 2018-06-12 Spy Eye, Llc Variable resolution eye mounted displays
WO2016086494A1 (en) * 2014-12-05 2016-06-09 钱晓炯 Video presentation method for smart mobile terminal
US20180356884A1 (en) * 2016-01-22 2018-12-13 Samsung Electronics Co., Ltd. Hmd device and control method therefor
US10845869B2 (en) * 2016-01-22 2020-11-24 Samsung Electronics Co., Ltd. HMD device and control method therefor
WO2018000784A1 (en) * 2016-07-01 2018-01-04 歌尔科技有限公司 Bilateral wireless earphone system of vr device and bilateral wireless earphone of vr device
CN109891850A (en) * 2016-09-09 2019-06-14 Vid拓展公司 Method and apparatus for reducing latency of 360-degree viewport adaptive streaming
US11677802B2 (en) 2016-09-09 2023-06-13 Vid Scale, Inc. Methods and apparatus to reduce latency for 360-degree viewport adaptive streaming
CN106880945A * (en) 2017-01-24 2017-06-23 扬州大学 Indoor cycling fitness system based on virtual reality glasses
US11668937B2 (en) 2017-02-27 2023-06-06 Snap Inc. Processing a media content based on device movement
US11106037B2 (en) 2017-02-27 2021-08-31 Snap Inc. Processing a media content based on device movement
CN109271016A (en) * 2017-07-18 2019-01-25 中兴通讯股份有限公司 Virtual reality data processing method, apparatus, system, and storage medium
US10444827B2 (en) 2017-09-18 2019-10-15 Fujitsu Limited Platform for virtual reality movement
US10810800B2 (en) * 2017-11-10 2020-10-20 Korea Electronics Technology Institute Apparatus and method for providing virtual reality content of moving means
US11616942B2 (en) 2018-03-22 2023-03-28 Interdigital Madison Patent Holdings, Sas Viewport dependent video streaming events
US11917127B2 (en) 2018-05-25 2024-02-27 Interdigital Madison Patent Holdings, Sas Monitoring of video streaming events
US20220161120A1 (en) * 2020-11-25 2022-05-26 Saga Holographic, Inc. Exercise apparatus with integrated holographic display
US11931641B2 (en) * 2020-11-25 2024-03-19 Saga Holographic, Inc. Exercise apparatus with integrated holographic display
WO2023118978A1 (en) * 2021-12-22 2023-06-29 Davit ARSENYAN Intelligent gait simulator

Also Published As

Publication number Publication date
WO2005091798A3 (en) 2006-10-19
US20050195128A1 (en) 2005-09-08
US7224326B2 (en) 2007-05-29
WO2005091798A2 (en) 2005-10-06

Similar Documents

Publication Publication Date Title
US20070229397A1 (en) Virtual reality system
US11240624B2 (en) Information processing apparatus, information processing method, and program
US9918118B2 (en) Apparatus and method for playback of audio-visual recordings
JP7277451B2 (en) racing simulation
JP6585358B2 (en) System and method for converting sensory data into haptic effects
EP3432590A1 (en) Display device and information processing terminal device
US8717254B1 (en) Portable motion sensor and video glasses system for displaying a real time video display to a user while exercising
US20010045978A1 (en) Portable personal wireless interactive video device and method of using the same
KR102161646B1 (en) System and method for interworking virtual reality and indoor exercise machine
US20100026809A1 (en) Camera-based tracking and position determination for sporting events
CN105915849A (en) Virtual reality sports event play method and system
US8947253B2 (en) Immersive vehicle multimedia system
JP2006502453A (en) Multimedia race experience system
JP6714625B2 (en) Computer system
KR101878201B1 (en) Real time interactive health care apparatus based on display and the method thereof
KR101961934B1 (en) VR bicycle system
WO2018101279A1 (en) Information processing device, information processing method, and computer program
WO2015033446A1 (en) Running assistance system and head mount display device used in same
KR100761465B1 (en) Treadmill system having wearable display device
KR20180050442A (en) Method and apparatus for providing realistic service for riding
EP3389030A1 (en) Device for presenting onomatopoeia regarding evaluation result for user action
JP2001029506A (en) Indoor exercise device
KR20170102677A (en) Apparatus for Display related to Speed
Nagaosa et al. Basic Research on Construction of a VR Riding Simulator for Motorcycle Safety
JP2022118692A (en) Exercise system, method for providing exercise, and computer program

Legal Events

Date Code Title Description
AS Assignment

Owner name: VOLO, LLC, MICHIGAN

Free format text: CORRECTING RECORDATION FORM COVER SHEET WITH CORRECT SERIAL NUMBER R/F 019345/0359;ASSIGNOR:SEFTON, ROBERT T.;REEL/FRAME:019457/0880

Effective date: 20070523

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION