CA2371349A1 - Panoramic movies which simulate movement through multidimensional space - Google Patents

Panoramic movies which simulate movement through multidimensional space

Info

Publication number
CA2371349A1
Authority
CA
Canada
Prior art keywords
frame
images
panorama
key
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002371349A
Other languages
French (fr)
Inventor
Scott Gilbert
David Kaiman
Michael C. Park
G. David Ripley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Imove Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of CA2371349A1 publication Critical patent/CA2371349A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • H04N9/8047Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction using transform coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/806Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
    • H04N9/8063Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal

Abstract

Movement through multi-dimensional space is simulated using a series of panoramic images which are projected or displayed in sequence (22). The user's direction of view, that is, the selected view window, is maintained as the series of images is projected or displayed. Motion in directions other than forward or reverse is simulated by utilizing "branch" points in the sequence. Each path from a branch point simulates motion in a different direction. Branch points are generally indicated to a viewer by visual indicators called "hot spots"; however, branch points may also be hidden and activated in response to the viewer's selected direction of view. If a branch point is indicated by a visual indicator, a user can select motion in a desired direction by clicking on a "hot spot". In order to conserve storage space, the image representing each panorama can be stored in a compressed format (10). Only the portion of the panorama necessary to create a "view window", that is, the portion of the image displayed in response to the user's direction of view, is decompressed at view time (21).

Description

Field of the Invention:
The present invention relates to photography, digital image processing and to computer graphics. More particularly, the present invention relates to a method and system for providing a viewer with a multidimensional view which simulates movement through space or time.

Copyright Notice:
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document; however, otherwise the copyright owner reserves all rights whatsoever.
Background of the invention:
Motion is usually simulated by means of single view movies. Single view movies consist of a series of single photographs which are sequentially projected on a screen. At any time, a single photograph is entirely projected on the screen. Some movie theaters have screens which partially surround the viewer, so that a viewer can turn and look at a different portion of the screen as the movie progresses. However, irrespective of where the viewer's attention is focused, in fact the entire image is being projected on the screen. With some equipment single view movies can be stopped and reversed; however, again at a particular time a selected frame is entirely displayed. In summary, traditional movies do not allow a viewer to control the portion of the image which is projected on the screen. Stated differently, with traditional movies a viewer cannot control the "view window" through which each image is viewed.

It should be noted that as used herein the term "screen" refers to either a traditional screen onto which an image is projected or an electronic display which projects or displays an image in such a manner that the image can be seen by a viewer.

The technology for producing panoramic images and photographs is well known. Panoramic images are images which represent the visual surroundings from a single location (or view point) of a particular 3D environment. Panoramic images can be photographic, computer-generated (CG), or a composite of photo and CG imagery. Equipment is available which seams together a series of two dimensional conventional images to form a panoramic image. Panoramic images may cover any field of view, such as a full sphere, full cylinder, semi-sphere, etc.; however, full sphere views are generally preferred. Panoramic images may be in any of the known projection formats, such as equirectangular, Mercator, Peters, fisheye, cube, or hemicube. If the field of view is wide enough to warrant, perspective correction may be applied to the portion of a panoramic image displayed in order to remove projection distortions from the given view of the user. Computer programs and systems which allow one to view a selected portion of, to pan, to rotate, etc., a panoramic image or photograph in response to the movement of a computer mouse, joystick, keyboard, etc. are commercially available.
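The perspective correction described above can be sketched compactly. The function below is an illustrative stand-in, not the patent's own code (that is in Appendix A); it assumes the panorama is an equirectangular image held in a NumPy array, and its name and parameters are invented for the example. Each pixel of a pinhole "view window" is mapped back to a sample of the panorama:

```python
import numpy as np

def view_window(pano, yaw, pitch, fov_deg=60.0, out_w=320, out_h=240):
    """Extract a perspective-corrected view window from an equirectangular
    panorama (H x W x 3 array). yaw/pitch are in radians. Nearest-neighbour
    sampling keeps the sketch short; a real viewer would interpolate."""
    H, W = pano.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # pinhole focal length
    xs = np.arange(out_w) - out_w / 2
    ys = np.arange(out_h) - out_h / 2
    x, y = np.meshgrid(xs, ys)
    # Ray direction for each output pixel in camera space, then rotate by
    # pitch (about the x-axis) and yaw (about the y-axis).
    d = np.stack([x, -y, np.full_like(x, f)], axis=-1)
    d = d / np.linalg.norm(d, axis=-1, keepdims=True)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    dx = d[..., 0]
    dy = d[..., 1] * cp - d[..., 2] * sp
    dz = d[..., 1] * sp + d[..., 2] * cp
    dx, dz = dx * cy + dz * sy, -dx * sy + dz * cy
    # Convert each world-space ray to equirectangular (longitude, latitude).
    lon = np.arctan2(dx, dz)             # -pi..pi maps across the width
    lat = np.arcsin(np.clip(dy, -1, 1))  # -pi/2..pi/2 maps down the height
    u = ((lon / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = ((0.5 - lat / np.pi) * H).astype(int).clip(0, H - 1)
    return pano[v, u]
```

Because only the sampled region of the panorama is touched, the same mapping also identifies which portion of a compressed panorama actually needs to be decompressed for a given direction of view.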

Panoramic image (or photographic image) viewing systems are available which provide a number of panoramas (for example, views of adjacent rooms in a building) and which allow a user who is viewing one of the rooms to "click" on a door and thus bring up the panorama of the next room, thereby in some sense simulating movement into the next room. However, each of the panoramic views in such systems is static, and some explicit action on the part of the viewer is required to move on to a different panorama.

U.S. Patents 5,023,925 and 5,703,604 describe, and Dodeca L.L.C. located in Portland, Oregon commercially markets, a system for capturing images using a multi lens camera, for seaming the images into panoramas, and for viewing selected portions of the panoramic images.

Other panoramic image or photographic viewing systems are available which initiate a conventional single view motion picture when a user "clicks" on a spot such as a door in a first panoramic view. The single view movie simulates movement into a different location, and at the end of the movie the user is presented with a second panoramic view.

A company named "Warp" located in the Kingdom of Tonga has demonstrated a system wherein a video sequence is captured using a video camera with a fisheye lens which is pointed in the vertical or "up" direction (see VTV Software Development Kit Reference Manual 2.01 Win95 1996). This approach provides a hemispheric movie in which the user may "pan around" while the movie is playing. In the system demonstrated by Warp, the user views the movie "in sequence", meaning that each frame of the movie is played according to the temporal sequence in which the hemispheric movie was captured. The system demonstrated by Warp was limited to sub-spherical panorama dimensions and the camera was located at a fixed position.

Realistic simulation of movement from one location to another can be provided by three dimensional computer modeling systems, such as those used in some flight simulators. However, such systems are very computationally intensive.

Summary of the present invention:
The present invention simulates movement through multi-dimensional space using a series of panoramic images which are projected or displayed in sequence. The user's direction of view, that is, the selected view window, is maintained as the series of images is projected or displayed. Motion in directions other than forward or reverse is simulated by utilizing "branch" points in the sequence. Each path from a branch point can simulate motion in a different direction. Branch points are generally indicated to a viewer by visual indicators called "hot spots"; however, branch points may also be hidden and activated in response to the viewer's selected direction of view. If a branch point is indicated by a visual indicator, a user can select motion in a desired direction by "clicking" on a "hot spot".

In order to conserve storage space, the image representing each panorama can be stored in a compressed format. If the images are stored in compressed format, in order to conserve time and processing power, when an image is displayed only the portion of the panorama necessary to create a "view window", that is, the portion of the image displayed in response to the user's direction of view, is decompressed. Furthermore, the images are stored in a format that does not utilize inter-image compression (such as that used by the MPEG format). Since the images are stored in a format that does not utilize inter-image compression, it is possible to simulate motion in both the forward and backward direction without operating on a series of decompressed images.

An index methodology is used to store the panoramic images. Use of the indexing methodology allows the images to be retrieved in both the forward and backward direction to simulate movement in either direction.

Sound is provided in a special format, so that special effects can be provided based on the user's point of view and dependent upon the direction of motion selected by the user.

Brief Description of the Figures:
Figure 1 illustrates a key frame (that is, a panoramic image) with a view window and associated sound tracks.

Figure 2 is a block diagram showing the major components in the preferred embodiment.

Figures 3A to 3D show the sequence of operations performed by the various components in the system shown in Figure 2.

Figure 4A illustrates a sequence of frames which constitute a panoramic movie.

Figure 4B illustrates the sound track associated with the frames of a panoramic movie.

Figure 5A is a perspective view of the multi lens hand held unit that captures a series of panoramic images.

Figure 5B is a top view of the multi lens hand held unit shown in Figure 5A.

Figure 6 is a block diagram of the electronic components in the hand held unit shown in Figures 5A and 5B.

Figure 7 is a diagram of a file containing a pan movie. Figure 7 shows a series of panoramas stored as a series of compressed key-frames and a file index for sequencing playback of the key-frames.

Figure 8 is a block diagram of a program for inserting hot spots in a pan movie.

Figure 9A is a block diagram of a system for playback of a 3-D panoramic movie according to the invention.

Figure 9B is a block diagram of a real time viewing unit.

Figure 10 is a flowchart of the program for viewing a 3-D movie containing a sequence of panoramas according to the invention.

Figure 11 is a diagram illustrating the audio information associated with each key frame.

Figures 12A, 12B and 12C are a spatial sequence of perspectively correct views illustrating movement past a billboard displaying an advertisement which has been superimposed into a scene as a hot spot.

Description of Appendices:
Appendix A is a printout of computer code for retrieving images and correcting the perspective of images in a pan movie.
Appendix B is a sample link control file for a pan movie.
Appendix C is a printout of computer pseudocode for linking sequences of images to form a pan movie.
Description of a preferred embodiment:
In order to simulate movement through multi-dimensional space, one must first capture a series of panoramic images; the panoramic images must be stored as frames, and then the appropriate view window from selected frames must be displayed in an appropriate sequence.

A panoramic image provides data concerning what is visible in any direction from a particular point in space. At any particular time a viewer or user can only look in one direction. The direction or point of view of a viewer or user determines the "view window", that is, the part of a panoramic image which is projected on a screen at a particular time. Figure 1 shows a key frame (i.e. a panoramic image) or a panorama 3a. Panorama 3a has a view window 3b which corresponds to a portion of panorama 3a. Panorama 3a also has associated therewith a number of sound tracks 3c. It is noted that for ease and clarity of illustration, no attempt has been made to illustrate in Figure 1 the well known fact that there is a difference in perspective between what is displayed in a view window and what is stored in a flat section of a rectilinear spherical panorama.
Figure 2 is an overall diagram of a preferred embodiment of the invention. A camera unit 10 captures images. The images are sent to a computer 20 which stores the images. Computer 20 also controls camera unit 10. If desired, the images can be viewed by a real time viewer 30. The images are transferred from computer 20 to off line computer 21. Computer 21 seams the images into panoramas, transforms the images to equirectangular format, adds other information to the images, compresses the panoramas, and links the panoramas into a pan movie. Finally, the pan movie is viewed on viewer 22.

The operations performed by the units in Figure 2 are shown in Figures 3A, 3B, 3C and 3D. As shown in Figure 3A, block 11a, camera unit 10 captures a number of single view images. As indicated by block 11b, these images are compressed and sent to a computer 20. Computer 20 activates camera 10 to capture the images as indicated by block 20a. It then accepts the images as indicated by block 20b and stores them.

The stored images are manually transferred to off line computer 21 which is programmed to perform the operations shown in Figure 3C. First the images are decompressed as indicated by block 21a so that they can be manipulated. Next the single view images are seamed into a panorama and transformed to equirectangular format as indicated by block 21b. Hot spots which indicate branch points in a sequence of images, and sound tracks, are added next as indicated by block 21c. Finally the images are compressed as indicated by block 21d and stored with an index file as indicated by block 21e. Each panorama is termed a "key frame". A series of key frames (or more precisely a sequence of view windows) projected in sequence is a pan movie.

A viewer program in viewer computer 22 is used to view the pan movies. The viewer 22 displays in sequence a series of images, that is, a series of key frames. For each key frame displayed, the viewer 22 determines an appropriate view window as indicated by block 22a. The portion of the key frame which corresponds to the view window is then de-compressed and displayed as indicated by block 22b. As indicated by block 22c, sound is played if appropriate.

It is noted that the operations indicated by blocks 20a, 20b, 21a to 21e, 22a, 22b, and 22c are implemented by means of computer programs which perform the functions shown. Computer programs are given in appendices A, B, C, and D.
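The per-frame playback steps (blocks 22a to 22c) amount to a simple loop. The sketch below is a hypothetical illustration of that loop, not the viewer program itself; all four callables are invented stand-ins for the viewer's actual routines:

```python
def play_pan_movie(frames, get_view_direction, decompress_region,
                   display, play_sound):
    """Sketch of the viewer loop: for each key frame the user's current
    direction of view selects the view window (block 22a), only that
    region is decompressed and displayed (block 22b), and any associated
    sound is played (block 22c)."""
    for frame in frames:
        yaw, pitch = get_view_direction()               # block 22a
        window = decompress_region(frame, yaw, pitch)   # partial decompress
        display(window)                                 # block 22b
        play_sound(frame, yaw, pitch)                   # block 22c
```

Because the view direction is re-read on every key frame, the selected view window is maintained from frame to frame unless the user moves it.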

Figure 4A represents or illustrates a sequence or series of panoramic images in a pan movie. Each arrow in Figure 4A represents one key frame. At any particular time, only a part (i.e. the view window) of one key frame is visible to a user or observer. The direction of each arrow indicates the direction of view, that is, the view window or part of the key frame that is projected on a screen for observation. The arrows in Figure 4A are meant to represent a particular "view window" from each key frame. As indicated by the change in direction of the arrows in the area of Figure 4A designated by the letter E, a viewer can change his direction of view as the pan movie progresses. It is noted that when a user is viewing a panorama, the user can point toward the top or bottom of the screen and thus can view images located in a 360 degree circle from top to bottom in addition to the horizontal directions illustrated by the arrows shown in Figure 4A.

The sequence of images begins at the point or key frame indicated by the letter A and proceeds to the point or key frame indicated by the letter B. At this point the viewer can select to either go toward point C or toward point D. The selection may be made by "clicking" on a designated "hot spot" in the panorama designated B, or it may be made depending on some other criteria or action by the user. An important point is that at the branch point B, the direction of view (indicated by the direction of the arrows) remains the same irrespective of which path of travel is chosen. The view from the first frame after the branch point will be almost identical in both paths. As time progresses and the viewer moves further from the branch point, the view will gradually change. This is the effect that a person experiences when one arrives at a dividing point in a path. When a person takes the first step on a branching path, the person's field of view remains practically identical.

It is noted that at branch point B, the arrows are not pointing in the direction of the path leading to point D. Normally, a viewer would be looking in the direction of a branch point when the viewer selects to travel in the direction of the branch point. Thus, a viewer looking in the direction of the arrows shown in Figure 4A would normally continue to point C rather than selecting the path to point D.
Sequences of key frames can either be joined at branch points such as branch point B, or alternatively a branch point may be located at the end of a sequence of key frames. That is, a branch point may be located at the terminal frame of a sequence of key frames. Such a branch point could have two alternative sequences, one of which can be selected by a user by clicking on one of two hot spots. Alternatively, at the end of a sequence of key frames there can be an implicit branch point. At such an implicit branch point a new sequence of frames would be selected by the system without any action by the user.

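The sequences and branch points just described can be modeled as a small graph of frame runs. The structure below is a hypothetical sketch for illustration, not the patent's link control file format (Appendix B); the class and field names are invented:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Sequence:
    """A linear run of key frames, possibly ending at a branch point."""
    frames: list                                   # key-frame ids, in order
    branches: dict = field(default_factory=dict)   # hot-spot name -> Sequence
    implicit_next: Optional["Sequence"] = None     # followed without user action

def next_sequence(seq, clicked_hot_spot=None):
    """At the end of `seq`, an explicit hot-spot click selects a branch;
    otherwise an implicit branch point (if any) is followed automatically."""
    if clicked_hot_spot is not None and clicked_hot_spot in seq.branches:
        return seq.branches[clicked_hot_spot]
    return seq.implicit_next
```

A branch point mid-path (like point B) is simply a sequence whose `branches` map is non-empty; a terminal branch point is one reached at the last frame of a run.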
There is a one to one ratio of key frames to possible user positions. Hence, there exists a correlation between frame rate and user motion speed. If the user is moving through the environment, every frame displayed is a new key frame. The faster the frame rate for a given frame spacing, the faster the user travels. Given a fixed frame rate, the user's travel speed may be dictated by the relative spacing of key frames. The closer the key frames are, the slower the user will travel. For example, for a travel speed of approximately 5 mph and a playback frame rate of 15 fps, individual panoramic frames should be captured at about 6 inch increments. The math is as follows: (5 miles/hour * 63,360 inches/mile) / (3600 sec/hour * 15 frames/sec) = approximately 6 inches per frame. When the movie is being displayed, speed of travel can be increased by skipping some of the frames (for example, if every other frame is skipped the speed of travel is doubled). Skipping frames reduces the rate at which frames need be sent to the viewer and thus reduces the bandwidth required.

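The spacing arithmetic above generalizes directly. The helper below is an illustrative sketch (its name and signature are invented, not taken from the appendices):

```python
def capture_spacing_inches(speed_mph, frame_rate_fps):
    """Distance between key-frame capture points, in inches, for a given
    simulated travel speed and playback frame rate."""
    inches_per_hour = speed_mph * 63360       # 63,360 inches per mile
    frames_per_hour = frame_rate_fps * 3600   # 3,600 seconds per hour
    return inches_per_hour / frames_per_hour

# The example from the text: 5 mph at 15 fps -> roughly 6 inches per frame.
# Halving the effective frame rate by skipping every other frame doubles
# the simulated travel speed, i.e. doubles the effective frame spacing.
```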
In addition to the spacing of key frames to achieve different travel speeds, the orientation of individual key frames may be adjusted in order to achieve a desired motion effect, such as gait, slumber, waddle, crawl, skip, etc. The orientation of a key frame is defined to be the default view (or point of focus) of the user within the panoramic image if no other point of view is specifically selected.

Sound can accompany the visual effect provided by pan movies. Figure 4B indicates that each key frame can have one or more associated digital sound tracks. The digital sound tracks are indicated in Figure 4B by the dotted line which is associated with each of the arrows. As shown in Figure 11 and described later, there can be several different sound tracks associated with each key frame.

Figure 5A is a perspective view of the six lens camera unit 10 which is used to digitally capture panoramic images, that is, key frames. Figure 5B is a top view of camera 10 which is included to show that the unit has six lenses 41a to 41f. Each lens 41a to 41f has a 110 degree field of view. The images captured by lenses 41a to 41f are transmitted to computer 20 through a serial connection. Computer 20 "seams" the individual images from lenses 41a to 41f into panoramic images or key frames, compresses the key frames and stores them for future display. Additionally, a real time viewer 30 can be used to view the images as they are being captured and seamed.

In the preferred embodiment the connection from camera unit 10 to computer 20 and from computer 20 to real time viewer 30 is a "HOTlink" serial bus. Such connections are commercially available from suppliers such as Cypress Semiconductor Corp. or from Dataforth Corporation, which is a division of Burr-Brown Company. Alternatively, other types of high speed connections could be used. For example, the connection could be a standard SCSI connection.

Figure 6 shows the electronic components in camera unit 10. The components associated with each lens 41a to 41f are substantially identical to the components in commercially available digital cameras. The internal operation of the camera is controlled by a conventional embedded programmed computer 45 which, for example, may be a model 29000 computer available from Advanced Micro Devices Corporation. Many different suitable embedded processors are commercially available. Embedded computer 45 receives commands from computer 20, which has I/O units which allow an operator to enter commands. For example, computer 20 sends commands to computer 45 which set the aperture of the lenses 41a to 41f and which start and stop the operation of the camera. While computer 20 sends general commands to computer 45 such as set aperture, start, stop, etc., computer 45 sends detailed commands which control CCD arrays 43a to 43f and which control compression chips 44a to 44f. Such commands are conventional.
Each of the lenses 41b to 41f has a set of associated components similar to the components associated with lens 41a. The following will discuss the components associated with lens 41a. It should be understood that the other lenses 41b to 41f each have a similar set of components.
The image from lens 41a is focused on a CCD (Charge Coupled Device) array 43a. CCD array 43a captures the image from lens 41a and sends this image to embedded computer 45. CCD array 43a is controlled and operated by embedded computer 45. By resetting and reading the CCD array 43a in a particular time period, the embedded computer 45 in effect controls or provides an electronic shutter for lens 41a. The electronic shutters associated with each of the lenses 41a to 41f open and close simultaneously. Each CCD array 43a to 43f captures 30 images per second under normal operation.

The output of CCD array 43a is fed into a JPEG data compression chip 44a. Chip 44a compresses the image from lens 41a so that the image can be more easily transmitted to computer 20. The output from compression chip 44a is fed to the embedded computer 45, which transmits signals to computer 20 on a serial time slice basis.
The lenses 41 and the CCD arrays 43 are similar to the components found in commercially available digital cameras. JPEG compression chips 44 and embedded computer 45 are also commercially available components. For example, such components are available from suppliers such as Zoran Corporation or Atmel Corporation.

The electronic shutters associated with lenses 41 operate at 30 cycles per second, and hence computer 21 receives six images (one from each lens) each 1/30th of a second. The six images received each 1/30th of a second must be seamed and transformed to equirectangular format to form one panorama, as indicated by block 21b in Figure 3C.

While the specific embodiment of the invention shown herein utilizes a digital camera to take the initial single view images which are compressed and sent to computer 20 for storage, it should be understood that one could use a variety of other types of cameras to take these initial images. For example, the images simultaneously taken from a number of lenses could be recorded on tape for later processing by off line computer 21.
The seaming operation is done by the program in computer 21. In general the seaming operation connects the individual images into a panoramic image by finding the best possible fit between the various individual images. The process of seaming images into a panoramic image is known. For example, U.S. patent 5,694,531 describes seaming polygons into a panorama which has a low root-mean-square error.

After the seaming operation is complete, each seamed image is a panoramic image (called a panorama) and each panorama is a frame of a pan movie. Prior to storage, the seamed images are compressed so that the file size will be manageable. A commercially available compression program known as "Indeo" is used to compress the images. The Indeo program was developed by and is marketed by the Intel Corporation. The Indeo compression program provides a mode of operation which does not utilize any inter-frame compression. This no inter-frame compression mode of the Indeo program is used with the present embodiment of the invention. Since there is no inter-frame compression, the key frames can be accessed and viewed in either the forward or the reverse direction. Furthermore, only the portion of a panorama required for a particular view window is decompressed, thereby saving time and computational resources.
The compressed panoramic images are stored in files on computer disks, tape or compact discs (CDs). Each file includes a header and an index as shown in Figure 7. The header includes information such as the following:

File Type Tag:
File Size: (total bytes used by the file)
Index Size: (number of entries in frame index)
Max Frame Size: (total bytes used by largest compressed frame)
Codec: (codec used to compress frames)

After the file header, a frame index is provided (see Figure 7). Each frame index entry points to the location of the associated frame as indicated by the arrows in Figure 7. Thus, individual frames can be read in any order by obtaining their location from the frame index.
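The header and frame-index layout described above can be sketched in C. This is a minimal sketch and not the patent's actual on-disk format: the struct field names and integer widths are assumptions. Only the index arithmetic follows the text (one byte-offset entry per frame, held in RAM so any frame, forward or reverse, is a single disk access away), and it mirrors the frame-size computation used in Appendix A.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical on-disk header for a pan movie file; field names and
   sizes are assumptions, since the patent lists the fields but not an
   exact byte layout. */
typedef struct {
    char     fileTypeTag[4];  /* File Type Tag */
    uint32_t fileSize;        /* total bytes used by the file */
    uint32_t indexSize;       /* number of entries in frame index */
    uint32_t maxFrameSize;    /* bytes used by largest compressed frame */
    char     codec[4];        /* codec used to compress frames */
} PanFileHeader;

/* With the index held in RAM, the byte offset of any frame is a
   single array lookup, so one seek suffices in either direction. */
uint32_t frame_offset(const uint32_t *frameIndex, int frameNumber)
{
    return frameIndex[frameNumber];
}

/* Frame size is the distance between consecutive index entries,
   exactly as Appendix A computes it. */
uint32_t frame_size(const uint32_t *frameIndex, int frameNumber)
{
    return frameIndex[frameNumber + 1] - frameIndex[frameNumber];
}
```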

The indexing mechanism would not be necessary if the key frames were always going to be used in frame order. However, in the present embodiment, the system can play the key frames which comprise the pan movie in either the forward or backward direction. Hence the system must be able to locate individual frames quickly in any order. Furthermore, it is desirable that the system be able to locate a key frame with only a single disk access. Consider the situation where the user is moving "backward" (in the opposite direction of the key frame disk storage) at a fast travel speed (to increase speed of movement some key frames are skipped). Without a key frame directory, the disk would have to be searched in a "reverse-linear" manner in order to find and load the next appropriate key frame. With a key frame directory, the next key frame location is found immediately and loaded with a single disk access (given that the directory itself is stored in RAM memory).

As indicated in Figure 4A, a viewer can branch from one sequence of images to another sequence of images. This is indicated by branch point B in Figure 4A. By branching, a user in effect changes the direction of the simulated travel. A user indicates a desire to change direction by "clicking" on a visible "hot spot" or by otherwise activating a hidden hot spot. A visible hot spot can be indicated by any type of symbol that is visible in a view window. For example, a hot spot may be indicated by a bright red dot in the view window. Alternatively, a hot spot may be indicated by the fact that the cursor changes to a different shape when the cursor is over a hot spot.

It is noted that not all visually apparent alternate paths visible in a panorama are actually available as a pan movie branch. For example, at a street intersection, branches may not be provided to all visible streets. Care must be taken to ensure that a viewer is given an indication of the branch points that are actually available to the viewer.

At a playback rate of 30 frames per second, a viewer would have to be very "fast" (it would in fact be practically impossible) to see and click on a hot spot that appears on a single frame. Without advance notice, the viewer would have great difficulty taking a specific action to activate a branch during a specific single frame, since in normal operation a particular frame is only displayed for about 1/30th of a second. In order to be effective and user friendly, a user must be given an early indication of an upcoming branch opportunity that requires user action. A hot spot in a pan movie must therefore be visible to a viewer in a relatively large number of key frames. For example, a hot spot might be visible in the thirty key frames that precede (or, for reverse operation, follow) a branch point.

Hot spots are inserted into a pan movie in the manner illustrated in Figure 8. The hot spots are inserted into the key frames by computer 21 before the frames are compressed, as indicated by blocks 21c and 21d in Figure 3C. It is noted that hot spots may be inserted into a pan movie by altering the original panoramic image so that it includes the hot spot, or alternately by providing an overlay image which contains the hot spot image. If an overlay is used, the overlay image needs to be projected at the same time as the original image. As indicated by block 87a, one must first determine how much in advance one wants to warn the user. If a hot spot is to have a particular size at the time action is needed, when viewed in advance (i.e. from a distance) the hot spot will be much smaller. As indicated by block 87b, in order to insert hot spots in a pan movie, one must select the region where the hot spot is to be located. In general this will be in a view looking toward the direction where the branch will take place. The hot spot is then inserted into the panorama by modifying the images. The hot spot may be indicated by a light colored outline superimposed over the region. The area within the outline may be slightly darkened or lightened. The object is to highlight the region without obscuring the image itself. Various other alternative indications can also be used.
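A minimal sketch of the lightening step described above, assuming a simple additive brightening of an 8-bit grayscale buffer. The patent only describes outlining and lightening or darkening a region in general terms, so the function below is illustrative rather than the actual method.

```c
#include <assert.h>
#include <stdint.h>

/* Slightly lighten a rectangular region of an 8-bit grayscale image to
   mark a hot spot without obscuring the underlying imagery.  The region
   is given by inclusive corners (x0,y0)-(x1,y1); values clamp at 255. */
void highlight_region(uint8_t *img, int width, int height,
                      int x0, int y0, int x1, int y1, int delta)
{
    for (int y = y0; y <= y1 && y < height; y++) {
        for (int x = x0; x <= x1 && x < width; x++) {
            int v = img[y * width + x] + delta;
            img[y * width + x] = (uint8_t)(v > 255 ? 255 : v);
        }
    }
}
```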

The process repeats as indicated by blocks 87d and 87e until the key frame at the branch point is reached. Finally, the process is repeated from the opposite direction from the branch point so that the branch point will be visible if the pan movie is shown in the reverse direction.

The changes to the individual key frames may be made manually with a conventional image editor, or the process can be automated by a program designed just for this purpose.

In order to avoid unnecessary user intervention, "hidden" hot spots may be added to connect multiple pan movies. A hidden hot spot is one that does not need to be manually selected by the user. With a hidden hot spot, if the user "travels" into a particular key frame which has a hidden hot spot, and the user is "looking" in the hot spot's general direction, then the system will react based upon the user's implicit selection of the hot spot, and the user will be sent along the path directed by the hot spot.
Figure 9A is a block diagram of the viewer 22 which plays or displays pan movies. The main components of the viewer 22 are a CD disk reader 80, a computer 81, a display 82, a keyboard 84 and a mouse 85. Computer 81 reads key frames from disk 80 and displays the view window from each key frame on display 82. The operator or user utilizes mouse 85 to indicate a view direction. The view direction determines the view window which is displayed on display 82 by computer 81. A program which implements blocks 22a to 22c (shown in Figure 3D) is stored in and executed by computer 81.

Figure 9B is a block diagram of the real time viewer 30. As an option, the images captured by camera 10 can be viewed in real time. Images are transferred from computer 21 to viewer 22 in real time. The transfer is by means of a HOTlink bus to HOTlink card 86a. The images go from card 86a to RAM memory 86b and then to decompression card 86c which does the decompression. From the decompression board 86c the images go back to memory and then to CPU 86d, which combines (i.e. seams) the images as necessary and transfers them to video card 86e, which displays them on monitor 86f. Viewer 30 is controlled via a conventional mouse 86m and keyboard 86k.

Figure 10 is a block diagram of a program for displaying pan movies. The program shown in block diagram form in Figure 10 is executed by computer 81 in Figure 9A. The process begins at block 91 with user input. The user must indicate a start location (at the beginning of the process this would normally be the first frame in the movie). The user must also specify direction of motion, speed and direction of view. As indicated by blocks 92, 92a, 92b and 92c, the system determines and then reads the appropriate pan frame data. As indicated by blocks 96 and 96a, the system determines the portion of the pan frame that is in the selected view window, and that portion of the frame is decompressed. As indicated by blocks 97 and 97a, the image is re-projected to obtain a perspective view. If the hot spots have not been placed on the actual key frames but are contained in a separate file, the hot spot imagery is overlaid on the image. Finally, as indicated by block 98, the part of the image which constitutes the view window is projected on the screen.

As a user travels, the next required key frame is determined by the current user position and direction of travel. The location of this key frame within the file of images is determined via the file index directory, and the key frame is loaded into RAM memory, decompressed, and displayed. To increase performance, only the view window portions of the key frame (depending on the current user view) need be loaded into RAM. If for ease of programming the entire key frame is loaded into memory, only view window portions of the key frame need be decompressed. If the entire key frame is compressed as a whole, then a decompressor supporting "local decompression" is more efficient, e.g., Intel Indeo. To determine the portion of the panorama needed to display a particular view, each of the corner coordinates of the perspective view plane (display window) is converted to panorama coordinates. The resulting panorama coordinates do not necessarily represent a rectangle; therefore the bounding rectangle of these panorama coordinates is needed to derive a perspective view at a given view orientation.
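The bounding-rectangle step described above (which RetrieveViewFrame in Appendix A performs inline) reduces to taking the minimum and maximum of the four projected corner coordinates. A small self-contained sketch, with hypothetical Pt and Rect types standing in for the Windows POINT and RECT:

```c
#include <assert.h>

typedef struct { int x, y; } Pt;
typedef struct { int left, top, right, bottom; } Rect;

/* The four view-window corners, once converted to panorama coordinates,
   need not form a rectangle; the decoder is handed the axis-aligned box
   that encloses them. */
Rect bounding_rect(const Pt c[4])
{
    Rect r = { c[0].x, c[0].y, c[0].x, c[0].y };
    for (int i = 1; i < 4; i++) {
        if (c[i].x < r.left)   r.left   = c[i].x;
        if (c[i].y < r.top)    r.top    = c[i].y;
        if (c[i].x > r.right)  r.right  = c[i].x;
        if (c[i].y > r.bottom) r.bottom = c[i].y;
    }
    return r;
}
```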

Once the corners of the desired bounding rectangle are determined, the Indeo decompression program is instructed to decompress only that portion of the key frame needed for the particular view window. In order to do this, the program must call the Video For Windows function ICSetState prior to decompressing the frame. The C code to accomplish this follows.

#include "windows.h"
#include "vfw.h"
#include "vfw_spec.h"

extern HIC hic;          // Opened CODEC (IV41)
extern RECT *viewRect;   // Determined elsewhere
static R4_DEC_FRAME_DATA StateInfo;

void SetRectState(
    HIC hic,             // Opened CODEC (IV41)
    RECT *viewRect       // Local rectangle of interest
)
{
    R4_DEC_FRAME_DATA StateInfo;

    memset(&StateInfo, 0, sizeof(R4_DEC_FRAME_DATA));
    StateInfo.dwSize        = sizeof(R4_DEC_FRAME_DATA);
    StateInfo.dwFourCC      = mmioStringToFOURCC("IV41", 0);  // Intel Video 4.1
    StateInfo.dwVersion     = SPECIFIC_INTERFACE_VERSION;
    StateInfo.mtType        = MT_DECODE_FRAME_VALUE;
    StateInfo.oeEnvironment = OE_32;
    StateInfo.dwFlags       = DECFRAME_VALID | DECFRAME_DECODE_RECT;

    StateInfo.rDecodeRect.dwX      = min(viewRect->left, viewRect->right);
    StateInfo.rDecodeRect.dwY      = min(viewRect->top, viewRect->bottom);
    StateInfo.rDecodeRect.dwWidth  = abs((viewRect->right - viewRect->left)) + 1;
    StateInfo.rDecodeRect.dwHeight = abs((viewRect->bottom - viewRect->top)) + 1;

    ICSetState(hic, &StateInfo, sizeof(R4_DEC_FRAME_DATA));
}
If the projection used to store the pan frame is such that there exists a discontinuity in pixels with respect to the spherical coordinates they represent, then the local region required may be the combination of multiple continuous regions. For a full cylinder/sphere equirectangular projection (centered about 0 degrees), the left pixel edge represents -180 degrees and the right pixel edge represents 180 degrees. In spherical coordinates, -180 degrees is the same as 180 degrees. Therefore, the discontinuous left/right pixels represent a continuous "wrap-around" in spherical coordinates.
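A sketch of handling this wrap-around case, assuming horizontal pixel coordinates on a full 360-degree panorama; the Span type and function name are hypothetical. A requested region that crosses the left/right image edge is split into the two continuous pixel regions the decoder actually needs.

```c
#include <assert.h>

typedef struct { int left, right; } Span;

/* Split a horizontal decode range [left, right] (which may extend past
   either edge of a panWidth-pixel equirectangular panorama) into one or
   two continuous spans.  Returns the number of spans written to out[]. */
int split_wraparound(int left, int right, int panWidth, Span out[2])
{
    if (left >= 0 && right < panWidth) {          /* fits without wrapping */
        out[0].left = left;
        out[0].right = right;
        return 1;
    }
    /* Crosses the +/-180 degree seam: decode the tail of the image
       plus the head of the image as two separate regions. */
    out[0].left  = (left + panWidth) % panWidth;
    out[0].right = panWidth - 1;
    out[1].left  = 0;
    out[1].right = right % panWidth;
    return 2;
}
```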

The math to determine the portion of the source key-frame panorama needed for a particular view window depends on the projection used to store the panorama.
Optionally, the viewer may predict the next key frame to be loaded (depending on user travel direction and speed) and pre-load it in order to increase performance.
For an equirectangular projection of a full sphere panorama frame, the equations for determining the required portion are as follows (scalar variables are lower case, vectors are bold lower case, and matrices are bold upper case):

Panorama point (s,t) is derived from any perspective plane point (u,v). The perspective plane has a focal length l from the center of projection. In addition, the perspective plane can be arbitrarily rotated through a given view orientation, namely heading, pitch, and bank (h,p,b).

Any point in the perspective plane is specified by the 3D vector:

    w = <u, v, l>

The rotations are applied by using a standard matrix-vector product. The three matrices accounting for heading, pitch and bank are as follows:

        |  cos(h)   0   sin(h) |
    H = |    0      1     0    |
        | -sin(h)   0   cos(h) |

        |  1     0        0    |
    P = |  0   cos(p)  -sin(p) |
        |  0   sin(p)   cos(p) |

        |  cos(b)  sin(b)   0  |
    B = | -sin(b)  cos(b)   0  |
        |    0       0      1  |

The vector w is rotated using the above matrices to attain w':

    w' = H*P*B*w

The final step is converting from rectangular to spherical coordinates. Denoting the three components of the vector w' as x, y, z, the conversion is:

    s = atan2(x, z)
    t = atan2(y, sqrt(x*x + z*z))

Note: atan2(a, b) is a standard C function very similar to atan(a/b), but atan2 correctly handles the different cases that arise if a or b is negative or if b is 0.

Optionally, the viewer may predict the next key frame to be loaded (depending on user travel direction and speed) and pre-load this key frame in order to increase performance.
Due to the one-to-one ratio of key frames to possible user positions, there exists an exact correlation between frame rate and user motion speed. If the user is currently moving through the environment, every frame displayed is a new key frame; thus the faster the frame rate, the faster the user travels. For this reason, the frame rate is "capped" during user travel to eliminate the problem of excessive user travel speed. In order to retain smooth motion, the frame rate is not decreased below standard video frame rates (15 frames/sec). The frame rate is not increased, in order to keep the relative spacing of key frames at a manageable distance; the faster the frame rate, the closer the key frames must be to achieve the same user travel speed. The viewer may optionally skip key frames in order to increase the user's travel speed through the environment. The more key frames skipped, the faster the user will travel; if no key frames are skipped, the user will travel at the slowest possible rate (given a constant frame rate).

The system can link pan movie segments so as to permit branching and thereby follow a path selected by a user. Multiple linear (one dimensional) pan movies may be linked together to create a "graph" of pan movies (see Appendix B). For each pan movie, the end of one segment may be associated with the start of a "next" pan movie. This association (in conjunction with the length of the individual pan movies) is the basis for the graph shape. In order to achieve smooth transitions, the "last" frame in the "first" pan movie must be the same as (or one frame off from) the "first" frame of the "next" pan movie. In addition to positional correctness, the relative view orientations of the joining frames must be known. For example, if the "last" frame of the "first" pan movie faces "north", and the "first" frame of the "next" pan movie faces "east", then the viewing software must be alerted to this orientation change. Without this information, there would be a 90 degree "snap" in the transition between the two pan movies. All this graph information may be stored in a separate file (text or binary form).

The audio information associated with each frame of a pan movie must take into account the fact that a viewer of a pan movie has a great deal of control over what is presented on the screen. In addition to the ability to select branch points, a user may choose to change the direction of view or to stop and back up. The audio information associated with each key frame must accommodate this flexibility.
As illustrated in Figure 11, the audio information stored with each key frame includes five audio tracks designated A, B, C, D, E and control information. Figure 11 shows eight key frames Fa to Fi, each of which has five associated audio tracks and a control field. Audio track A is the track that is played if the pan movie is moving forward in the normal direction at the normal rate of thirty frames per second. Audio track B is the track that is played if the pan movie is being displayed in the reverse direction. Audio track C is the audio track that is played if the movie is moving forward at half speed. Audio track D is the track that is played if the movie is being played in the reverse direction at one half speed. Finally, audio track E is the track that is repeatedly played if the movie has stopped at one frame. Naturally, a variety of other audio tracks could be added for use in a number of other situations. For example, tracks can point to audio clips or to other audio tracks.
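The track selection described above can be sketched as a simple mapping from playback state to track letter; the enum and function names are assumptions, not part of the patent:

```c
#include <assert.h>

/* Hypothetical playback states corresponding to tracks A-E above. */
typedef enum {
    FORWARD,       /* forward at normal rate       -> track A */
    REVERSE,       /* reverse                      -> track B */
    FORWARD_HALF,  /* forward at half speed        -> track C */
    REVERSE_HALF,  /* reverse at half speed        -> track D */
    STOPPED        /* stopped on one frame (loops) -> track E */
} Playback;

char select_audio_track(Playback state)
{
    switch (state) {
    case FORWARD:      return 'A';
    case REVERSE:      return 'B';
    case FORWARD_HALF: return 'C';
    case REVERSE_HALF: return 'D';
    default:           return 'E';
    }
}
```

In a full player the choice would additionally be gated by the per-frame control field described next.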

The control information that is recorded with each frame controls certain special effects. For example, the control information on one frame can tell the program to continue playing the audio tracks from the following frame even if the user has stopped the movie at one particular frame. As the sound track on each frame is played, the control information on that frame is interrogated to determine what to do next. What sound is played at any particular time is determined by a combination of the control information on the particular frame being viewed and the action being taken by the viewer at that time. From a programming point of view, the commands associated with each track are decompressed and read when the view window for the associated frame is decompressed and read. As a particular view window is being displayed (or slightly before), the commands stored in the control field are read and executed so that the appropriate sound can be decompressed and played when the view window is displayed.

For example, the control information could provide the following types of commands:

Stop this audio track if the user stops the pan movie here (typical setting). If this is not set, the audio will continue playing in the same direction until the audio for this track ends.

Start or continue to play this audio track if the user is viewing the pan movie in the forward direction (typical setting).

Start or continue to play this audio track backwards if the user is viewing the pan movie in a backwards direction. (Note: if the same audio information is played in reverse it may be distorted.)

Start this audio track when image frames are in motion and being played in a reverse direction. This allows high quality audio to be played while reverse viewing.

Continue audio track from / on another file structure (a branch most likely has occurred) and modify volume. This is used to fade out an audio track that may have played ahead earlier.

Stop all audio tracks.

Stop this audio track if the user slows pan movie playback.

Start audio file X, where X is a conventional audio file that is separate from the pan movie.

A wide variety of other commands may be implemented as desired by the designer of a particular movie.

The audio information can be recorded with a normal recorder when the initial images are recorded, or it can be recorded separately. The audio data is merged with the key frames by computer 21. This can be done manually on a frame by frame basis or the process can be automated. When the sound is merged with the key frames, the appropriate control information is added.

Figures 12A, 12B and 12C illustrate another aspect of the present invention. A hot spot on a key frame can contain and display information that is independent from, and in addition to, the information in the base images which are used to form the panoramas. For example, as a pan movie simulates movement past a billboard, a "regular" motion picture (which might for example be an advertisement for a product) can be displayed on the billboard. The motion picture on the billboard would be integrated with the various key frames in the same manner as hot spots are added to key frames. As illustrated in Figures 12A, 12B and 12C, such images displayed on a billboard past which motion is simulated must be corrected for the fact that the viewer is not directly viewing the image when he is approaching it. The image is only rectangular when the viewer is adjacent to the image as shown in Figure 12C. As the viewer is approaching the image it is distorted as illustrated in Figures 12A and 12B.

The attached appendices provide computer programs which implement various aspects of the present invention. These programs are designed to run under a conventional operating system such as the "Windows" operating system marketed by the Microsoft Corporation.

The program given in Appendix A will retrieve frames for a movie, correct the perspective in accordance with known equations and then display the images of the movie in sequence.

Appendix B is an example of a link control file for the frames of a pan movie.

Appendix C is pseudocode showing how sequences of images are linked to form a pan movie.

It is noted that in a pan movie the frames do not all have to have the same resolution. Some frames may be of a higher resolution. For example, the most interesting places in the pan movie may have a higher resolution.

Many alternative embodiments of the invention are possible. For example, the initial capture process could record the images on video tape rather than recording the images digitally. Electronic cameras could be used which include image capture devices other than CCD arrays to capture images. Branching can provide three or more optional paths rather than just two paths as shown in Figure 4, and branching can provide for going left or right at an intersection.

It is noted that in alternative embodiments, compression schemes or techniques other than Intel Indeo can be used. Furthermore, alternative embodiments could use no compression at all if enough storage and bandwidth were available.
While in the embodiment shown the image files are manually transferred between some of the units, in alternative embodiments these files could be transferred between units by electronic connections between the units.

While in the embodiment described above the camera has six lenses which record all six sides of the cube, in alternative embodiments the camera could record less than an entire sphere. For example, the lens pointing down could be eliminated in some embodiments. Still other alternative embodiments could use lenses with wider or narrower fields of view. For example, fewer lenses, each with a wider field of view, could be used. Furthermore, while the embodiment described above utilizes a spherical panorama, other types of panoramas could be used. Various types of projections such as cubic could be used instead of equirectangular.

The embodiment shown includes a number of sound tracks with each key frame and control information which indicates which sound track should be played when the key frame is displayed, depending on whether or not certain special conditions exist. Alternatively, there could be a single sound track associated with each frame. In such an embodiment the single sound track on each key frame could be the sounds recorded when the images in the particular frame were recorded. In other alternative embodiments, there could be no sound tracks, and in such case the images would be displayed without accompanying sound.

Having described and illustrated the principles of the invention in various embodiments thereof, it should be apparent that the invention can be modified in arrangement and detail without departing from those principles. We claim all modifications and variations coming within the spirit and scope of the following claims.

APPENDIX A: FRAME RETRIEVAL CODE
#include "windows.h"
#include "mmsystem.h"
#include "vfw.h"
#include "vfw_spec.h"

#define S_BMIH sizeof(BITMAPINFOHEADER)

// Externally declared (and allocated) variables
extern UINT currentFrameNumber;  // Current Pan Movie file frame number (user position)
extern HANDLE hFile;             // Open file handle of Pan Movie file
extern HIC hic;                  // Open IC handle (installed compressor)
extern DWORD *Index;             // Pan Movie frame index (read from file at load time)
extern LPBITMAPINFOHEADER viewFrame;        // Buffer large enough to hold an image the size of the display window
extern LPBITMAPINFOHEADER panFrame;         // Buffer large enough to hold the largest uncompressed frame
extern LPBITMAPINFOHEADER compressedFrame;  // Buffer large enough to hold the largest compressed frame

// Function prototypes
extern void ViewToPan(int viewWidth, int viewHeight, int panWidth, int panHeight,
                      float heading, float pitch, float bank, float zoom, POINT *point);
static LPBITMAPINFOHEADER RetrievePanFrame(int frameNumber, RECT *viewRect);

//
// This function generates a perspectively correct bitmap image given a
// user view orientation and travel speed
//
static LPBITMAPINFOHEADER RetrieveViewFrame(float userHeading, float userPitch,
                                            float userBank, float userZoom,
                                            int userTravelSpeed)
{
    // Determine decode bounding box
    POINT point;
    RECT localDecompressionRect;

    // Upper left corner of viewFrame
    point.x = 0; point.y = 0;
    ViewToPan(viewFrame->biWidth, viewFrame->biHeight, panFrame->biWidth,
              panFrame->biHeight, userHeading, userPitch, userBank, userZoom, &point);
    localDecompressionRect.top  = point.y;
    localDecompressionRect.left = point.x;

    // Upper right corner of viewFrame
    point.x = viewFrame->biWidth - 1; point.y = 0;
    ViewToPan(viewFrame->biWidth, viewFrame->biHeight, panFrame->biWidth,
              panFrame->biHeight, userHeading, userPitch, userBank, userZoom, &point);
    localDecompressionRect.top   = min(localDecompressionRect.top, point.y);
    localDecompressionRect.right = point.x;

    // Lower left corner of viewFrame
    point.x = 0; point.y = viewFrame->biHeight - 1;
    ViewToPan(viewFrame->biWidth, viewFrame->biHeight, panFrame->biWidth,
              panFrame->biHeight, userHeading, userPitch, userBank, userZoom, &point);
    localDecompressionRect.bottom = point.y;
    localDecompressionRect.left   = min(localDecompressionRect.left, point.x);

    // Lower right corner of viewFrame
    point.x = viewFrame->biWidth - 1; point.y = viewFrame->biHeight - 1;
    ViewToPan(viewFrame->biWidth, viewFrame->biHeight, panFrame->biWidth,
              panFrame->biHeight, userHeading, userPitch, userBank, userZoom, &point);
    localDecompressionRect.bottom = max(localDecompressionRect.bottom, point.y);
    localDecompressionRect.right  = max(localDecompressionRect.right, point.x);

    // Get pan frame (or "localDecompressionRect" portion thereof)
    currentFrameNumber += userTravelSpeed;  // userTravelSpeed is negative if traveling backwards
    LPBITMAPINFOHEADER pFrame =
        RetrievePanFrame(currentFrameNumber, &localDecompressionRect);

    if (pFrame == NULL) {
        currentFrameNumber -= userTravelSpeed;
        return NULL;
    }

    // A very slow warping routine (assumes 24-bit pixels)
    LPBYTE srcPixels = ((LPBYTE)pFrame) + S_BMIH;
    LPBYTE dstPixels = ((LPBYTE)viewFrame) + S_BMIH;
    for (int y = 0; y < viewFrame->biHeight; y++) {
        for (int x = 0; x < viewFrame->biWidth; x++) {
            point.y = y; point.x = x;
            ViewToPan(viewFrame->biWidth, viewFrame->biHeight, pFrame->biWidth,
                      pFrame->biHeight, userHeading, userPitch, userBank, userZoom, &point);
            memcpy(&dstPixels[3 * (x + y * viewFrame->biWidth)],
                   &srcPixels[3 * (point.x + point.y * pFrame->biWidth)],
                   3);  // supports 24-bit pixels only
        }
    }

    return viewFrame;
}

//
// This function reads and decompresses a Pan Frame bitmap image from a
// Pan Movie file
//
static LPBITMAPINFOHEADER RetrievePanFrame(int frameNumber, RECT *viewRect)
{
    DWORD d;
    UINT frameSize = Index[frameNumber + 1] - Index[frameNumber];

    // Set the file pointer to the start of the requested frame and read in
    // the bitmap header
    SetFilePointer(hFile, Index[frameNumber], NULL, FILE_BEGIN);
    ReadFile(hFile, panFrame, S_BMIH, &d, NULL);

    if (panFrame->biCompression == 0) {  // Uncompressed frame (read rest of frame and return)
        ReadFile(hFile, ((BYTE*)panFrame) + S_BMIH, frameSize - S_BMIH, &d, NULL);
        return panFrame;
    }

    // Read the remainder of the compressed frame
    *compressedFrame = *panFrame;
    ReadFile(hFile, ((BYTE*)compressedFrame) + S_BMIH, frameSize - S_BMIH, &d, NULL);

    // Set up decompressed bitmap header
    panFrame->biCompression = 0;
    panFrame->biSizeImage   = 0;
    panFrame->biBitCount    = 24;
    panFrame->biClrUsed     = 0;

    LPBITMAPINFOHEADER biSrc = compressedFrame;
    LPBITMAPINFOHEADER biDst = panFrame;
    LPBYTE srcPixels = (BYTE*)biSrc + S_BMIH;
    LPBYTE dstPixels = (BYTE*)biDst + S_BMIH;

    // If the frame is compressed with Intel Indeo 4 and a local rect was
    // requested, then perform local decompression
    if (viewRect && biSrc->biCompression == mmioFOURCC('i','v','4','1')) {
        // Intel Indeo 4.1
        R4_DEC_FRAME_DATA StateInfo;

        memset(&StateInfo, 0, sizeof(R4_DEC_FRAME_DATA));
        StateInfo.dwSize        = sizeof(R4_DEC_FRAME_DATA);
        StateInfo.dwFourCC      = biSrc->biCompression;
        StateInfo.dwVersion     = SPECIFIC_INTERFACE_VERSION;
        StateInfo.mtType        = MT_DECODE_FRAME_VALUE;
        StateInfo.oeEnvironment = OE_32;
        StateInfo.dwFlags       = DECFRAME_VALID | DECFRAME_DECODE_RECT;
        StateInfo.rDecodeRect.dwX      = min(viewRect->left, viewRect->right);
        StateInfo.rDecodeRect.dwY      = min(viewRect->top, viewRect->bottom);
        StateInfo.rDecodeRect.dwWidth  = abs((viewRect->right - viewRect->left)) + 1;
        StateInfo.rDecodeRect.dwHeight = abs((viewRect->bottom - viewRect->top)) + 1;

        ICSetState(hic, &StateInfo, sizeof(R4_DEC_FRAME_DATA));

        if (ICDecompressEx(hic, 0, biSrc, srcPixels, 0, 0, biSrc->biWidth, biSrc->biHeight,
                           biDst, dstPixels, 0, 0, biDst->biWidth, biDst->biHeight) != ICERR_OK)
            return NULL;
    }
    else {  // Decompress entire frame
        if (ICDecompressEx(hic, 0, biSrc, srcPixels, 0, 0, biSrc->biWidth, biSrc->biHeight,
                           biDst, dstPixels, 0, 0, biDst->biWidth, biDst->biHeight) != ICERR_OK)
            return NULL;
    }

    return panFrame;
}

© Infinite Pictures 1998

APPENDIX B: SAMPLE PAN MOVIE LINK CONTROL FILE

[Diagram: three linked pan movie segments; segment A runs into a junction, with segment C branching to one side and segment B to the other]

[Segment-A (start)]
File= "A.pan"
North= 0

[Segment-A (end)]
File= "A.pan"
North= 0
Link 90= "Segment-B (start)"
Link 270= "Segment-C (start)"

[Segment-B (start)]
File= "B.pan"
North= 90
Link 90= "Segment-A (end)"
Link 180= "Segment-C (start)"

[Segment-B (end)]
File= "B.pan"
North= 90

[Segment-C (start)]
File= "C.pan"
North= 270
Link 270= "Segment-A (end)"
Link 180= "Segment-B (start)"

[Segment-C (end)]
File= "C.pan"
North= 270

APPENDIX C: PSEUDOCODE FOR LINKED PAN MOVIES (VIA CONTROL FILE)

GLOBAL FILE controlFile            // Control file
GLOBAL STRING currentSegment       // The name of the current pan movie segment
GLOBAL INTEGER currentFrameNumber  // The current frame number of the current pan movie
GLOBAL INTEGER currentHeading      // The current user view horizontal pan orientation

//
// This function will read the control file and determine which linked
// segment is closest to the current user heading orientation.
// It will also determine the new frame number of the new segment.
//
BOOLEAN RetrieveLink()
{
    INTEGER minAngle
    STRING nextSegment

    if currentFrameNumber == 0
        currentSegment = currentSegment + " (start)"
    else
        currentSegment = currentSegment + " (end)"

    if no links in section currentSegment of controlFile
        return FALSE

    minAngle = link angle closest to currentHeading
    nextSegment = GetString(minAngle)

    if AngleDifference(currentHeading, minAngle) > 45 degrees
        return FALSE

    INTEGER nextNorth = GetNorth(nextSegment)
    INTEGER currentNorth = GetNorth(currentSegment)

    currentHeading = currentHeading + (nextNorth - currentNorth)
    currentSegment = nextSegment

    if stringFind(currentSegment, "(end)")
        currentFrameNumber = -1
    else
        currentFrameNumber = 0

    return TRUE
}

Claims (35)

I claim:
1) A system for simulating movement through multidimensional space comprising, in combination:
a multi-lens camera for simultaneously capturing a plurality of digital images that cover the entire spherical view field;
compression units for individually compressing said images into compressed images;
means for transferring said images to a computer;
a program operated by said computer which seams said images into panoramas and which links said images into a fixed sequence of images; and
a viewer which selectively displays a portion of each of said linked images in sequence.
2) The system recited in claim 1 wherein said camera has six lenses, one positioned on each side of a cube.
3) The system recited in claim 1 wherein said computer compresses said images after seaming the images.
4) The system recited in claim 3 wherein said compression is single frame compression with no inter-frame compression.
5) A method of simulating movement through multidimensional space comprising the steps of:
capturing a series of sets of individual images, each set of images covering at least a portion of a spherical view;
individually compressing said images;
transferring said images to a computer;
decompressing said images;
seaming the images in each set of images into a panorama;
linking said panoramas into a fixed sequence of panoramas;
compressing said panoramas without using any inter-frame compression;
de-compressing at least a portion of each panorama which corresponds to a view window; and
displaying said view windows in said fixed sequence.
6) The method recited in claim 5 wherein each set of images comprises six images taken from the six sides of a cube.
7) The method recited in claim 5 where the portion of each panorama de-compressed for viewing is selected by an operator who indicates a direction of view.
8) The method recited in claim 5 wherein said sequence of images has break points which provide at least two alternative sequences of images.
9) The method recited in claim 5 wherein the sequence of images displayed after a break point is selected in response to input from a user.
10) The method recited in claim 5 wherein sound is associated with each of said panoramas.
11) A three dimensional (3-D) panorama movie for enabling a user interactively to view movement through a three-dimensional space along a path through a series of viewpoints and to view in any viewing direction in the three-dimensional space, the 3-D panorama movie comprising:
a computer storage medium for storage of machine readable image data;
a file of machine readable image data stored on the storage medium, the image data including a plurality of panorama frames forming a fixed sequence of images of the three dimensional space in which each image in the sequence has a spatially different viewpoint;
a panframe directory stored on the storage medium in association with the file of machine readable image data, containing a set of frame indexes, each frame index identifying a location in the file of one of the panorama frames, for displaying in sequence a portion of each panorama in said sequence.
12) A panorama movie according to claim 11 in which there exists a direction of travel in three-dimensional space from each viewpoint, the direction of travel being stored in the panorama frame associated with each viewpoint as a predetermined point within the panorama frame to define a frame of reference for a viewing direction during playback.
13) A panorama movie according to claim 12 in which the predetermined point is a centerpoint of the panorama frame.
14) A panorama movie according to claim 11 in which the panorama frames are compressed using intraframe compression, defining a key frame, so that each frame can be decompressed during playback independently of each other frame.
15) A panorama movie according to claim 11 in which the file includes a hotspot associated with at least one of the frames, the hotspot being operable during playback to superimpose a function on the playback of that frame.
16) A panorama movie according to claim 15 in which the hotspot includes an image having a predetermined geometric shape to be superimposed over a feature in one or more of the panorama frames.
17) A panorama movie according to claim 15 in which the hotspot image is stored with an orientation in each panorama frame such that the superimposed image appears in each frame at playback with a position and shape that conforms to a perspective corrected shape and position of the feature over which it is superimposed.
18) A panorama movie according to claim 11 which includes two or more of said files, each forming a segment of the movie and having its own respective panframe directory which includes terminal file indices defining a firstfile index and a lastfile index for the respective terminal frames of each file, the movie further including a control file containing linking data for linking terminal indices to link segments together.
19) A panorama movie according to claim 18 in which the control file includes orientation information defining a new direction of travel to proceed from a terminal frame of a first segment along a second segment linked to the first segment.
20) A panorama movie according to claim 19 in which a third segment is also linked to the terminal frame of the first segment, the control file orientation information including an alternative new direction of travel selectable by the user during playback upon reaching the terminal frame of the first segment, to branch the movie.
21) A panorama movie according to claim 19 in which the new direction of travel is selectable by the user selecting a viewing direction that approximately coincides with the new direction of travel.
22) A panorama movie method for simulating movement through multidimensional space comprising the steps of:
capturing a plurality of images consisting of imagery taken from a plurality of spatial positions in multidimensional space;
seaming said images into key-frame panoramas;
storing said key-frame panoramas in a file in a storage medium;
indexing said key-frame panoramas within a key-frame directory according to a position of each key-frame within the file and storing said key-frame directory on said storage medium;

displaying a portion of a first key-frame panorama according to a user position and viewing direction within the multidimensional space;
accessing said key-frame directory to determine a next key-frame image to be displayed according to a user travel speed and travel direction; and
displaying a portion of a second key-frame panorama subsequent to the display of said first key-frame panorama according to a user position and viewing direction.
23) The method of claim 22 wherein the step of capturing the plurality of key-frame images includes taking photographs.
24) The method of claim 22 wherein the step of capturing the plurality of key-frame images includes rendering the images within a computer system.
25) The method of claim 22 wherein a forward or reverse travel direction is determined by the order in which the key-frames are accessed.
26) The method of claim 22 wherein a travel speed is determined by accessing every nth key-frame, where n is an integer.
27) The method of claim 22 wherein a first segment of key-frame images is linked to a second segment of key-frame images by a control file stored in the storage medium, the control file including a stored travel direction for the second segment which is selected by the user selecting a viewing direction at a terminal frame of the first segment that approximately coincides with the stored travel direction for the second segment.
28) The method of claim 22 wherein the user actuates an input device to move from a first position through a series of successive positions, the method including displaying successive views of imagery from each of the successive positions.
29) The method of claim 22 including changing the viewing direction using the input device.
30) The method of claim 22 including superimposing objects into the display of multiple successive frames at a location that coincides with apparent movement in the 3-D space.
31) The method of claim 22 including compressing each panorama image using intraframe compression to form the key-frame images, and during playback selecting a portion of the key frame that includes the desired view selected by the user for local decompression.
32) The method of claim 22 including transforming the panorama images to a rectangular projection prior to compression.
33) The system recited in claim 1 wherein said program adds at least one sound track to each panorama.
34) The system recited in claim 1 wherein said sequence of images includes break points where said sequence can continue in at least one of two directions.
35) The system recited in claim 1 wherein the portion of said linked images displayed is determined by the direction of view selected by a user and wherein said direction of view remains constant at break points until changed by the user.
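The key-frame indexing and playback control recited in claims 11, 22, 25, and 26 can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the class name `PanFrameDirectory` and its methods are hypothetical, not identifiers from the patent.

```python
# Illustrative sketch of the panframe directory of claims 11 and 22.
# The class and method names are hypothetical, not from the patent; they
# only demonstrate index-based access to key frames in a stored file.
class PanFrameDirectory:
    def __init__(self, frame_offsets):
        # Byte offset of each intraframe-compressed key frame in the file,
        # so any frame can be decompressed independently (claim 14).
        self.offsets = list(frame_offsets)

    def locate(self, frame_index):
        """Byte offset at which to read one key frame."""
        return self.offsets[frame_index]

    def playback_order(self, n=1, reverse=False):
        """Frame indexes to display: stepping every nth index varies travel
        speed (claim 26); reversing the order reverses travel (claim 25)."""
        order = list(range(0, len(self.offsets), n))
        return order[::-1] if reverse else order

directory = PanFrameDirectory([0, 4096, 8192, 12288, 16384])
double_speed = directory.playback_order(n=2)        # every 2nd key frame
reverse_travel = directory.playback_order(reverse=True)
```

Because each key frame is intraframe-compressed, the player can seek directly to any offset in the directory and decompress only the view-window portion of that frame, as claims 14 and 31 describe.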
CA002371349A 1998-05-13 1999-05-12 Panoramic movies which simulate movement through multidimensional space Abandoned CA2371349A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US8531998P 1998-05-13 1998-05-13
US60/085,319 1998-05-13
PCT/US1999/010403 WO1999059026A2 (en) 1998-05-13 1999-05-12 Panoramic movies which simulate movement through multidimensional space

Publications (1)

Publication Number Publication Date
CA2371349A1 true CA2371349A1 (en) 1999-11-18

Family

ID=22190816

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002371349A Abandoned CA2371349A1 (en) 1998-05-13 1999-05-12 Panoramic movies which simulate movement through multidimensional space

Country Status (6)

Country Link
US (2) US6337683B1 (en)
EP (1) EP1099343A4 (en)
JP (1) JP2002514875A (en)
AU (1) AU4184399A (en)
CA (1) CA2371349A1 (en)
WO (1) WO1999059026A2 (en)

Families Citing this family (208)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337683B1 (en) 1998-05-13 2002-01-08 Imove Inc. Panoramic movies which simulate movement through multidimensional space
AU4196299A (en) * 1998-05-23 1999-12-13 Eolas Technologies, Incorporated Identification of features of multi-dimensional image data in hypermedia systems
DE19921488A1 (en) * 1999-05-08 2000-11-16 Bosch Gmbh Robert Method and device for monitoring the interior and surroundings of a vehicle
US7620909B2 (en) * 1999-05-12 2009-11-17 Imove Inc. Interactive image seamer for panoramic images
US7050085B1 (en) * 2000-10-26 2006-05-23 Imove, Inc. System and method for camera calibration
US20040075738A1 (en) * 1999-05-12 2004-04-22 Sean Burke Spherical surveillance system architecture
US6738073B2 (en) * 1999-05-12 2004-05-18 Imove, Inc. Camera system with both a wide angle view and a high resolution view
US6690374B2 (en) 1999-05-12 2004-02-10 Imove, Inc. Security camera system for tracking moving objects in both forward and reverse directions
US20020060692A1 (en) * 1999-11-16 2002-05-23 Pixel Kinetix, Inc. Method for increasing multimedia data accessibility
US6434747B1 (en) 2000-01-19 2002-08-13 Individual Network, Inc. Method and system for providing a customized media list
US6693959B1 (en) * 2000-03-03 2004-02-17 Ati International Srl Method and apparatus for indexing and locating key frames in streaming and variable-frame-length data
CA2411852A1 (en) * 2000-06-09 2001-12-13 Imove, Inc. Streaming panoramic video
US20010056574A1 (en) * 2000-06-26 2001-12-27 Richards Angus Duncan VTV system
US6859557B1 (en) * 2000-07-07 2005-02-22 Microsoft Corp. System and method for selective decoding and decompression
IL139995A (en) 2000-11-29 2007-07-24 Rvc Llc System and method for spherical stereoscopic photographing
FR2817441B1 (en) * 2000-11-29 2004-01-16 Thomson Multimedia Sa METHOD OF VIEWING A VIDEO SEQUENCE IN A PANORAMA WINDOW
JP2002209208A (en) * 2001-01-11 2002-07-26 Mixed Reality Systems Laboratory Inc Image processing unit and its method, and storage medium
US7126630B1 (en) * 2001-02-09 2006-10-24 Kujin Lee Method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and dynamic range extension method
US6747647B2 (en) * 2001-05-02 2004-06-08 Enroute, Inc. System and method for displaying immersive video
US7012637B1 (en) * 2001-07-27 2006-03-14 Be Here Corporation Capture structure for alignment of multi-camera capture systems
US6947059B2 (en) * 2001-08-10 2005-09-20 Micoy Corporation Stereoscopic panoramic image capture device
FR2828754A1 (en) * 2001-08-14 2003-02-21 Koninkl Philips Electronics Nv VISUALIZATION OF A PANORAMIC VIDEO EDITION BY APPLYING NAVIGATION COMMANDS TO THE SAME
JP5037765B2 (en) * 2001-09-07 2012-10-03 株式会社トプコン Operator guidance system
US20030179216A1 (en) * 2002-03-22 2003-09-25 Enroute, Inc. Multi-resolution video-caching scheme for interactive and immersive videos
US20030220971A1 (en) * 2002-05-23 2003-11-27 International Business Machines Corporation Method and apparatus for video conferencing with audio redirection within a 360 degree view
WO2004017138A1 (en) * 2002-08-19 2004-02-26 Koninklijke Philips Electronics N.V. Projection unit for displaying multiple images
GB0226002D0 (en) * 2002-11-07 2002-12-11 Home Networking Ltd Surveillance device
US9063633B2 (en) * 2006-03-30 2015-06-23 Arjuna Indraeswaran Rajasingham Virtual navigation system for virtual and real spaces
JP3931843B2 (en) * 2003-06-13 2007-06-20 株式会社日立製作所 Recording medium and reproducing method
US20050046698A1 (en) * 2003-09-02 2005-03-03 Knight Andrew Frederick System and method for producing a selectable view of an object space
JP2005165045A (en) * 2003-12-03 2005-06-23 Denso Corp Electronic apparatus with map display function and program
JP2007517264A (en) 2003-12-26 2007-06-28 マイコイ・コーポレーション Multidimensional imaging apparatus, system and method
US20060114251A1 (en) * 2004-02-11 2006-06-01 Miller Jacob J Methods for simulating movement of a computer user through a remote environment
US7921136B1 (en) * 2004-03-11 2011-04-05 Navteq North America, Llc Method and system for using geographic data for developing scenes for entertainment features
US7746376B2 (en) * 2004-06-16 2010-06-29 Felipe Mendoza Method and apparatus for accessing multi-dimensional mapping and information
US7183549B2 (en) * 2004-09-09 2007-02-27 Flir Systems, Inc. Multiple camera systems and methods
US7381952B2 (en) * 2004-09-09 2008-06-03 Flir Systems, Inc. Multiple camera systems and methods
US8510283B2 (en) 2006-07-31 2013-08-13 Ricoh Co., Ltd. Automatic adaption of an image recognition system to image capture devices
US8184155B2 (en) 2007-07-11 2012-05-22 Ricoh Co. Ltd. Recognition and tracking using invisible junctions
US8825682B2 (en) 2006-07-31 2014-09-02 Ricoh Co., Ltd. Architecture for mixed media reality retrieval of locations and registration of images
US8335789B2 (en) 2004-10-01 2012-12-18 Ricoh Co., Ltd. Method and system for document fingerprint matching in a mixed media environment
US8838591B2 (en) 2005-08-23 2014-09-16 Ricoh Co., Ltd. Embedding hot spots in electronic documents
US8521737B2 (en) 2004-10-01 2013-08-27 Ricoh Co., Ltd. Method and system for multi-tier image matching in a mixed media environment
US8385589B2 (en) 2008-05-15 2013-02-26 Berna Erol Web-based content detection in images, extraction and recognition
US8600989B2 (en) 2004-10-01 2013-12-03 Ricoh Co., Ltd. Method and system for image matching in a mixed media environment
US8949287B2 (en) 2005-08-23 2015-02-03 Ricoh Co., Ltd. Embedding hot spots in imaged documents
US8156427B2 (en) * 2005-08-23 2012-04-10 Ricoh Co. Ltd. User interface for mixed media reality
US8156115B1 (en) 2007-07-11 2012-04-10 Ricoh Co. Ltd. Document-based networking with mixed media reality
US8156116B2 (en) 2006-07-31 2012-04-10 Ricoh Co., Ltd Dynamic presentation of targeted information in a mixed media reality recognition system
US7970171B2 (en) 2007-01-18 2011-06-28 Ricoh Co., Ltd. Synthetic image and video generation from ground truth data
US8144921B2 (en) 2007-07-11 2012-03-27 Ricoh Co., Ltd. Information retrieval using invisible junctions and geometric constraints
US9384619B2 (en) 2006-07-31 2016-07-05 Ricoh Co., Ltd. Searching media content for objects specified using identifiers
US7702673B2 (en) 2004-10-01 2010-04-20 Ricoh Co., Ltd. System and methods for creation and use of a mixed media environment
US9405751B2 (en) 2005-08-23 2016-08-02 Ricoh Co., Ltd. Database for mixed media document system
US8332401B2 (en) 2004-10-01 2012-12-11 Ricoh Co., Ltd Method and system for position-based image matching in a mixed media environment
US8369655B2 (en) 2006-07-31 2013-02-05 Ricoh Co., Ltd. Mixed media reality recognition using multiple specialized indexes
US8176054B2 (en) 2007-07-12 2012-05-08 Ricoh Co. Ltd Retrieving electronic documents by converting them to synthetic text
US8856108B2 (en) 2006-07-31 2014-10-07 Ricoh Co., Ltd. Combining results of image retrieval processes
US9171202B2 (en) 2005-08-23 2015-10-27 Ricoh Co., Ltd. Data organization and access for mixed media document system
US9373029B2 (en) * 2007-07-11 2016-06-21 Ricoh Co., Ltd. Invisible junction feature recognition for document security or annotation
US8276088B2 (en) 2007-07-11 2012-09-25 Ricoh Co., Ltd. User interface for three-dimensional navigation
US9530050B1 (en) 2007-07-11 2016-12-27 Ricoh Co., Ltd. Document annotation sharing
US8195659B2 (en) * 2005-08-23 2012-06-05 Ricoh Co. Ltd. Integration and use of mixed media documents
US8868555B2 (en) 2006-07-31 2014-10-21 Ricoh Co., Ltd. Computation of a recongnizability score (quality predictor) for image retrieval
WO2006053271A1 (en) 2004-11-12 2006-05-18 Mok3, Inc. Method for inter-scene transitions
US7965314B1 (en) 2005-02-09 2011-06-21 Flir Systems, Inc. Foveal camera systems and methods
US7663662B2 (en) * 2005-02-09 2010-02-16 Flir Systems, Inc. High and low resolution camera systems and methods
US7872665B2 (en) 2005-05-13 2011-01-18 Micoy Corporation Image capture and processing
NO323509B1 (en) * 2005-08-10 2007-05-29 Telenor Asa Method of animating a series of still images
EP1955205B1 (en) * 2005-11-15 2012-08-29 Yissum Research Development Company Of The Hebrew University Of Jerusalem Method and system for producing a video synopsis
US8949235B2 (en) * 2005-11-15 2015-02-03 Yissum Research Development Company Of The Hebrew University Of Jerusalem Ltd. Methods and systems for producing a video synopsis using clustering
US8130330B2 (en) * 2005-12-05 2012-03-06 Seiko Epson Corporation Immersive surround visual fields
US20070141545A1 (en) * 2005-12-05 2007-06-21 Kar-Han Tan Content-Based Indexing and Retrieval Methods for Surround Video Synthesis
US20070126932A1 (en) * 2005-12-05 2007-06-07 Kiran Bhat Systems and methods for utilizing idle display area
US20070126864A1 (en) * 2005-12-05 2007-06-07 Kiran Bhat Synthesizing three-dimensional surround visual field
DE102006003524A1 (en) * 2006-01-24 2007-07-26 Oerlikon Contraves Ag Panoramic view system especially in combat vehicles
US20070174010A1 (en) * 2006-01-24 2007-07-26 Kiran Bhat Collective Behavior Modeling for Content Synthesis
US7834910B2 (en) * 2006-03-01 2010-11-16 David M. DeLorme Method and apparatus for panoramic imaging
JP4825561B2 (en) * 2006-03-29 2011-11-30 株式会社東芝 Image display device
JP4876687B2 (en) * 2006-04-19 2012-02-15 株式会社日立製作所 Attention level measuring device and attention level measuring system
US20080018792A1 (en) * 2006-07-19 2008-01-24 Kiran Bhat Systems and Methods for Interactive Surround Visual Field
US8201076B2 (en) * 2006-07-31 2012-06-12 Ricoh Co., Ltd. Capturing symbolic information from documents upon printing
US9063952B2 (en) 2006-07-31 2015-06-23 Ricoh Co., Ltd. Mixed media reality recognition with image tracking
US8489987B2 (en) 2006-07-31 2013-07-16 Ricoh Co., Ltd. Monitoring and analyzing creation and usage of visual content using image and hotspot interaction
US9176984B2 (en) * 2006-07-31 2015-11-03 Ricoh Co., Ltd Mixed media reality retrieval of differentially-weighted links
US9020966B2 (en) 2006-07-31 2015-04-28 Ricoh Co., Ltd. Client device for interacting with a mixed media reality recognition system
US8676810B2 (en) 2006-07-31 2014-03-18 Ricoh Co., Ltd. Multiple index mixed media reality recognition using unequal priority indexes
US8078603B1 (en) 2006-10-05 2011-12-13 Blinkx Uk Ltd Various methods and apparatuses for moving thumbnails
US8196045B2 (en) * 2006-10-05 2012-06-05 Blinkx Uk Limited Various methods and apparatus for moving thumbnails with metadata
AU2007319441A1 (en) 2006-11-13 2008-05-22 Everyscape, Inc. Method for scripting inter-scene transitions
US8094182B2 (en) 2006-11-16 2012-01-10 Imove, Inc. Distributed video sensor panoramic imaging system
US8074241B2 (en) * 2007-03-30 2011-12-06 The Board Of Trustees Of The Leland Stanford Jr. University Process for displaying and navigating panoramic video, and method and user interface for streaming panoramic video and images between a server and browser-based client application
FR2916866A1 (en) * 2007-05-29 2008-12-05 Thomson Licensing Sas METHOD FOR CREATING AND REPRODUCING A PANORAMIC SOUND IMAGE, AND APPARATUS FOR REPRODUCING SUCH IMAGE
FR2918240A1 (en) * 2007-06-26 2009-01-02 Thomson Licensing Sa METHOD FOR CREATING A SOUND SUITE OF PHOTOGRAPHS, AND APPARATUS FOR CREATING SUCH A SOUND SUITE
US10063848B2 (en) * 2007-08-24 2018-08-28 John G. Posa Perspective altering display system
EP2048640A2 (en) * 2007-10-12 2009-04-15 Gruentjens, Norbert A method and an apparatus for controlling a simulated moving object
US20090184981A1 (en) * 2008-01-23 2009-07-23 De Matos Lucio D Orazio Pedro system, method and computer program product for displaying images according to user position
US8174561B2 (en) * 2008-03-14 2012-05-08 Sony Ericsson Mobile Communications Ab Device, method and program for creating and displaying composite images generated from images related by capture position
US8319819B2 (en) * 2008-03-26 2012-11-27 Cisco Technology, Inc. Virtual round-table videoconference
US20100050221A1 (en) * 2008-06-20 2010-02-25 Mccutchen David J Image Delivery System with Image Quality Varying with Frame Rate
US8694658B2 (en) 2008-09-19 2014-04-08 Cisco Technology, Inc. System and method for enabling communication sessions in a network environment
JP2010103692A (en) * 2008-10-22 2010-05-06 Canon Inc Image output apparatus, image output method, and control program
KR20100062575A (en) * 2008-12-02 2010-06-10 삼성테크윈 주식회사 Method to control monitoring camera and control apparatus using the same
US20100156906A1 (en) * 2008-12-19 2010-06-24 David Montgomery Shot generation from previsualization of a physical environment
US8477175B2 (en) 2009-03-09 2013-07-02 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US8659639B2 (en) 2009-05-29 2014-02-25 Cisco Technology, Inc. System and method for extending communications between participants in a conferencing environment
US9843743B2 (en) 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
US8385660B2 (en) 2009-06-24 2013-02-26 Ricoh Co., Ltd. Mixed media reality indexing and retrieval for repeated content
US9082297B2 (en) 2009-08-11 2015-07-14 Cisco Technology, Inc. System and method for verifying parameters in an audiovisual environment
KR101631912B1 (en) * 2009-11-03 2016-06-20 엘지전자 주식회사 Mobile terminal and control method thereof
SG171494A1 (en) * 2009-12-01 2011-06-29 Creative Tech Ltd A method for showcasing a built-up structure and an apparatus enabling the aforementioned method
US20110141226A1 (en) * 2009-12-11 2011-06-16 Fotonation Ireland Limited Panorama imaging based on a lo-res map
US20110141225A1 (en) * 2009-12-11 2011-06-16 Fotonation Ireland Limited Panorama Imaging Based on Low-Res Images
US10080006B2 (en) 2009-12-11 2018-09-18 Fotonation Limited Stereoscopic (3D) panorama creation on handheld device
US20110141224A1 (en) * 2009-12-11 2011-06-16 Fotonation Ireland Limited Panorama Imaging Using Lo-Res Images
US20110141229A1 (en) * 2009-12-11 2011-06-16 Fotonation Ireland Limited Panorama imaging using super-resolution
US8294748B2 (en) * 2009-12-11 2012-10-23 DigitalOptics Corporation Europe Limited Panorama imaging using a blending map
US8447136B2 (en) * 2010-01-12 2013-05-21 Microsoft Corporation Viewing media in the context of street-level images
US9225916B2 (en) 2010-03-18 2015-12-29 Cisco Technology, Inc. System and method for enhancing video images in a conferencing environment
US9313452B2 (en) 2010-05-17 2016-04-12 Cisco Technology, Inc. System and method for providing retracting optics in a video conferencing environment
US8896655B2 (en) 2010-08-31 2014-11-25 Cisco Technology, Inc. System and method for providing depth adaptive video conferencing
US8599865B2 (en) 2010-10-26 2013-12-03 Cisco Technology, Inc. System and method for provisioning flows in a mobile network environment
US8902244B2 (en) 2010-11-15 2014-12-02 Cisco Technology, Inc. System and method for providing enhanced graphics in a video environment
US9143725B2 (en) 2010-11-15 2015-09-22 Cisco Technology, Inc. System and method for providing enhanced graphics in a video environment
US9338394B2 (en) 2010-11-15 2016-05-10 Cisco Technology, Inc. System and method for providing enhanced audio in a video environment
US9111138B2 (en) 2010-11-30 2015-08-18 Cisco Technology, Inc. System and method for gesture interface control
US8692862B2 (en) 2011-02-28 2014-04-08 Cisco Technology, Inc. System and method for selection of video data in a video conference environment
US8934026B2 (en) 2011-05-12 2015-01-13 Cisco Technology, Inc. System and method for video coding in a dynamic environment
US9058331B2 (en) 2011-07-27 2015-06-16 Ricoh Co., Ltd. Generating a conversation in a social network based on visual search results
US9557885B2 (en) 2011-08-09 2017-01-31 Gopro, Inc. Digital media editing
US8947493B2 (en) 2011-11-16 2015-02-03 Cisco Technology, Inc. System and method for alerting a participant in a video conference
WO2013093175A1 (en) 2011-12-22 2013-06-27 Nokia Corporation A method, an apparatus and a computer program for determination of an audio track
US9681154B2 (en) 2012-12-06 2017-06-13 Patent Capital Group System and method for depth-guided filtering in a video conference environment
US9197682B2 (en) 2012-12-21 2015-11-24 Nokia Technologies Oy Method, apparatus, and computer program product for generating a video stream of a mapped route
US9449372B2 (en) 2013-08-22 2016-09-20 Bae Systems Information And Electronic Systems Integration Inc. Dust removal technology for driver vision leverage
US9973692B2 (en) 2013-10-03 2018-05-15 Flir Systems, Inc. Situational awareness by compressed display of panoramic views
US8977113B1 (en) * 2013-10-25 2015-03-10 Joseph Rumteen Mobile device video decision tree
US10032479B2 (en) * 2014-01-31 2018-07-24 Nbcuniversal Media, Llc Fingerprint-defined segment-based content delivery
WO2015134537A1 (en) 2014-03-04 2015-09-11 Gopro, Inc. Generation of video based on spherical content
US9570113B2 (en) 2014-07-03 2017-02-14 Gopro, Inc. Automatic generation of video and directional audio from spherical content
US10204658B2 (en) 2014-07-14 2019-02-12 Sony Interactive Entertainment Inc. System and method for use in playing back panorama video content
US9685194B2 (en) 2014-07-23 2017-06-20 Gopro, Inc. Voice-based video tagging
US9792502B2 (en) 2014-07-23 2017-10-17 Gopro, Inc. Generating video summaries for a video using video summary templates
US9734870B2 (en) 2015-01-05 2017-08-15 Gopro, Inc. Media identifier generation for camera-captured media
US9679605B2 (en) 2015-01-29 2017-06-13 Gopro, Inc. Variable playback speed template for video editing application
WO2016187235A1 (en) 2015-05-20 2016-11-24 Gopro, Inc. Virtual lens simulation for video and photo cropping
KR102551239B1 (en) * 2015-09-02 2023-07-05 인터디지털 씨이 페이튼트 홀딩스, 에스에이에스 Method, apparatus and system for facilitating navigation in an extended scene
EP3353565B1 (en) 2015-09-23 2021-08-04 Nokia Technologies Oy Video recording method and apparatus
US10468066B2 (en) 2015-09-23 2019-11-05 Nokia Technologies Oy Video content selection
US9721611B2 (en) 2015-10-20 2017-08-01 Gopro, Inc. System and method of generating video from video clips based on moments of interest within the video clips
US10204273B2 (en) 2015-10-20 2019-02-12 Gopro, Inc. System and method of providing recommendations of moments of interest within video clips post capture
US9473758B1 (en) * 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay
US10109319B2 (en) 2016-01-08 2018-10-23 Gopro, Inc. Digital media editing
US9812175B2 (en) 2016-02-04 2017-11-07 Gopro, Inc. Systems and methods for annotating a video
US9838731B1 (en) 2016-04-07 2017-12-05 Gopro, Inc. Systems and methods for audio track selection in video editing with audio mixing option
US9838730B1 (en) 2016-04-07 2017-12-05 Gopro, Inc. Systems and methods for audio track selection in video editing
US9794632B1 (en) 2016-04-07 2017-10-17 Gopro, Inc. Systems and methods for synchronization based on audio track changes in video editing
EP3249651B1 (en) 2016-05-23 2018-08-29 Axis AB Generating a summary video sequence from a source video sequence
US9922398B1 (en) 2016-06-30 2018-03-20 Gopro, Inc. Systems and methods for generating stabilized visual content using spherical visual content
US11089280B2 (en) 2016-06-30 2021-08-10 Sony Interactive Entertainment Inc. Apparatus and method for capturing and displaying segmented content
US10185891B1 (en) 2016-07-08 2019-01-22 Gopro, Inc. Systems and methods for compact convolutional neural networks
US9836853B1 (en) 2016-09-06 2017-12-05 Gopro, Inc. Three-dimensional convolutional neural networks for video highlight detection
US10137893B2 (en) * 2016-09-26 2018-11-27 Keith J. Hanna Combining driver alertness with advanced driver assistance systems (ADAS)
JP7065836B6 (en) 2016-09-29 2022-06-06 コーニンクレッカ フィリップス エヌ ヴェ Image processing
US10043552B1 (en) 2016-10-08 2018-08-07 Gopro, Inc. Systems and methods for providing thumbnails for video content
US10684679B1 (en) 2016-10-21 2020-06-16 Gopro, Inc. Systems and methods for generating viewpoints for visual content based on gaze
US10284809B1 (en) 2016-11-07 2019-05-07 Gopro, Inc. Systems and methods for intelligently synchronizing events in visual content with musical features in audio content
US10262639B1 (en) 2016-11-08 2019-04-16 Gopro, Inc. Systems and methods for detecting musical features in audio content
US10244215B2 (en) 2016-11-29 2019-03-26 Microsoft Technology Licensing, Llc Re-projecting flat projections of pictures of panoramic video for rendering by application
US10244200B2 (en) 2016-11-29 2019-03-26 Microsoft Technology Licensing, Llc View-dependent operations during playback of panoramic video
US10242714B2 (en) 2016-12-19 2019-03-26 Microsoft Technology Licensing, Llc Interface for application-specified playback of panoramic video
US10534966B1 (en) 2017-02-02 2020-01-14 Gopro, Inc. Systems and methods for identifying activities and/or events represented in a video
US10127943B1 (en) 2017-03-02 2018-11-13 Gopro, Inc. Systems and methods for modifying videos based on music
US10185895B1 (en) 2017-03-23 2019-01-22 Gopro, Inc. Systems and methods for classifying activities captured within images
US10083718B1 (en) 2017-03-24 2018-09-25 Gopro, Inc. Systems and methods for editing videos based on motion
TWI659394B (en) * 2017-03-31 2019-05-11 聚星電子股份有限公司 Image processing method and image processing device
US10187690B1 (en) 2017-04-24 2019-01-22 Gopro, Inc. Systems and methods to detect and correlate user responses to media content
US10469818B1 (en) 2017-07-11 2019-11-05 Gopro, Inc. Systems and methods for facilitating consumption of video content
US10375306B2 (en) 2017-07-13 2019-08-06 Zillow Group, Inc. Capture and use of building interior data from mobile devices
US10530997B2 (en) 2017-07-13 2020-01-07 Zillow Group, Inc. Connecting and using building interior data acquired from mobile devices
DE112018002436T5 (en) * 2017-09-27 2020-02-20 Mediatek Inc. Method for processing a projection-based frame that includes at least one projection face packed in a 360-degree virtual reality projection layout
CN107945112B (en) 2017-11-17 2020-12-08 浙江大华技术股份有限公司 Panoramic image splicing method and device
KR102177401B1 (en) * 2018-02-02 2020-11-11 재단법인 다차원 스마트 아이티 융합시스템 연구단 A noiseless omnidirectional camera device
US10643386B2 (en) 2018-04-11 2020-05-05 Zillow Group, Inc. Presenting image transition sequences between viewing locations
US10587807B2 (en) 2018-05-18 2020-03-10 Gopro, Inc. Systems and methods for stabilizing videos
US10764494B2 (en) 2018-05-25 2020-09-01 Microsoft Technology Licensing, Llc Adaptive panoramic video streaming using composite pictures
US10666863B2 (en) 2018-05-25 2020-05-26 Microsoft Technology Licensing, Llc Adaptive panoramic video streaming using overlapping partitioned sections
US10750092B2 (en) 2018-09-19 2020-08-18 Gopro, Inc. Systems and methods for stabilizing videos
CA3058602C (en) 2018-10-11 2023-01-24 Zillow Group, Inc. Automated mapping information generation from inter-connected images
US10809066B2 (en) 2018-10-11 2020-10-20 Zillow Group, Inc. Automated mapping information generation from inter-connected images
US10708507B1 (en) 2018-10-11 2020-07-07 Zillow Group, Inc. Automated control of image acquisition via use of acquisition device sensors
US11243656B2 (en) 2019-08-28 2022-02-08 Zillow, Inc. Automated tools for generating mapping information for buildings
CN110515264B (en) * 2019-08-29 2021-09-10 深圳市圆周率软件科技有限责任公司 System for testing multi-lens exposure time of panoramic camera
US11164368B2 (en) 2019-10-07 2021-11-02 Zillow, Inc. Providing simulated lighting information for three-dimensional building models
US11164361B2 (en) 2019-10-28 2021-11-02 Zillow, Inc. Generating floor maps for buildings from automated analysis of visual data of the buildings' interiors
US10825247B1 (en) 2019-11-12 2020-11-03 Zillow Group, Inc. Presenting integrated building information using three-dimensional building models
US11676344B2 (en) 2019-11-12 2023-06-13 MFTB Holdco, Inc. Presenting building information using building models
US11405549B2 (en) 2020-06-05 2022-08-02 Zillow, Inc. Automated generation on mobile devices of panorama images for building locations and subsequent use
US11514674B2 (en) 2020-09-04 2022-11-29 Zillow, Inc. Automated analysis of image contents to determine the acquisition location of the image
US11592969B2 (en) 2020-10-13 2023-02-28 MFTB Holdco, Inc. Automated tools for generating building mapping information
US11481925B1 (en) 2020-11-23 2022-10-25 Zillow, Inc. Automated determination of image acquisition locations in building interiors using determined room shapes
CA3142154A1 (en) 2021-01-08 2022-07-08 Zillow, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
US11252329B1 (en) 2021-01-08 2022-02-15 Zillow, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
US11836973B2 (en) 2021-02-25 2023-12-05 MFTB Holdco, Inc. Automated direction of capturing in-room information for use in usability assessment of buildings
US11790648B2 (en) 2021-02-25 2023-10-17 MFTB Holdco, Inc. Automated usability assessment of buildings using visual data of captured in-room images
US11501492B1 (en) 2021-07-27 2022-11-15 Zillow, Inc. Automated room shape determination using visual data of multiple captured in-room images
US11842464B2 (en) 2021-09-22 2023-12-12 MFTB Holdco, Inc. Automated exchange and use of attribute information between building images of multiple types
US11830135B1 (en) 2022-07-13 2023-11-28 MFTB Holdco, Inc. Automated building identification using floor plans and acquired building images

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3909121A (en) 1974-06-25 1975-09-30 Mesquita Cardoso Edgar Antonio Panoramic photographic methods
US4873585A (en) * 1984-09-07 1989-10-10 Ivex Corporation Method of selectively retrieving video images from a video reproducer for simulating movement
US4807158A (en) 1986-09-30 1989-02-21 Daleco/Ivex Partners, Ltd. Method and apparatus for sampling images to simulate movement within a multidimensional space
FR2607298B1 (en) * 1986-11-24 1990-02-09 Giravions Dorand Method for reading a recorded moving scene, in particular on a video disc, and application to driving simulators
US4853764A (en) * 1988-09-16 1989-08-01 Pedalo, Inc. Method and apparatus for screenless panoramic stereo TV system
US5023725A (en) * 1989-10-23 1991-06-11 Mccutchen David Method and apparatus for dodecahedral imaging system
US5235198A (en) 1989-11-29 1993-08-10 Eastman Kodak Company Non-interlaced interline transfer CCD image sensing device with simplified electrode structure for each pixel
US5130794A (en) 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5801716A (en) 1990-08-16 1998-09-01 Canon Kabushiki Kaisha Pipeline structures for full-color computer graphics
US6002430A (en) 1994-01-31 1999-12-14 Interactive Pictures Corporation Method and apparatus for simultaneous capture of a spherical image
US5355450A (en) 1992-04-10 1994-10-11 Avid Technology, Inc. Media composer with adjustable source material compression
US5684937A (en) 1992-12-14 1997-11-04 Oxaal; Ford Method and apparatus for performing perspective transformation on visible stimuli
US5903782A (en) 1995-11-15 1999-05-11 Oxaal; Ford Method and apparatus for producing a three-hundred and sixty degree spherical visual data set
US6243099B1 (en) 1996-11-14 2001-06-05 Ford Oxaal Method for interactive viewing full-surround image data and apparatus therefor
US5495576A (en) 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
JP3253405B2 (en) * 1993-03-26 2002-02-04 オリンパス光学工業株式会社 Two-group zoom lens
WO1994029999A1 (en) * 1993-06-16 1994-12-22 Gould Kim V W System and method for transmitting video material
EP0650299B1 (en) 1993-10-20 1998-07-22 Laboratoires D'electronique Philips S.A.S. Method of processing luminance levels in a composite image and image processing system applying this method
US5677981A (en) 1994-06-14 1997-10-14 Matsushita Electric Industrial Co., Ltd. Video signal recording apparatus which receives a digital progressive scan TV signal and switches the progressive signal frame by frame alternately
US5774569A (en) 1994-07-25 1998-06-30 Waldenmaier; H. Eugene W. Surveillance system
US5886745A (en) 1994-12-09 1999-03-23 Matsushita Electric Industrial Co., Ltd. Progressive scanning conversion apparatus
US5729471A (en) * 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US5990934A (en) 1995-04-28 1999-11-23 Lucent Technologies, Inc. Method and system for panoramic viewing
US5703604A (en) * 1995-05-22 1997-12-30 Dodeca Llc Immersive dodecahedral video viewing system
US5657073A (en) 1995-06-01 1997-08-12 Panoramic Viewing Systems, Inc. Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US5748121A (en) 1995-12-06 1998-05-05 Intel Corporation Generation of huffman tables for signal encoding
US5872575A (en) * 1996-02-14 1999-02-16 Digital Media Interactive Method and system for the creation of and navigation through a multidimensional space using encoded digital video
US5852673A (en) 1996-03-27 1998-12-22 Chroma Graphics, Inc. Method for general image manipulation and composition
US5708469A (en) * 1996-05-03 1998-01-13 International Business Machines Corporation Multiple view telepresence camera system using a wire cage which surrounds a plurality of movable cameras and identifies fields of view
WO1997042601A1 (en) * 1996-05-06 1997-11-13 Sas Institute, Inc. Integrated interactive multimedia process
US6118474A (en) 1996-05-10 2000-09-12 The Trustees Of Columbia University In The City Of New York Omnidirectional imaging apparatus
US6118454A (en) 1996-10-16 2000-09-12 Oxaal; Ford Methods and apparatuses for producing a spherical visual data set using a spherical mirror and one or more cameras with long lenses
US6058397A (en) * 1997-04-08 2000-05-02 Mitsubishi Electric Information Technology Center America, Inc. 3D virtual environment creation management and delivery system
US6043837A (en) 1997-05-08 2000-03-28 Be Here Corporation Method and apparatus for electronically distributing images from a panoptic camera system
US5933137A (en) 1997-06-10 1999-08-03 Flashpoint Technology, Inc. Method and system for accelerating a user interface of an image capture unit during play mode
US6101534A (en) 1997-09-03 2000-08-08 Rothschild; Leigh M. Interactive, remote, computer interface system
US6226035B1 (en) 1998-03-04 2001-05-01 Cyclo Vision Technologies, Inc. Adjustable imaging system with wide angle capability
US6215519B1 (en) 1998-03-04 2001-04-10 The Trustees Of Columbia University In The City Of New York Combined wide angle and narrow angle imaging system and method for surveillance and monitoring
US6237647B1 (en) 1998-04-06 2001-05-29 William Pong Automatic refueling station
US6337683B1 (en) 1998-05-13 2002-01-08 Imove Inc. Panoramic movies which simulate movement through multidimensional space
US6323858B1 (en) 1998-05-13 2001-11-27 Imove Inc. System for digitally capturing and recording panoramic movies
US6346950B1 (en) 1999-05-20 2002-02-12 Compaq Computer Corporation System and method for display images using anamorphic video
US6567086B1 (en) 2000-07-25 2003-05-20 Enroute, Inc. Immersive video system using multiple video streams

Also Published As

Publication number Publication date
US20020063709A1 (en) 2002-05-30
WO1999059026A3 (en) 2000-01-06
JP2002514875A (en) 2002-05-21
EP1099343A4 (en) 2007-10-17
US6654019B2 (en) 2003-11-25
US6337683B1 (en) 2002-01-08
WO1999059026A2 (en) 1999-11-18
AU4184399A (en) 1999-11-29
EP1099343A2 (en) 2001-05-16

Similar Documents

Publication Publication Date Title
US6337683B1 (en) Panoramic movies which simulate movement through multidimensional space
CA2372110C (en) A system for digitally capturing and recording panoramic images
US20020046218A1 (en) System for digitally capturing and recording panoramic movies
US6559846B1 (en) System and process for viewing panoramic video
JP3177221B2 (en) Method and apparatus for displaying an image of an interesting scene
US9367942B2 (en) Method, system and software program for shooting and editing a film comprising at least one image of a 3D computer-generated animation
US8768097B2 (en) Image processing apparatus, moving image reproducing apparatus, and processing method and program therefor
US6956573B1 (en) Method and apparatus for efficiently representing storing and accessing video information
KR20100043139A (en) Image processing device, dynamic image reproduction device, and processing method and program in them
US8754959B2 (en) Image processing device, dynamic image reproduction device, and processing method and program in them
US8421871B2 (en) Method and apparatus for image pickup and image processing
JP5578011B2 (en) Method and apparatus for superimposing a wide-angle image
KR20100103775A (en) Image processor, animation reproduction apparatus, and processing method and program for the processor and apparatus
BRPI0807400A2 (en) Image capture apparatus, control method for an image capture apparatus, and program for an image capture apparatus
EP0976089A1 (en) Method and apparatus for efficiently representing, storing and accessing video information
US8515256B2 (en) Image processing apparatus, moving image reproducing apparatus, and processing method and program therefor
CN101627623A (en) Image processing device, dynamic image reproduction device, and processing method and program in them
JP2008131617A (en) Video processing apparatus
JP3372096B2 (en) Image information access device
JP3532823B2 (en) Image composition method and medium recording image composition program
JP2821395B2 (en) Image editing system
CN101617531A (en) Image processing apparatus, moving image playing device and processing method and program
JP3024204U (en) Chroma key system

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued

Effective date: 20050922