WO2009069165A2 - Transition method between two three-dimensional geo-referenced maps - Google Patents
Transition method between two three-dimensional geo-referenced maps
- Publication number
- WO2009069165A2 (PCT/IT2008/000700)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- image
- maps
- perspective
- transition
- Prior art date
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
Abstract
A connection method is disclosed between two types of maps, creating continuity in passing from one to the other. The passage occurs with a visual 'simulated landing' effect, by approaching and rotating the view of the virtual camera and changing the shot angles on the 2-D map, so as to overlap and join the perspective of the 2-D map with that of the 360° panoramic photograph. It is a post-processing transition application between at least two maps having the same geographic reference, expressed in latitude and longitude, the transition being characterised by a sequence of virtual perspective images that are intuitive and simple, yet derived from objects belonging to the image of the starting map.
Description
TRANSITION METHOD BETWEEN TWO THREE-DIMENSIONAL GEO-REFERENCED MAPS
The present invention refers to a transition method between two three-dimensional geo-referenced maps.
The main object of the method is to integrate a program of simulated or real navigation through geo-referenced maps with a graphic transition procedure between different maps.
Upon reaching such an objective, the "Users" of a network comprising a satellite navigation program are allowed to take part in an interesting and pleasant visual effect during the transition from a first three-dimensional map, obtained through a satellite, to a second, three-dimensional "immersive" map, namely a map composed of a sequence of 360° geo-referenced photographs.
The dealt-with subject falls within: photographic systems (G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR); methods of use of photographic systems (G06K9/00: METHODS OR ARRANGEMENTS FOR READING OR RECOGNISING PRINTED OR WRITTEN CHARACTERS OR FOR RECOGNISING PATTERNS, E.G. FINGERPRINTS); measuring and navigation systems (G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY); methods for transforming three-dimensional image models (G06T15/00: THREE DIMENSIONAL (3D) IMAGE RENDERING, E.G. FROM A MODEL TO A BIT-MAPPED IMAGE).
The proposed method integrates the acquisition of panoramic images, or sequences of panoramic images, and their processing into a spherical reference system that is in bi-univocal (one-to-one) correspondence with the plane system on which the image captured by a photo-camera lies; the display of panoramic, two- or three-dimensional photographs shows position and direction with respect to a geo-stationary reference.
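The patent does not spell out this bi-univocal plane/sphere relation; a minimal sketch, assuming an equirectangular panoramic image spanning 360° horizontally and 180° vertically (the function names and field of view are illustrative assumptions, not part of the patent), could look like this:

```python
import math

def plane_to_sphere(u, v, width, height):
    """Map pixel (u, v) of an equirectangular panorama to spherical
    coordinates: theta (azimuth, -pi..pi) and phi (elevation, -pi/2..pi/2).
    Assumes the image spans 360 deg horizontally and 180 deg vertically."""
    theta = (u / width) * 2.0 * math.pi - math.pi
    phi = math.pi / 2.0 - (v / height) * math.pi
    return theta, phi

def sphere_to_plane(theta, phi, width, height):
    """Inverse mapping, making the plane/sphere relation bi-univocal."""
    u = (theta + math.pi) / (2.0 * math.pi) * width
    v = (math.pi / 2.0 - phi) / math.pi * height
    return u, v
```

Because each mapping is the exact inverse of the other, a pixel round-trips to itself, which is the property the text calls "bi-univocal".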
The prior art is represented by document US2006/0209061A1, in which a "rendering" procedure is described, performed through a Graphics Processing Unit (GPU), whose main steps are: a) identification and subsequent mapping of vertices belonging to a first object; b) identification and mapping of vertices belonging to a second object; c) rendering of the transition between the corresponding vertices belonging to the first and the second object.
Performing the rendering according to the prior art implies a considerable waste of Graphics Processing Unit (GPU) memory and an operating slow-down of the system that manages the visual transition procedure. A simplified and at the same time efficient rendering has therefore been devised, suitable to satisfy the display requirements of geo-referenced maps starting from the above-described prior art.
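The prior-art steps a)-c) amount to interpolating between corresponding vertices of two objects. A hedged sketch (an illustration of the cited document's approach, not of the patented method) also makes the memory cost visible: every intermediate frame requires the full vertex sets of both objects.

```python
def lerp_vertices(src, dst, t):
    """Blend each vertex of a first object toward the corresponding
    vertex of a second object; t in [0, 1] is the transition progress."""
    if len(src) != len(dst):
        raise ValueError("objects must have corresponding vertices")
    return [tuple(a + t * (b - a) for a, b in zip(p, q))
            for p, q in zip(src, dst)]

# Building every intermediate frame keeps both complete vertex sets
# resident, which is the GPU memory cost noted for the prior art.
frames = [lerp_vertices([(0, 0, 0), (1, 0, 0)],
                        [(2, 4, 6), (3, 4, 6)], t / 4)
          for t in range(5)]
```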
The present invention deals with a method for creating the visual "landing" effect in an exact spot of a cartographic (aerial-photographic or satellite) map. Said "landing" effect must be able to create a feeling of visual continuity during the passage from consulting a flat (2D) satellite map to consulting a panoramic 360° photograph taken in a geo-referenced spot of the map itself.
The method starts from the above-described prior art to join the two types of maps, creating continuity in passing from one to the other. Such a passage occurs with a visual "simulated landing" effect: approaching and rotating the view of the virtual camera, and changing the shot angles on the 2-D map, so as to overlap and join the perspective of the 2-D map with the perspective of the 360° panoramic photograph.
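The "simulated landing" camera motion can be parameterised by a normalised time t: the altitude eases down toward eye level while the pitch rotates from a top-down view to the horizon. This is a hedged sketch, not text from the patent; the easing curve and the numeric defaults are illustrative assumptions.

```python
def landing_pose(t, start_altitude=500.0, eye_height=1.7):
    """Virtual-camera pose at normalised transition time t in [0, 1].
    Altitude eases quadratically from the aerial view down to eye level;
    pitch rotates from straight down (-90 deg) to the horizon (0 deg)."""
    altitude = eye_height + (start_altitude - eye_height) * (1.0 - t) ** 2
    pitch_deg = -90.0 * (1.0 - t)
    return altitude, pitch_deg
```

At t = 0 the camera looks straight down from the aerial view; at t = 1 it sits at eye level looking toward the horizon, ready for the cross-fade into the panoramic photograph.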
A rendering technique for immersive maps connected through a transition must in particular meet the following requirements, aimed at increasing the positive psychological feeling of a network user:
- creating an interesting and pleasant effect of a dynamic transition between different maps;
- transmitting the "feeling" of a real image transition from one map to the other, removing the doubt that it is a repetitive artefact independent of the different geographic areas;
- reducing the memory consumption of the Graphics Processing Unit (GPU).
The above and other objects and advantages of the invention, as will result from the following description, are obtained with a method as
described in claim 1. Preferred embodiments and non-trivial variations of the present invention are the subject matter of the dependent claims.
This is a method for manipulating immersive maps composed of a sequence of geo-located photographs, whose image rendering, starting from a two-dimensional or three-dimensional map, corresponds to an image projected in a virtual space in spherical coordinates. Said rendering comprises a post-processing application for operating a transition between at least two maps having the same geographic reference, expressed as latitude and longitude, the transition being characterised by a sequence of virtual perspective images that are intuitive and simple, yet derived from objects belonging to the starting map image.
The present invention will be better described by some preferred embodiments thereof, provided as a non-limiting example, with reference to the enclosed drawings in which:
FIG. 1 shows the virtual spherical space in which the annular image is projected (image rendering from the model to the bit-mapped image);
FIG. 2 shows the image projected by a viewer, composed of a three-dimensional map seen in plan view and the corresponding three-dimensional immersive map after processing in a spherical reference system with respect to the same geostationary reference;
FIG. 3 shows the 2-D image in plan view of a map of the satellite type;
FIG. 4a - 4c show the sequence of a transition of the map in FIG. 3 to the three-dimensional arrival map of FIG. 5a and 5b;
FIG. 5a and 5b show a plane image resulting from the projection of the image captured by an optics of the fish-eye, or parabolic type.
A transition has to be performed from the three-dimensional image 3, seen in plan view, to the three-dimensional image 4, seen in perspective. The two images 3 and 4 have in common their geostationary coordinates and the direction with which an observer sees both images with respect to an inertial reference.
In this case, the image 3 derives from a map of the satellite type, two-dimensional and projected in plan view; the image 4 is the projection in spherical coordinates 1 of a plane photograph resulting from the projection of the image captured by an optics of the fish-eye, or parabolic, type.
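The projection of a fish-eye photograph into spherical coordinates is not detailed in the text; a minimal sketch, assuming the common equidistant fish-eye model (an assumption here: the distance of a pixel from the image centre is proportional to the angle off the optical axis), could be:

```python
import math

def fisheye_to_ray(u, v, cx, cy, radius, fov_deg=180.0):
    """Map a pixel of a circular fish-eye image to a unit viewing ray,
    assuming the equidistant model: pixel distance from the image
    centre is proportional to the angle off the optical axis."""
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    theta = (r / radius) * math.radians(fov_deg) / 2.0  # angle off axis
    phi = math.atan2(dy, dx)                            # azimuth on the image
    sin_t = math.sin(theta)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), math.cos(theta))
```

Each ray can then be textured onto the virtual sphere of FIG. 1, which is how a plane fish-eye photograph becomes the spherical image 4.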
The transition procedure that sees the replacement of an image 3 with an image 4 provides for a sequence of perspective images 5', 5'', 5''', 6', 6'', 6''', of the type included in FIGS. 4a-4c, deriving from a starting image 5, 6, of the type included in FIG. 3. The perspective images are followed by a cross-fading effect. The transition procedure ends by presenting an image of the type included in FIGS. 5a and 5b, anticipated by a fading effect formed by overlapping a perspective image of the type in FIG. 4c and an image projected in the arrival map of the type in FIGS. 5a and 5b.
POST-PROCESSING OF TRANSITION WITH FADING
a) vision and navigation in the satellite map of FIG. 3;
b) 1/3 transition = FIG. 4a: first perspective distortion of the image in FIG. 3. It is the virtual perspective rotation of objects 5 and 6 obtained from FIG. 3;
c) 2/3 transition = FIG. 4b: second perspective distortion of the image in FIG. 3;
d) 3/3 transition = 1/2 fading = FIG. 4c: third perspective distortion of the image in FIG. 3, coupled with the first fading step, which allows replacing the satellite image 3 with the "street level" image 4. The image in the arrival map is overlapped on the image containing the rotated virtual objects depending on FIG. 3. The camera has almost landed, its position being about to lie on the "plane" where the image of the satellite map is projected, with a perspective view. Cross-fading occurs between the plane where the image of the satellite map is projected and the 360° photograph. Both the shot and the perspective of the photograph must coincide with the map spot where the landing has been performed;
e) 2/2 fading = FIGS. 5a, 5b: the transition and fading processes have ended. One is immersed in the street-level view, the player allowing one to walk by passing from one geo-referenced photograph to another, to rotate the view, or to zoom.
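The staged schedule described above (three perspective-distortion steps, with the cross-fade to the street-level image occupying the last third) can be sketched as a function of a normalised transition time t. The exact stage boundaries and the linear fade are illustrative assumptions, not values from the patent.

```python
def transition_frame(t):
    """Stage index and blend amounts for a normalised time t in [0, 1].
    Stages 0, 1, 2 correspond to the distortions of FIG. 4a, 4b, 4c;
    the cross-fade to the street-level image runs over the last third."""
    stage = min(2, int(t * 3))       # which FIG. 4x distortion step applies
    distortion = t                   # amount of perspective rotation so far
    fade_alpha = min(1.0, max(0.0, (t - 2.0 / 3.0) * 3.0))
    return stage, distortion, fade_alpha
```

A renderer would distort the satellite image by `distortion`, then composite the panoramic photograph over it with opacity `fade_alpha`, so the fade only begins once the third distortion step (FIG. 4c) is reached.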
In particular, the invention has the advantage of bringing the graphic transition technique, known in the three-dimensional design field, into the field of geo-referenced map viewing, using a small amount of graphic processing memory while still guaranteeing a transition of virtual perspective images derived from a real starting image.
The invention achieves the objects of creating interesting and pleasant effects for network users, navigation programs, etc.
Claims
1. Transition method between at least two maps (3, 4), a starting map (3) and an arrival map (4), characterised in that the method creates a sequence of virtual perspective images (5', 5'', 5'''), (6', 6'', 6'''), intuitive, simple and however derived from objects belonging to an image of the starting map (3).
2. Transition method between at least two maps (3, 4) according to claim 1, characterised in that said two maps (3, 4) are of the immersive type, being composed of a sequence of geo-located photographs, whose image rendering, starting from a two-dimensional or three-dimensional map, corresponds to an image (2) projected in a virtual space in spherical coordinates (1).
3. Transition method between at least two maps (3, 4) according to claim 2, characterised in that the starting map (3) and the arrival map (4) have in common the same geographic location defined by longitude and latitude.
4. Transition method between at least two maps (3, 4) according to claim 3, characterised by a sequence of perspective distortions (5', 5'', 5'''), (6', 6'', 6''') of reference images (5), (6) belonging to the starting map (3).
5. Transition method between at least two maps (3, 4) according to claim 4, characterised in that said sequence of perspective distortions of the reference image (5, 6) corresponds to a virtual rotation of objects obtained from said reference image (5, 6).
6. Transition method between at least two maps (3, 4) according to any one of the previous claims, characterised in that the perspective distortion is coupled with a fading that allows replacing the satellite image (3) with the "street level" image (4).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08854217A EP2225730A2 (en) | 2007-11-26 | 2008-11-10 | Transition method between two three-dimensional geo-referenced maps |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ITTO2007A000852 | 2007-11-26 | ||
ITTO20070852 ITTO20070852A1 (en) | 2007-11-26 | 2007-11-26 | TRANSITION METHOD BETWEEN TWO THREE-DIMENSIONAL GEO-REFERENCED MAPS. |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2009069165A2 true WO2009069165A2 (en) | 2009-06-04 |
WO2009069165A3 WO2009069165A3 (en) | 2009-07-16 |
Family
ID=40315039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IT2008/000700 WO2009069165A2 (en) | 2007-11-26 | 2008-11-10 | Transition method between two three-dimensional geo-referenced maps |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2225730A2 (en) |
IT (1) | ITTO20070852A1 (en) |
WO (1) | WO2009069165A2 (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060132482A1 (en) * | 2004-11-12 | 2006-06-22 | Oh Byong M | Method for inter-scene transitions |
- 2007
- 2007-11-26 IT ITTO20070852 patent/ITTO20070852A1/en unknown
- 2008
- 2008-11-10 EP EP08854217A patent/EP2225730A2/en not_active Withdrawn
- 2008-11-10 WO PCT/IT2008/000700 patent/WO2009069165A2/en active Application Filing
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9159153B2 (en) | 2012-06-05 | 2015-10-13 | Apple Inc. | Method, system and apparatus for providing visual feedback of a map view change |
US10156455B2 (en) | 2012-06-05 | 2018-12-18 | Apple Inc. | Context-aware voice guidance |
US10318104B2 (en) | 2012-06-05 | 2019-06-11 | Apple Inc. | Navigation application with adaptive instruction text |
US10323701B2 (en) | 2012-06-05 | 2019-06-18 | Apple Inc. | Rendering road signs during navigation |
US10366523B2 (en) | 2012-06-05 | 2019-07-30 | Apple Inc. | Method, system and apparatus for providing visual feedback of a map view change |
US10508926B2 (en) | 2012-06-05 | 2019-12-17 | Apple Inc. | Providing navigation instructions while device is in locked mode |
US10718625B2 (en) | 2012-06-05 | 2020-07-21 | Apple Inc. | Voice instructions during navigation |
US10732003B2 (en) | 2012-06-05 | 2020-08-04 | Apple Inc. | Voice instructions during navigation |
US11055912B2 (en) | 2012-06-05 | 2021-07-06 | Apple Inc. | Problem reporting in maps |
US11082773B2 (en) | 2012-06-05 | 2021-08-03 | Apple Inc. | Context-aware voice guidance |
US11290820B2 (en) | 2012-06-05 | 2022-03-29 | Apple Inc. | Voice instructions during navigation |
US11727641B2 (en) | 2012-06-05 | 2023-08-15 | Apple Inc. | Problem reporting in maps |
US11956609B2 (en) | 2021-01-28 | 2024-04-09 | Apple Inc. | Context-aware voice guidance |
Also Published As
Publication number | Publication date |
---|---|
WO2009069165A3 (en) | 2009-07-16 |
EP2225730A2 (en) | 2010-09-08 |
ITTO20070852A1 (en) | 2008-02-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11869205B1 (en) | Techniques for determining a three-dimensional representation of a surface of an object from a set of images | |
KR101504383B1 (en) | Method and apparatus of taking aerial surveys | |
JP4804256B2 (en) | Information processing method | |
JP6775776B2 (en) | Free viewpoint movement display device | |
EP3534336B1 (en) | Panoramic image generating method and apparatus | |
TWI494898B (en) | Extracting and mapping three dimensional features from geo-referenced images | |
WO2010052548A2 (en) | System and method for creating interactive panoramic walk-through applications | |
US20040105573A1 (en) | Augmented virtual environments | |
JP2006053694A (en) | Space simulator, space simulation method, space simulation program and recording medium | |
CN104322052A (en) | A system for mixing or compositing in real-time, computer generated 3D objects and a video feed from a film camera | |
KR100834157B1 (en) | Method for Light Environment Reconstruction for Image Synthesis and Storage medium storing program therefor. | |
WO2009069165A2 (en) | Transition method between two three-dimensional geo-referenced maps | |
US20030225513A1 (en) | Method and apparatus for providing multi-level blended display of arbitrary shaped textures in a geo-spatial context | |
Bradley et al. | Image-based navigation in real environments using panoramas | |
CN102831816B (en) | Device for providing real-time scene graph | |
Sandnes et al. | Translating the viewing position in single equirectangular panoramic images | |
JP2004265396A (en) | Image forming system and image forming method | |
CN111524230A (en) | Linkage browsing method for three-dimensional model and unfolded panoramic image and computer system | |
Shimamura et al. | Construction of an immersive mixed environment using an omnidirectional stereo image sensor | |
Zheng et al. | Scanning scene tunnel for city traversing | |
JP6168597B2 (en) | Information terminal equipment | |
JP2004310686A (en) | Image processing method and device | |
Akaguma et al. | Mobile AR using pre-captured omnidirectional images | |
EP1796048A2 (en) | Augmented virtual environments | |
Ban et al. | Pixel of matter: new ways of seeing with an active volumetric filmmaking system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08854217 Country of ref document: EP Kind code of ref document: A2 |
NENP | Non-entry into the national phase in: |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 2008854217 Country of ref document: EP |