US20160098859A1 - 3D map display system


Info

Publication number
US20160098859A1
Authority
US
United States
Prior art keywords
representation image
attribute
attribute representation
feature
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/964,492
Inventor
Kiyonari Kishikawa
Eiji Teshima
Masatoshi Aramaki
Masaya Ada
Tsubasa TOMITAKA
Tatsuji Kimura
Masaru NAKAGAMI
Tatsuya AZAKAMI
Mai FUKUSAKI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GEO Technical Laboratory Co Ltd
Original Assignee
GEO Technical Laboratory Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GEO Technical Laboratory Co Ltd filed Critical GEO Technical Laboratory Co Ltd
Assigned to GEO TECHNICAL LABORATORY CO., LTD. Assignment of assignors' interest (see document for details). Assignors: ADA, MASAYA; ARAMAKI, MASATOSHI; AZAKAMI, Tatsuya; KIMURA, TATSUJI; KISHIKAWA, KIYONARI; NAKAGAMI, Masaru; TESHIMA, EIJI; TOMITAKA, Tsubasa; FUKUSAKI, MAI
Publication of US20160098859A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 — Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 — Geographic models
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 — Navigation; navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 — Navigation specially adapted for navigation in a road network
    • G01C 21/34 — Route searching; route guidance
    • G01C 21/36 — Input/output arrangements for on-board computers
    • G01C 21/3626 — Details of the output of route guidance instructions
    • G01C 21/3635 — Guidance using 3D or perspective road maps
    • G01C 21/3638 — Guidance using 3D or perspective road maps including 3D objects and buildings
    • G01C 21/3667 — Display of a road map
    • G01C 21/367 — Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; database structures therefor; file system structures therefor
    • G06F 16/50 — Information retrieval of still image data
    • G06F 16/58 — Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/5866 — Retrieval using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G06F 17/30268 (legacy classification)
    • G06T 11/00 — 2D [two-dimensional] image generation
    • G06T 11/60 — Editing figures and text; combining figures or text
    • G06T 15/00 — 3D [three-dimensional] image rendering
    • G06T 15/04 — Texture mapping
    • G06T 15/50 — Lighting effects
    • G06T 15/506 — Illumination models

Definitions

  • the present invention relates to a technology for displaying a three-dimensional (3D) map on which features are represented three-dimensionally.
  • In a 3D map, features are represented in a state close to the views a user actually sees.
  • By using such a 3D map for route guidance in a navigation system, the user can intuitively grasp the route, which is highly convenient.
  • In one known technology, a window model is generated for the windows of each building, and the ratio of lighted windows, hues, brightness and the like are changed in a night-view mode in accordance with the type of the building (residential building, office building, high-rise building and the like).
  • The object of representing a landscape with high reality under a relatively light processing load is common to the representation of night views and of daytime views.
  • The present invention was made to solve the aforementioned problems and has an object to improve the reality of a 3D map with a relatively light processing load.
  • the present invention employs the following constitution.
  • An apparatus in accordance with one embodiment of the present invention is a 3D map display system for displaying a 3D map on which a feature is represented three-dimensionally.
  • the apparatus comprises a map database, a projection view generation unit, and an attribute representation image superimposing unit.
  • the map database stores, in an associated manner, drawing data including a 3D model of the feature and appearance attribute information representing an appearance attribute of the feature.
  • the projection view generation unit generates a projection view made by projecting the drawing data.
  • the attribute representation image superimposing unit displays an attribute representation image on the projection view on the basis of the appearance attribute information.
  • The attribute representation image may be an image representing the appearance according to the appearance attribute information without depending on the shape of the feature on which it is superimposed.
  • An appearance attribute of a feature may be an attribute indicating something visually recognizable in the real world.
  • The appearance attribute information includes, for example, a road width and a number of lanes for a road, and a height and a number of floors for a building.
  • Road types such as expressway, general road, and minor street, building types such as office building or house, and even the type of the feature itself, such as road or building, can be considered information representing the appearance of a feature in a broad sense, and they can also be included in the appearance attribute information.
  • The attribute representation image includes an image representing an appearance according to the appearance attribute information without depending on a shape of the feature. For example, if appearance attribute information such as a road width and a number of lanes is to be represented visually and intuitively by arranging an image of an automobile on a road, the image of the automobile corresponds to the attribute representation image.
  • The attribute representation image does not necessarily have to be an image of an actually existing object; an image of a light source used when a night view is represented, for example, is also included. Symbols for guidance, such as map symbols symbolizing a feature, so-called traffic icons, business icons and the like, do not reproduce the appearance of the feature and are therefore not included in the attribute representation image.
  • the attribute representation image is an image not depending on the shape of the feature and is different from a texture attached in accordance with the shape of the feature in texture mapping.
  • Such an attribute representation image may be prepared in advance in association with the appearance attribute information or may be generated when the 3D map is displayed. Moreover, both approaches may be used, selected in accordance with the appearance attribute information to be represented.
  • the attribute representation image may be displayed at a fixed position on a projection map or may be moved.
  • a size and a color may be fixed or may be dynamically changed.
  • The present invention is particularly useful for a projection view of a wide area; the projection view is therefore preferably generated by perspective projection from a viewpoint in the sky above.
  • the projection views do not necessarily have to project all the features. For example, if features become small to such a degree that the features cannot be sufficiently recognized on an attribute representation image in a wide area view, such features may be removed from projection targets and replaced by attribute representation images.
  • In a 3D map, various decorations according to the appearance of a feature are preferably applied.
  • In the 3D map display system of the present invention, such decorations of appearance can be applied by using the attribute representation image generated separately from the drawing data.
  • Since this attribute representation image is displayed on the basis of the appearance attribute information of the feature, representation that flexibly reflects characteristics of the feature's appearance can be realized, unlike an image to which a light source is simply added as in Patent Literature 2.
  • Since the 3D model of the feature itself does not have to be made detailed, an appearance attribute of a feature can be represented in a simulated, simple manner. Therefore, the reality of the 3D map can be improved with a relatively light processing load.
  • the attribute representation image may be an image representing an appearance without physical association with the feature
  • the attribute representation image superimposing unit may display the attribute representation image by allowing protrusion from a shape of the feature.
  • the term “allowing” implies that an image not protruding from the shape of the feature is also included.
  • the attribute representation images include an image physically associated with the feature and an image which is not necessarily associated physically with the feature such as an image representing light of a building in a night view or an image representing sea spray of a coastline.
  • Such images representing appearances without physical association are displayed without the constraint that the image be contained within the shape of the feature.
  • the attribute representation image is a 2D image
  • the attribute representation image superimposing unit may display the attribute representation image superimposed on the projection view generated by the projection view generation unit.
  • In this aspect, the attribute representation image is displayed simply by 2D image display processing, separately from the projection processing that generates the projection view.
  • The attribute representation image may be drawn directly on the projection view as a generated 2D image, or may be superimposed and displayed on the projection view after being drawn on a 2D layer. In such an aspect, since projection processing is not needed for the attribute representation image, the processing load for display can be reduced.
  • The position where the attribute representation image is displayed can be determined by various methods. For example, a 2D coordinate within the projection view after projection processing may be obtained for each feature, and the display position of the attribute representation image determined on the basis of this coordinate value. Alternatively, after the display position of the attribute representation image is acquired within the 3D space on the basis of the 3D model of each feature, the coordinate within the 2D image may be acquired by applying a coordinate conversion similar to the projection processing.
  • the projection view is to be generated by perspective projection from a view point in the upper sky
  • The attribute representation image may be displayed smaller the higher its display position is located in the projection view.
  • In perspective projection, more distant objects are drawn toward the upper part of the generated image, so perspective can also be given to the attribute representation image by reducing its size toward the upper part.
  • the feature may be a building
  • the appearance attribute information may be information representing a light emission state at night of the building
  • the attribute representation image may be a light source image representing light emission at night of the building
  • the attribute representation image superimposing unit may display the attribute representation image in accordance with the light emission state.
  • The appearance attribute information representing the light emission state of the building at night refers to information that can influence the number, size, color, brightness, shape, arrangement and the like of the light source images when the night view is displayed with light source images superimposed on the building.
  • Such appearance attribute information includes, for example, the type of the building, the height and the number of floors of the building, and the planar shape and size of the building.
  • The light source images can be selected in accordance with such information: if the type of the building is an office building, a light source image representing fluorescent light leaking through a window is used, while for a high-rise building, a light source image representing the light of an aircraft warning light is used.
  • The appearance attribute information may include information representing a height or a number of floors of the building.
  • The attribute representation image superimposing unit may display the attribute representation images in a number or size according to the height or the number of floors of the building.
  • Thus, a taller building or a building with more floors can be represented with light leaking from many windows.
  • The number of light source images may match the number of floors of the building, such as one for a one-storied building and three for a three-storied building, or may be changed in steps, such as one for a one- to five-storied building and two for a six- to ten-storied building. It can be changed monotonically or in steps by various methods in accordance with the height or the number of floors of the building. Moreover, an upper limit value or a lower limit value may be set for the number of light source images (see the sketch below).
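The stepped mapping can be captured in a few lines. A minimal sketch, assuming a step of five floors and an illustrative upper limit of four images; all parameter values are assumptions, not taken from the patent:

```python
def window_light_count(num_floors: int, floors_per_light: int = 5,
                       max_lights: int = 4, min_lights: int = 1) -> int:
    """Map a building's floor count to a number of window light source
    images, increasing in steps rather than one image per floor."""
    # e.g. floors 1-5 -> 1 image, 6-10 -> 2, 11-15 -> 3, then capped at 4
    count = -(-num_floors // floors_per_light)  # ceiling division
    return max(min_lights, min(count, max_lights))
```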
  • If a plurality of light source images are to be displayed, they may be juxtaposed in the height direction of the building. The height of the building can then be represented visually more effectively.
  • The light source images may be spaced away from each other, abut against each other, or overlap each other.
  • The light source image may be enlarged or have its shape changed so that its area changes in accordance with the height or the number of floors of the building. In this case, too, the change may be monotonic or in steps.
  • The light source images may further be juxtaposed in the left-right direction in accordance with the planar shape of the building.
  • the attribute representation image may be an image representing an appearance of an object physically associated with the feature
  • the attribute representation image superimposing unit may display the attribute representation image under a constraint condition based on the shape of the feature.
  • The constraint condition based on the shape of the feature can be, for example, a condition that the image does not protrude from the polygon representing the feature, or a condition that the image lies on the boundary or within a predetermined distance from the boundary (a geometric check of the kind sketched below can enforce such conditions).
  • Such attribute representation images include, for example, a streetlamp or a vehicle on a road, a ship on the sea, and plants growing in a field or on a mountain.
  • If objects physically associated with features are displayed while ignoring the feature shape, the image can give a sense of discomfort.
  • Since such attribute representation images are displayed under a constraint condition based on the shape of the feature, this sense of discomfort can be avoided.
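The patent does not specify how such constraints are evaluated; one plausible check is a standard ray-casting point-in-polygon test. A minimal sketch, with the function name being an illustrative assumption:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def inside_polygon(p: Point, poly: List[Point]) -> bool:
    """Ray-casting test: does point p lie inside the feature polygon?
    Used to keep an attribute representation image from protruding."""
    x, y = p
    inside = False
    for i in range(len(poly)):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```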
  • the attribute representation image superimposing unit may arrange a model for displaying the attribute representation image on the 3D model prior to generation of the projection view.
  • In this aspect, the attribute representation image is projected together with the 3D model of the feature and is thereby displayed superimposed on the feature.
  • In this method, the positional relation with the 3D model can be regulated relatively easily, and the aforementioned constraint condition can be easily maintained.
  • Since the projection processing is also applied to the model for the attribute representation image, a perspective similar to that of the feature's 3D model can be given to the attribute representation image.
  • The model for displaying the attribute representation image may be three-dimensional or two-dimensional. If a 2D attribute representation image is used, a texture may be generated by arranging all the attribute representation images displayed in the 3D map on one plane in accordance with the positional coordinates of the corresponding features, and this texture may then be pasted to the 3D model and projected.
  • the feature is a road
  • the attribute representation image is a light source image representing light of a streetlamp arranged on the road
  • the attribute representation image superimposing unit may display the light source image under a constraint condition that the image is along the road.
  • the light of the streetlamp can be represented on the road, and reality of the night view of the road can be improved.
  • the arrangement of the light source images can be set in various ways, but by regularly arranging them along the road, reality can be further improved.
  • the arrangement of the light source images may be changed in accordance with the number of lanes of the road and the like. For example, if the number of lanes of the road is small, they can be arranged in one row or if the number is large, they may be arranged in plural rows (two rows, for example).
  • a light source image representing a lighted state of a traffic signal may be arranged at a crossing of the road.
  • The present invention does not have to comprise all the aforementioned characteristics; a part of them may be omitted or combined as appropriate.
  • The present invention can also be configured as a 3D map display method, in addition to the configuration as the aforementioned 3D map display system.
  • The present invention can be realized in various modes such as a computer program for realizing these functions, a recording medium recording the program, and a data signal including the program and embodied in a carrier wave. In each of these modes, the various additional elements illustrated above can be applied.
  • When the present invention is configured as a computer program or a recording medium recording the program, it may be configured as an entire program controlling the operation of the 3D map display system, or only the portion performing the functions of the present invention may be configured.
  • As the recording medium, various computer-readable media can be used, such as a flexible disk, a CD-ROM, a DVD-ROM, a magneto-optical disk, an IC card, a ROM cartridge, a punch card, a printed matter on which a code such as a barcode is printed, an internal storage device of a computer (a memory such as a RAM or a ROM), and an external storage device.
  • FIG. 1 is an explanatory view illustrating an outline configuration of a navigation system in an embodiment.
  • FIGS. 2A through 2C are explanatory views illustrating contents of map data 22 .
  • FIG. 3 is an explanatory view illustrating an example of a setting table 24T and attribute representation image data 24.
  • FIG. 4 is a flowchart illustrating a flow of route guidance processing.
  • FIG. 5 is an explanatory view illustrating a display method of a 3D map of this embodiment.
  • FIG. 6 is a flowchart illustrating a flow of 3D map display processing.
  • FIG. 7 is a flowchart illustrating a flow of building attribute representation image drawing processing.
  • FIG. 8 is a flowchart illustrating a flow of road attribute representation image drawing processing.
  • FIG. 9 is an explanatory view illustrating a display example of a 3D map (bird's-eye view) when the display mode is a night-view mode.
  • FIGS. 10A and 10B are explanatory views illustrating a display example of a light source image of a variation.
  • FIG. 11 is a flowchart illustrating a flow of road attribute representation image drawing processing of a variation.
  • FIG. 12 is a flowchart illustrating a flow of attribute representation image drawing processing of the variation.
  • FIG. 13 is a flowchart illustrating a flow of the attribute representation image drawing processing of another variation.
  • An embodiment of the present invention will be described on the basis of an example in which the 3D map display system of the present invention is applied to a navigation system giving route guidance from a departure place (current place) to a destination place.
  • A navigation system is illustrated as an example, but the present invention is not limited to this example and can be configured as various devices for displaying a 3D map.
  • FIG. 1 is an explanatory view illustrating an outline configuration of a navigation system in the embodiment.
  • the navigation system is configured by connecting a server 100 and a terminal 10 having a function as a 3D map display device via a network NE.
  • Alternatively, the system may be configured as a standalone device by incorporating the functions provided by the server 100 of this embodiment in the terminal 10, or as a distributed system provided with more servers.
  • the server 100 is provided with functional blocks of a map database 20 , a transmission/reception unit 101 , a database management unit 102 , and a route search unit 103 , as illustrated. These functional blocks can be configured in a software manner by installing a computer program for realizing the respective functions in the server 100 . At least a part of these functional blocks may be configured in a hardware manner.
  • In the map database 20, map data 22, a setting table 24T, attribute representation image data 24, character data 26, and network data 28 are stored.
  • the map data 22 is data for displaying a 3D map during route guidance and includes a 3D model (polygon) and the like as drawing data for three-dimensionally drawing various features such as sea, mountain, river, road, building and the like.
  • The setting table 24T regulates what attribute representation image is to be used to decorate the appearance of a feature.
  • The attribute representation images include those prepared in the form of image data in advance and those generated, using a function provided by a graphics library, when the map is drawn.
  • The attribute representation image data 24 stores the image data of the attribute representation images that are prepared in advance. The data structures of the map data 22, the setting table 24T, and the attribute representation image data 24 will be described later.
  • the character data 26 is data representing characters displayed in the map.
  • the network data 28 is data for route search representing roads as a collection of links and nodes.
  • the transmission/reception unit 101 conducts transaction of various commands, data and the like with the terminal 10 via the network NE.
  • the commands relating to route search and map display, various types of data stored in the map database 20 and the like are transmitted/received, for example.
  • the database management unit 102 controls reading-out of the data from the map database 20 .
  • The route search unit 103 executes a route search from the departure place to the destination place specified by the user, using the map database 20. A known method such as Dijkstra's algorithm can be applied to the route search (a minimal sketch follows).
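The patent names Dijkstra's algorithm without detailing it. Below is a minimal sketch of a route search over node/link network data; the `Graph` adjacency structure and cost model (link length as cost) are illustrative assumptions, not the patent's actual data layout:

```python
import heapq
from typing import Dict, List, Tuple

# Assumed shape for the network data: per-node adjacency lists of
# (neighbor node id, link cost), e.g. cost = link length in meters.
Graph = Dict[str, List[Tuple[str, float]]]

def dijkstra(graph: Graph, start: str, goal: str) -> List[str]:
    """Return the minimum-cost sequence of nodes from start to goal.
    (Assumes the goal is reachable from the start.)"""
    dist: Dict[str, float] = {start: 0.0}
    prev: Dict[str, str] = {}
    queue: List[Tuple[float, str]] = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    route, node = [goal], goal  # walk predecessors back from the goal
    while node != start:
        node = prev[node]
        route.append(node)
    return route[::-1]
```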
  • the terminal 10 comprises a CPU, a ROM, a RAM, a hard disk drive and the like.
  • the CPU functions as a transmission/reception unit 12 and a display control unit 13 by reading out and executing an application program stored in the hard disk drive.
  • the display control unit 13 comprises a projection view generation unit 14 , an attribute representation image superimposing unit 15 , and a character display control unit 16 . At least a part of these units may be configured by hardware.
  • a command input unit 11 receives an input of an instruction by a user relating to route search and map display.
  • the transmission/reception unit 12 transmits/receives various commands, data and the like with the server 100 via the network NE.
  • a data holding unit 17 temporarily holds data obtained from the server 100 .
  • a positional information obtaining unit 18 obtains information required for route search and route guidance such as a current position and azimuth of the terminal 10 by a sensor such as a GPS (Global Positioning System) and an electromagnetic compass.
  • the projection view generation unit 14 generates a projection view obtained by three-dimensionally drawing a feature by a perspective projection method by using the map data 22 .
  • The attribute representation image superimposing unit 15 displays the attribute representation image superimposed on the projection view, using the attribute representation image data 24 and the like.
  • the character display control unit 16 controls display of the character representing information relating to the feature on the projection view by using the character data 26 .
  • the display control unit 13 controls operations of the projection view generation unit 14 , the attribute representation image superimposing unit 15 , and the character display control unit 16 and displays a 3D map generated by them on the display device 30 of the terminal 10 .
  • Two display modes of the 3D map are prepared: a “day-view mode”, which displays a daytime view, and a “night-view mode”, which displays a nighttime view.
  • FIGS. 2A through 2C are explanatory views illustrating contents of the map data 22 .
  • In the map data 22, a feature ID specific to each feature is given, and the various types of data illustrated are managed for each feature.
  • a “type” indicates a type of features such as a “building”, a “road”, a “railway”, the “sea”, a “lake”, a “river”, a “mountains and forests”, a “field/plain” and the like.
  • a “name” is a name of the feature.
  • a “3D model” is polygon data for displaying each feature three-dimensionally. This data corresponds to drawing data in the present invention.
  • a “texture” is an image pasted in accordance with a shape of a feature (3D model) in texture mapping.
  • An “attribute” is data indicating various properties of a feature in accordance with the type of the feature, as illustrated in FIGS. 2A through 2C.
  • If the type of the feature is a building, for example, the attributes include detailed building types such as high-rise building, office building, or house (detailed type), the height or the number of floors of the building, the width of the building, and the like.
  • If the type of the feature is a road, the attributes include detailed road types such as highway, national route, prefectural route, general road, or narrow street (detailed type), the number of lanes of the road, the width of the road, and the like. A record of the map data 22 can thus be pictured as in the sketch below.
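To make the structure concrete, here is a minimal sketch of one map data record as described above; the class and field names are illustrative assumptions, not the patent's schema:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class Feature:
    """One record of the map data 22 (field names are assumed)."""
    feature_id: str                  # ID specific to each feature
    kind: str                        # "building", "road", "railway", "sea", ...
    name: str                        # name of the feature
    model_3d: List[Any]              # polygon data for 3D display (drawing data)
    texture: Optional[bytes] = None  # image pasted onto the 3D model
    attributes: Dict[str, Any] = field(default_factory=dict)
    # e.g. building: {"detailed_type": "office", "floors": 12, "width_m": 30}
    #      road:     {"detailed_type": "highway", "lanes": 4, "width_m": 22}
```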
  • FIG. 3 is an explanatory view illustrating an example of the setting table 24T and the attribute representation image data 24.
  • The setting table 24T regulates the attribute representation image used for decorating the appearance of a feature.
  • the attribute representation image is also set for each display mode.
  • the attribute representation image data 24 is a database storing two-dimensional image data for displaying the attribute representation image in association with identification information ID.
  • the attribute representation image is associated with the type/detailed type of the feature as illustrated.
  • For example, identification information ID1 is stored for the day-view mode of road/highway. This indicates that the image data stored under identification information ID1 in the attribute representation image data 24, that is, image data of an automobile, is used as the attribute representation image.
  • For the night-view mode, a “light source (orange)” is set. This indicates that, when the map is drawn using a function of the graphics library, a spherical light source image representing the orange light of streetlamps (sodium-vapor lamps) installed on the road is generated and displayed.
  • In the day-view mode, the identification information ID1 “automobile” is used similarly to the highway, but in the night-view mode, a light source in a different color, a “light source (white)”, is used.
  • the attribute representation image is not used.
  • Attribute representation images using image data prepared in the attribute representation image data 24 include, in the day-view mode, identification information ID2 “train” for railway, ID3 “ship” for sea/port, and ID4 “conifer forest” and ID5 “broadleaf forest” for mountains and forests, and the like.
  • Attribute representation images generated at drawing time and used in the night-view mode include square and circular window light source images representing the white fluorescent light leaking through building windows, circular aircraft warning light images representing the red light of aircraft warning lights installed on the rooftops of high-rise buildings, and a ship light source image representing lights on a ship.
  • Those generated at drawing time and used in the day-view mode include a wave light source image representing the brightness of sea waves, white-point images representing sea spray, and the like.
  • the attribute representation images can be prepared for various features other than the above.
  • FIG. 4 is a flowchart illustrating a flow of the route guidance processing. Although the processing contents of the terminal 10 and the server 100 are not described separately here, this processing is executed by the two in collaboration.
  • the navigation system inputs instructions of a departure place, a destination place, and a display mode (Step S 10 ).
  • As the departure place, a current position obtained by the positional information obtaining unit 18 may be used as it is.
  • As the display mode, the “day-view mode”, which displays a daytime view, and the “night-view mode”, which displays a nighttime view, are prepared.
  • The display mode may also be switched automatically according to the time at which the route guidance processing is executed, by providing the navigation system with a function of obtaining the time (for instance, as sketched below).
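A minimal sketch of such automatic switching; the 6:00-18:00 daytime window is an illustrative assumption, not a value from the patent:

```python
from datetime import datetime, time
from typing import Optional

def pick_display_mode(now: Optional[datetime] = None) -> str:
    """Choose the display mode from the local clock time."""
    now = now or datetime.now()
    return "day-view" if time(6, 0) <= now.time() < time(18, 0) else "night-view"
```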
  • the navigation system executes route search processing on the basis of specification from the user (Step S 12 ).
  • This processing is executed by the server 100 by using the network data 28 stored in the map database 20 and can be executed by a known method such as Dijkstra's algorithm or the like.
  • the obtained route is transmitted to the terminal 10 .
  • Upon receipt of the result of the route search, the terminal 10 executes the route guidance by the following procedure while displaying the 3D map.
  • the terminal 10 inputs a current position from the sensor such as a GPS and the like (Step S 14 ) and determines a view point position and a line of sight direction when the 3D map is displayed (Step S 16 ).
  • the line of sight direction can be a direction in which a future position is seen on the route from the current position to the destination place, for example.
  • The view point position can be set, for example, a predetermined distance behind the current position, and the height and angle (elevation angle, depression angle) of the view point can be adjusted arbitrarily by the user from values set in advance.
  • At Step S18, the terminal 10 executes the 3D map display processing.
  • the 3D map display processing will be described later in detail.
  • the terminal 10 repeatedly executes the processing from Steps S 14 to S 18 until the destination place is reached (Step S 20 : YES).
  • In this processing, the appearance of the feature is decorated with the attribute representation image and the 3D map is displayed.
  • the display of the attribute representation image can be made in various ways but first, a method of superimposing an attribute representation image layer on which the attribute representation image is drawn two-dimensionally on the projection view obtained by performing perspective projection of the 3D model will be described.
  • FIG. 5 is an explanatory view illustrating a display method of the 3D map of this embodiment.
  • a relation among the 3D model, the projection view and the attribute representation image layer is schematically illustrated.
  • the projection view is a two-dimensional image drawn by perspective projection of the 3D model.
  • the 3D model in a short-range view is drawn larger on the lower region, and the 3D model in a long-range view is drawn smaller on the upper region, whereby perspective is represented.
  • The attribute representation image layer is a layer on which the attribute representation images are drawn two-dimensionally, and it is prepared separately from the projection view. The display position of each attribute representation image on the layer can be determined, for example, by obtaining the two-dimensional coordinate of each feature in the projection view and using this coordinate value. Alternatively, after the display position of the attribute representation image is acquired in the three-dimensional space on the basis of the 3D model of each feature, the coordinate in the 2D image may be acquired by a coordinate conversion similar to perspective projection (as sketched below).
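A minimal sketch of the latter conversion: projecting a feature's 3D anchor position to 2D layer coordinates with a standard perspective transform. The matrix convention and parameter names are assumptions for illustration:

```python
import numpy as np

def project_point(world_pt, view_matrix, fov_y_deg, aspect, screen_w, screen_h):
    """Project a 3D world position to 2D screen/layer coordinates so an
    attribute representation image can be drawn there."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    # Camera-space position; view_matrix is a 4x4 world-to-camera transform.
    x, y, z, _ = view_matrix @ np.append(np.asarray(world_pt, dtype=float), 1.0)
    # Perspective divide (camera looks down -z, so z is negative in front).
    ndc_x = (f / aspect) * x / -z
    ndc_y = f * y / -z
    # Map normalized device coordinates [-1, 1] to pixel coordinates.
    return ((ndc_x + 1.0) / 2.0 * screen_w, (1.0 - ndc_y) / 2.0 * screen_h)
```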
  • the attribute representation image layer is superimposed on a front surface of the projection view. In the attribute representation image layer, similarly to the feature in the projection view, the attribute representation image is reduced and drawn on the upper region. In this way, perspective can be given also to the attribute representation image.
  • FIG. 6 is a flowchart illustrating a flow of the 3D map display processing. This processing corresponds to Step S 18 in the route guidance processing illustrated in FIG. 4 and is the processing executed by the terminal 10 .
  • the terminal 10 inputs a view point position, a line of sight direction, and a display mode (Step S 100 ). Then, the terminal 10 reads in a 3D model of a feature present in a display target area determined on the basis of the view point position and the line of sight direction and the attribute representation image data 24 corresponding to each feature from the map database 20 (Step S 110 ). Then, the terminal 10 determines whether the display mode of the 3D map is the day-view mode or the night-view mode (Step S 120 ). If the display mode is the night-view mode, the terminal 10 darkens the 3D model and the entire background thereof (Step S 130 ). On the other hand, if the display mode is the day-view mode, the terminal 10 skips Step S 130 and proceeds with the processing to Step S 140 .
  • Next, on the basis of the view point position and the line of sight direction set at Step S100, the terminal 10 performs rendering by the perspective projection method while performing hidden line elimination, and generates a projection view drawing the features three-dimensionally (Step S140). Then, the terminal 10 obtains a two-dimensional coordinate value in the projection view for each feature and sets the display position of the attribute representation image in the attribute representation image layer on the basis of this coordinate value (Step S150). At this time, the terminal 10 also generates, using the function of the graphics library and in accordance with the display mode, any attribute representation image (see FIG. 3) that corresponds to a feature but is not stored in the map database 20, and sets its display position as well.
  • the terminal 10 expands/contracts a size of the attribute representation image in accordance with the display position of each feature in the projection view (Step S 160 ).
  • Specifically, a reference size, applying to a feature displayed at the vertical center of the projection view, is set in accordance with the map scale; the lower the display position of a feature, the more the terminal 10 enlarges the attribute representation image, and the higher the display position, the more the terminal 10 reduces it (see the sketch below).
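A minimal sketch of this scaling rule; the linear model and the gain value are illustrative assumptions:

```python
def scaled_size(base_size: float, y_px: float, screen_h: float,
                gain: float = 0.5) -> float:
    """Scale an attribute representation image by its vertical position.
    The reference size applies at the vertical center; images lower on
    the screen (nearer) grow and images higher (farther) shrink."""
    # y_px runs from 0 at the top of the projection view to screen_h at the bottom.
    offset = (y_px - screen_h / 2.0) / (screen_h / 2.0)  # -1 (top) .. +1 (bottom)
    return base_size * (1.0 + gain * offset)
```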
  • The attribute representation image drawing processing (Step S170) is processing that two-dimensionally draws the attribute representation images on the attribute representation image layer. As illustrated in FIG. 3, there are various types of attribute representation images; since the drawing method differs depending on the type, the specific processing contents will be described later.
  • the terminal 10 reads in character data which is a display target and displays each character superimposed on the map (Step S 180 ).
  • the terminal 10 finishes the 3D map display processing by the processing described above.
  • FIG. 7 is a flowchart illustrating a flow of building attribute representation image drawing processing. This processing corresponds to a part of Step S 170 (attribute representation image drawing processing) in the 3D map display processing illustrated in FIG. 6 and is sequentially executed for a building present in a display target area if the display mode is the night-view mode.
  • First, the terminal 10 selects a building to be processed (Step S200). Then, the terminal 10 sets the display mode of the window light source images, which are the attribute representation images, on the basis of the attribute information of the building (Step S210). The terminal 10 sets M window light source images for a building with N floors. In this embodiment, as illustrated in the frame at Step S210, the number of window light source images is set in steps, such as one for a one- to five-storied building, two for a six- to ten-storied building, and three for an eleven- to fifteen-storied building, and moreover an upper limit is set on the number of window light source images.
  • The number of window light source images could equal the number of floors of the building, but setting it in steps as above prevents the number of window light source images displayed in the map from becoming excessive. Moreover, by providing an upper limit value and keeping the number of window light sources at that value for buildings with a certain number of floors or more, the number of window light source images for one feature can also be prevented from becoming excessive.
  • the terminal 10 draws M pieces of the window light source images at corresponding positions (see Step S 150 in FIG. 6 ) of the projection view of the building in the attribute representation image layer (see FIG. 5 ) (Step S 220 ).
  • The window light source image is a circular image with radius r; if there are a plurality of window light source images, the terminal 10 draws them abutting one another, juxtaposed in the height direction of the building. A part of a window light source image may protrude from the shape of the building.
  • The position of light source [1] uses the display position previously obtained at Step S150 in FIG. 6.
  • Light source [M], the M-th window light source image, is drawn with a vertical offset from the position of light source [1], the first window light source image.
  • The offset amount at this time is calculated by the following equation (1):
  • Offset[M] = (−1)^M × 2r × ROUNDUP((M − 1)/2, 0)   (1)
  • Here, ROUNDUP(X, 0) is a function that rounds the value X up at the first decimal place, i.e. up to an integer (equation (1) is implemented in the sketch below).
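Equation (1) alternates the images above and below light source [1] at a spacing of 2r, so abutting circles of radius r stack in the building's height direction. A direct sketch of the computation:

```python
import math

def window_light_offsets(m_count: int, r: float) -> list:
    """Vertical offsets per equation (1):
    Offset[M] = (-1)^M * 2r * ROUNDUP((M-1)/2, 0), for M = 1..m_count."""
    return [(-1) ** m * 2 * r * math.ceil((m - 1) / 2)
            for m in range(1, m_count + 1)]

# e.g. r=4.0, m_count=5 -> offsets 0, +8, -8, +16, -16
```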
  • If a light source image of an aircraft warning light is set for the building to be processed, the terminal 10 draws it at a position corresponding to, for example, a corner of the building's rooftop.
  • the terminal 10 finishes the building attribute representation image drawing processing.
  • As variations, the light sources may be spaced apart from each other, or, when there are at least a predetermined number of them, they may be separated into two or more rows for display. Depending on where light source [1] is displayed, the light sources [M] may be arranged sequentially upward or downward from light source [1]. The light sources may also be arranged at random in a region around the display position of light source [1].
  • FIG. 8 is a flowchart illustrating a flow of the road attribute representation image drawing processing. This processing corresponds to a part of Step S 170 (attribute representation image drawing processing) in the 3D map display processing illustrated in FIG. 6 and is executed sequentially for the road present within a display target area.
  • the terminal 10 selects a road to be processed (Step S 300 ). Then, the terminal 10 sets the attribute representation image to be drawn (Step S 310 ) on the basis of the attribute information and the display mode of the road. As illustrated in the frame at Step S 310 , if the road to be processed is a highway, and if the display mode is the day-view mode, for example, the terminal 10 selects an image of an automobile as the attribute representation image. If the road to be processed is a highway, and if the display mode is the night-view mode, for example, the terminal 10 selects the light source image in orange representing a streetlamp as the attribute representation image. Then, the terminal 10 draws the selected attribute representation image in a shape of the road in the attribute representation image layer (see FIG. 5 ) (Step S 320 ).
  • In the night-view mode, the terminal 10 draws a plurality of light source images representing the light of streetlamps along the road so that they appear to be arranged at equal intervals.
  • Here, the road is assumed to be drawn not as a polygon but as a line with a line width.
  • Specifically, when the road is rendered, the terminal 10 obtains the post-projection coordinates of passage points P1, P2, and P3 of the line data as display positions (see Step S150 in FIG. 6).
  • The road line on the attribute representation image layer is acquired on the basis of the coordinates of these passage points P1 to P3, and the light source images are arranged at predetermined intervals d along this line (as sketched below).
  • The interval d can be set arbitrarily on the basis of the map display scale and the like. Considering that in a perspective-projected image the scale of distance shrinks toward the upper part, that is, in the long-range view, the interval d may be made progressively shorter toward the upper part.
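A minimal sketch of placing light sources at interval d along the projected road line; the resampling approach is a standard technique and the function shape is an assumption:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def points_along_polyline(pts: List[Point], d: float) -> List[Point]:
    """Walk the road line (e.g., through P1, P2, P3) and emit a light
    source position every d units of arc length."""
    result: List[Point] = []
    carry = 0.0  # arc length left over from the previous segment
    for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if seg == 0.0:
            continue  # skip degenerate (zero-length) segments
        t = carry
        while t <= seg:
            s = t / seg
            result.append((x1 + s * (x2 - x1), y1 + s * (y2 - y1)))
            t += d
        carry = t - seg
    return result
```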
  • Moreover, the terminal 10 changes the arrangement of the light source images in accordance with the type of the road and the number of lanes. For a highway with many lanes, for example, the terminal 10 arranges the light source images in two rows, while for a narrow street with fewer lanes, it arranges them in one row.
  • If the light source images are to be displayed in two or more rows, it is only necessary to arrange them at positions offset from the road line in the width direction of the road.
  • The offset amount is preferably set on the basis of the line width of the road so that the images do not deviate greatly from the road, but they do not necessarily have to be contained within the road.
  • In this way, the light sources can be drawn under the constraint condition of arrangement along the road line.
  • In the day-view mode, the terminal 10 draws the images of automobiles arranged at random on the road.
  • The method of specifying the road line is similar to that of the night-view mode.
  • The positions on the road line may be determined by using random numbers, or the intervals between the automobiles may be determined by using random numbers.
  • Like the light source images, the images of the automobiles may be arranged in two or more rows in accordance with the type of the road or the number of lanes.
  • For the automobiles, however, the constraint condition is stricter than for the light sources: the offset amount needs to be determined so that the image does not protrude from the line width of the road.
  • the terminal 10 finishes the road attribute representation image drawing processing.
  • the attribute representation image drawing processing for the features other than the building and the road is substantially the same as the aforementioned road attribute representation image drawing processing. That is, the terminal 10 selects the feature to be processed, sets the attribute representation image to be drawn on the basis of the attribute information and the display mode, and draws the attribute representation image at the position corresponding to the projection view in the attribute representation image layer. However, the terminal 10 switches the condition on the drawing position of the attribute representation image in accordance with whether the attribute representation image is an image physically associated with the feature or an image not physically associated with the feature.
  • the attribute representation images not physically associated with the feature include an image representing sea spray on a coastline, for example.
  • For a polygon representing the sea, for example, the terminal 10 draws white points, lines and the like representing sea spray at random in the vicinity of the coastline, allowing them to protrude from the boundary of the polygon.
  • the attribute representation images physically associated with the feature include images representing a ship floating on the sea, plants growing on the fields and mountains, for example.
  • For these, the terminal 10 draws the attribute representation images under conditions such as that the image does not protrude from the polygon representing the feature, or that the image lies on the boundary or within a predetermined distance from it.
  • FIG. 9 is an explanatory view illustrating a display example of a 3D map (bird's eye view) when the display mode is the night-view mode.
  • This 3D map is made by the aforementioned 3D map display processing (night-view mode).
  • In the actual night-view mode the sky is drawn in navy blue, but in FIG. 9, in order to clearly show the boundary between the sky and the mountains, the sky is drawn in white. Moreover, characters in the map are omitted.
  • Light source images representing light leaking through building windows and the like are displayed in a concentrated manner in the vicinity of region A, indicated by the broken line, so the night view of a city in which a large number of buildings stand is represented realistically.
  • Light source images representing the light of streetlamps and the like are displayed linearly along the roads, so the night view of roads such as road RD is represented realistically.
  • In this way, the appearance attribute of a feature can be represented in a simulated, simple manner. Therefore, the reality of the 3D map can be improved with a relatively light load.
  • FIGS. 10A and 10B are explanatory views illustrating display modes of a light source image of a variation.
  • The higher the building, the larger the circular light source image may be made.
  • Alternatively, the light source image may be deformed into a vertically long oval.
  • FIG. 11 is a flowchart illustrating a flow of the road attribute representation image drawing processing of a variation. This processing is executed sequentially for the road present in the display target area immediately after Step S 130 in FIG. 6 if the display mode of the 3D map is the night-view mode.
  • First, the terminal 10 selects the road to be processed (Step S400). Then, on the basis of the attribute information of the road, the terminal 10 sets a spherical light source model for displaying the light source image as the attribute representation image representing the light of a streetlamp (Step S410). If the road to be processed is a highway, the terminal 10 sets a spherical model in orange; if it is a national route or a prefectural route, a spherical model in white. The diameter of the spherical model can be set arbitrarily within the road width; it can be 1/10 of the road width, for example.
  • the terminal 10 arranges the set light source models on the 3D space along the road (Step S 420 ).
  • The spherical light source models are arranged at equal intervals d along the road, at a height h above the road surface.
  • the terminal 10 renders the light source model and the 3D model of the feature by the perspective projection method on the basis of the view point position and the line of sight direction set at Step S 100 in FIG. 6 and generates a projection view displayed with the light source image superimposed (Step S 430 ).
  • the terminal 10 finishes the road attribute representation image drawing processing.
  • FIG. 12 is a flowchart illustrating a flow of the attribute representation image drawing processing of a variation. This processing is executed immediately after Step S 130 in FIG. 6 if the display mode of the 3D map is the night-view mode.
  • the terminal 10 selects the feature to be processed (Step S 500 ). Then, the terminal 10 sets the corresponding attribute representation image (light source image) on the basis of the attribute information of the feature (Step S 510 ). Then, the terminal 10 arranges the attribute representation images on one plane in accordance with the position coordinate of the feature (Step S 520 ). This state is schematically illustrated in the frame at Step S 520 . In the illustrated example, for the building, the terminal 10 arranges a circular light source image in a size according to a height of the building at a position corresponding to the position coordinate of the building.
  • the terminal 10 arranges the circular light source images representing the light of the streetlamps along the road at equal intervals at positions corresponding to the position coordinate of the road.
  • The plane on which the attribute representation images of a plurality of features are arranged in this way is called an attribute representation image texture (see the sketch following these steps).
  • the terminal 10 determines whether or not all the attribute representation images have been arranged for all the features to be processed (Step S 530 ). If all the attribute representation images are not arranged yet (Step S 530 : NO), the terminal 10 returns the processing to Step S 500 . On the other hand, if all the attribute representation images have been arranged (Step S 530 : YES), the terminal 10 pastes the attribute representation image texture to the 3D model (Step S 540 ). Then, the terminal 10 renders the 3D model by the perspective projection method on the basis of the view point position and the line of sight direction set at Step S 100 in FIG. 6 and generates a projection view displayed with the light source image superimposed (Step S 550 ).
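A minimal sketch of composing such an attribute representation image texture, using Pillow purely for illustration; the data shapes (`(x, y, image)` tuples with RGBA images, map bounds) are assumptions:

```python
from typing import List, Tuple

from PIL import Image  # Pillow, an illustrative choice of imaging library

def build_attribute_texture(features: List[Tuple[float, float, Image.Image]],
                            tex_size: Tuple[int, int],
                            bounds: Tuple[float, float, float, float]) -> Image.Image:
    """Paste each feature's 2D light source image at the texel position
    corresponding to the feature's map coordinate; the result is later
    pasted onto the 3D model and perspective-projected (Steps S540/S550)."""
    min_x, min_y, max_x, max_y = bounds
    tex = Image.new("RGBA", tex_size, (0, 0, 0, 0))  # transparent base plane
    for x, y, img in features:
        # Map the feature's map coordinate into texture pixel space.
        u = int((x - min_x) / (max_x - min_x) * tex_size[0])
        v = int((max_y - y) / (max_y - min_y) * tex_size[1])  # flip y for pixels
        tex.paste(img, (u - img.width // 2, v - img.height // 2), img)
    return tex
```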
  • the terminal 10 finishes the attribute representation image drawing processing.
  • FIG. 13 is a flowchart illustrating a flow of the attribute representation image drawing processing of another variation.
  • The attribute representation image drawing processing of this variation is used when displaying a wide-area view at such a scale that individual buildings cannot be visually distinguished.
  • the flow of the attribute representation image drawing processing of this variation is substantially the same as the attribute representation image drawing processing of the variation illustrated in FIG. 12 .
  • This processing is also executed immediately after Step S 130 in FIG. 6 when the display mode of the 3D map is the night-view mode.
  • the terminal 10 selects a feature to be processed (Step S 600 ). Then, the terminal 10 sets the corresponding attribute representation image (light source image) on the basis of the attribute information of the feature (Step S 610 ). Then, the terminal 10 arranges the attribute representation images on one plane in accordance with the position coordinate of the feature (Step S 620 ). This state is schematically illustrated in the frame at Step S 620 . In the illustrated example, for the building, the terminal 10 arranges a circular light source image in a size according to a height of the building at a position corresponding to the position coordinate of the building.
  • the terminal 10 arranges the circular light source images representing the light of the streetlamps along the road at equal intervals at positions corresponding to the position coordinate of the road.
  • In the figure, the 3D model of the building is drawn with broken lines to indicate that the 3D model is not rendered for buildings.
  • the terminal 10 determines whether or not all the attribute representation images have been arranged for all the features to be processed (Step S 630 ). If all the attribute representation images are not arranged yet (Step S 630 : NO), the terminal 10 returns the processing to Step S 600 . On the other hand, if all the attribute representation images have been arranged (Step S 630 : YES), the terminal 10 pastes the attribute representation image texture to the 3D model (Step S 640 ). At this time, the terminal 10 pastes the attribute representation image after the 3D model of the building is removed.
  • The terminal 10 then renders the 3D models of the features other than buildings by the perspective projection method, on the basis of the view point position and the line of sight direction set at Step S100 of FIG. 6, and generates a projection view displayed with the light source images superimposed (Step S650).
  • In this projection view, the 3D models of the buildings are not projected, but the light source images corresponding to the buildings are.
  • the terminal 10 finishes the attribute representation image drawing processing.
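  • As a minimal sketch of this wide-area night-view idea (FIG. 13), the selection at Steps S 640 to S 650 — keep each building's light source image but drop the building's 3D model from rendering — could look as follows. All names and data structures here are illustrative assumptions, not taken from the patent:

```python
# Sketch of the wide-area night-view variation: building 3D models are
# excluded from rendering and only their light source images remain.

def build_render_lists(features):
    """Split features into 3D models to render and light images to draw.

    Each feature is a dict with 'type', 'model' (3D geometry) and
    'position' (ground coordinate); buildings also carry 'height'.
    """
    models_to_render = []
    light_images = []
    for f in features:
        if f["type"] == "building":
            # The building model itself is removed (Step S640); a circular
            # light image sized by building height stands in for it.
            light_images.append({
                "position": f["position"],
                "radius": 1.0 + 0.1 * f["height"],  # grows with height
            })
        else:
            models_to_render.append(f["model"])
    return models_to_render, light_images

features = [
    {"type": "building", "model": "bldg_mesh", "position": (10, 20), "height": 60},
    {"type": "road", "model": "road_mesh", "position": (0, 0)},
]
models, lights = build_render_lists(features)
print(models)  # ['road_mesh'] -- the building mesh is not rendered
print(lights)  # one light image, its radius scaled by the 60 m height
```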
  • In the embodiments above, the 3D map display system of the present invention is applied to a navigation system, but it can also be configured as a device for displaying a 3D map, independently of the route search/route guidance function.
  • Moreover, processing executed by software may be executed by hardware, and vice versa.
  • The present invention can be used for technologies for displaying a 3D map on which features are represented three-dimensionally.

Abstract

A three-dimensional map display system displays a three-dimensional map representing features thereon three-dimensionally. The system includes a map database, a projection view generation unit, and an attribute representation image superimposing unit. The map database stores drawing data including a three-dimensional model of a feature and appearance attribute information representing an appearance attribute of the feature, by associating the drawing data and the appearance attribute information with each other. The projection view generation unit generates a projection view by projecting the drawing data. The attribute representation image superimposing unit displays an attribute representation image by superimposing the attribute representation image on the projection view based on the appearance attribute information, where the attribute representation image includes a first attribute representation image representing the appearance of the feature according to the appearance attribute information, the first attribute representation image not being bound by a shape of the feature.

Description

    CLAIM OF PRIORITY
  • This application is a Continuation of International Patent Application No. PCT/JP2014/064661, filed on Jun. 3, 2014, which claims priority to Japanese Patent Application No. 2013-122704, filed on Jun. 11, 2013, each of which is hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a technology for displaying a three-dimensional (3D) map on which features are represented three-dimensionally.
  • 2. Description of the Related Art
  • 3D maps on which features such as buildings, roads, and the like are represented three-dimensionally have become popular. On such a 3D map, features are represented in a state close to what a user actually sees. Thus, by using such a 3D map for route guidance in a navigation system, the user can intuitively grasp the route, which is highly convenient.
  • In recent years, representation of a night view in the 3D map has also been proposed. When a night view is represented, features and their backgrounds are drawn in dark colors. In a flight simulator or the like, for example, in order to make the view feel more realistic, sophisticated 3D computer graphics based on detailed 3D models are used to represent light leaking through the windows of a building, a building being lit up, the light of streetlamps, and the like.
  • Moreover, in the technology described in Japanese Patent Laid-Open No. 2002-298162, a window model is generated for the windows of each building, and the ratio of lighted windows, hues, brightness, and the like are changed in accordance with the type of the building (residential building, office building, high-rise building, and the like) in a night-view mode.
  • Furthermore, in the technology described in Japanese Patent Laid-Open No. 2005-326154, in a navigation device, when a night map is to be drawn and the map display scale is small, the center of gravity or a top point of a building figure is drawn in a high-brightness color (a light-spot figure).
  • However, simply drawing lights in buildings when a night view is represented, as in the technology of Japanese Patent Laid-Open No. 2005-326154, cannot improve reality. For example, since the degree of lighting naturally differs between a house and a high-rise building, if lights are lit in the same way in both to represent a night view, a residential street and a high-rise district can only be represented with the same degree of brightness, and the dazzling brightness specific to the high-rise district cannot be represented.
  • On the other hand, if a high-definition 3D model is used as in the technology described in Japanese Patent Laid-Open No. 2002-298162, the aforementioned problem can be solved and a night view with high reality can be represented, but this results in another problem: the processing load for drawing becomes extremely high.
  • In this type of 3D map, the objective of representing a landscape with high reality under a relatively light processing load is common to the representation of night views and daytime views alike.
  • BRIEF DESCRIPTION OF THE INVENTION
  • The present invention was made in order to solve the aforementioned problems and has an object to improve reality of a 3D map with a relatively light processing load. In order to solve at least a part of the aforementioned problems, the present invention employs the following constitution.
  • An apparatus in accordance with one embodiment of the present invention is a 3D map display system for displaying a 3D map on which a feature is represented three-dimensionally. The apparatus comprises a map database, a projection view generation unit, and an attribute representation image superimposing unit. The map database stores, in an associated manner, drawing data including a 3D model of the feature and appearance attribute information representing an appearance attribute of the feature. The projection view generation unit generates a projection view made by projecting the drawing data. The attribute representation image superimposing unit displays an attribute representation image on the projection view on the basis of the appearance attribute information. The attribute representation image may be an image representing the appearance according to the appearance attribute information without depending on the shape of the feature on which it is superimposed.
  • In the present invention, an appearance attribute of a feature may be an attribute indicating a matter visually recognizable in the real world. The appearance attribute information includes a road width, the number of lanes, and the like for a road, for example, and a height, the number of floors, and the like for a building. Moreover, the type of road, such as an expressway, a general road, a narrow street, and the like, the type of building, such as an office building or a house, and furthermore the type of the feature itself, such as road/building, can be considered information representing the appearance of the feature in a broad sense, and they can be included in the appearance attribute information.
  • The attribute representation image includes an image representing an appearance according to the appearance attribute information without depending on the shape of the feature. For example, if appearance attribute information such as a road width and the number of lanes is to be represented visually and intuitively by arranging an image of an automobile on a road, the image of the automobile corresponds to the attribute representation image. The attribute representation image does not necessarily have to be an image of an actually existing object; an image of a light source used when a night view is represented, for example, is also included. Since symbols for guidance, such as a map symbol symbolizing a feature, a so-called traffic icon, a business icon, and the like, do not reproduce the appearance of the feature, they are not included in the attribute representation image. Moreover, the attribute representation image is an image that does not depend on the shape of the feature and is thus different from a texture pasted in accordance with the shape of the feature in texture mapping.
  • Such an attribute representation image may be prepared in advance in association with the appearance attribute information or may be generated when the 3D map is displayed. Moreover, the two approaches may be used separately in accordance with the appearance attribute information to be represented.
  • The attribute representation image may be displayed at a fixed position on the projection view or may be moved. Its size and color may be fixed or may be dynamically changed.
  • For generation of the projection view, various methods such as parallel projection and perspective projection can be used. The present invention is highly usable particularly for a projection view of a wide area and thus, generation of the projection view is preferably made by perspective projection from a viewpoint in the upper sky.
  • Moreover, the projection view does not necessarily have to include all the features. For example, if features become so small in a wide-area view that they cannot be sufficiently recognized, such features may be removed from the projection targets and replaced by attribute representation images.
  • In order to improve the reality of the 3D map, various decorations according to the appearance of a feature are preferably applied. According to the 3D map display system of the present invention, such decorations on the appearance can be applied by using the attribute representation image generated separately from the drawing data. Moreover, since this attribute representation image is displayed on the basis of the appearance attribute information of the feature, unlike an image with a light source simply added as in Japanese Patent Laid-Open No. 2005-326154, representation flexibly reflecting a characteristic of the appearance of the feature can be realized. Moreover, since the 3D model of the feature itself does not have to be made detailed, an appearance attribute of a feature can be represented spuriously and simply. Therefore, the reality of the 3D map can be improved with a relatively light processing load.
  • In the 3D map display system of one embodiment of the present invention, the attribute representation image may be an image representing an appearance without physical association with the feature, and the attribute representation image superimposing unit may display the attribute representation image by allowing protrusion from a shape of the feature. The term “allowing” implies that an image not protruding from the shape of the feature is also included.
  • The attribute representation images include an image physically associated with the feature and an image which is not necessarily physically associated with the feature, such as an image representing the lights of a building in a night view or an image representing the sea spray of a coastline. In the aforementioned aspect, images representing appearances without physical association are displayed without the constraint that the image be contained within the shape of the feature. By configuring as such, representation not constrained by the profile shape of the 3D model can be made, whereby reality can be improved. For example, by displaying white points, lines, and the like representing sea spray at random on a polygon representing the sea, and allowing them to protrude from the boundary of the polygon around the coastline, a complicated coastline imitating waves can be represented even if the polygonal shape of the sea is relatively simple, whereby reality can be improved.
  • In accordance with one embodiment of the present invention, the attribute representation image is a 2D image, and the attribute representation image superimposing unit may display the attribute representation image superimposed on the projection view generated by the projection view generation unit.
  • In the aforementioned aspect, the attribute representation image is displayed simply by 2D image display processing, separately from the projection processing for generating a projection view. For example, the attribute representation image may be drawn directly on the projection view as a generated 2D image, or may be superimposed and displayed on the projection view after the attribute representation image is drawn on a 2D layer. According to such an aspect, since projection processing is not needed for the attribute representation image, the processing load for display can be reduced.
  • In such an aspect, the position where the attribute representation image is displayed can be determined by various methods. For example, a 2D coordinate within the projection view after projection processing may be obtained for each feature, and the display position of the attribute representation image may be determined on the basis of this coordinate value. Alternatively, after the display position of the attribute representation image is acquired within a 3D space on the basis of the 3D model of each feature, the coordinate within the 2D image may be acquired by applying a coordinate conversion similar to the projection processing.
  • Moreover, if the projection view is to be generated by perspective projection from a view point in the upper sky, the attribute representation image may be displayed in a smaller size the higher its display position is located in the projection view. In the case of perspective projection, regions farther from the view point are drawn in the upper part of the generated image, so perspective can be given to the attribute representation image as well by making its size smaller toward the upper part.
  • As a specific application example of the aspect illustrated above, the feature may be a building, the appearance attribute information may be information representing a light emission state at night of the building, the attribute representation image may be a light source image representing light emission at night of the building, and the attribute representation image superimposing unit may display the attribute representation image in accordance with the light emission state.
  • The appearance attribute information representing the light emission state at night of the building refers to information which can influence the number, size, color, brightness, shape, arrangement, and the like of the light source images when the appearance at light emission, that is, the night view, is to be displayed with the light source image superimposed on the building. Such appearance attribute information includes, for example, the type of the building, the height and the number of floors of the building, the planar shape and size of the building, and the like.
  • According to the aforementioned aspect, by generating a light source image in accordance with such appearance attribute information, the reality of the night view of the building can be improved. For example, light source images can be used selectively in accordance with the building: if the building is an office building, a light source image representing the light of a fluorescent lamp leaking through a window can be used, while for a high-rise building, a light source image representing the light of an aircraft warning light can be used.
  • Moreover, in the aforementioned aspect representing the light emission state at night of the building, the appearance attribute information may include information representing a height or the number of floors of the building, and the attribute representation image superimposing unit may display the attribute representation images in a number or size according to the height or the number of floors of the building.
  • By configuring as above, a taller building or a building with more floors can be represented with light leaking from more windows.
  • The number of light source images may match the number of floors of the building, such as one for a one-storied building and three for a three-storied building, or may be changed in steps, such as one for a one- to five-storied building and two for a six- to ten-storied building. It can be changed monotonically or in steps by various methods in accordance with the height or the number of floors of the building. Moreover, an upper limit value or a lower limit value may be set for the number of light source images.
  • If a plurality of light source images are to be displayed, they may be juxtaposed in the height direction of the building. The height of the building can then be represented visually more effectively. The light source images may be spaced away from each other, abut against each other, or overlap each other. The light source image may also be enlarged or have its shape changed so that its area changes in accordance with the height or the number of floors of the building. In this case, too, the change may be monotonic or in steps.
  • In accordance with one embodiment of the present invention, the light source images may be displayed by being juxtaposed in a right-and-left direction further in accordance with the planar shape of the building.
  • In the 3D map display system in accordance with one embodiment of the present invention, the attribute representation image may be an image representing an appearance of an object physically associated with the feature, and the attribute representation image superimposing unit may display the attribute representation image under a constraint condition based on the shape of the feature. The constraint condition based on the shape of the feature can be a condition that the image does not protrude from the polygon representing the feature, a condition that the image is on a boundary or within a predetermined distance range from the boundary and the like, for example.
  • Such attribute representation images include, for example, a streetlamp or a vehicle on a road, a ship on the sea, and plants growing in a field or on a mountain. In the case of objects physically associated with features as above, if they are displayed in a way that ignores the feature shape, the result can be an image giving a sense of discomfort. In the aforementioned aspect, since such attribute representation images are displayed under a constraint condition based on the shape of the feature, such a sense of discomfort can be avoided.
  • If a constraint condition based on the shape of the feature is imposed, the attribute representation image superimposing unit may arrange a model for displaying the attribute representation image on the 3D model prior to generation of the projection view. The attribute representation image is then projected together with the 3D model of the feature and thereby displayed superimposed on the feature.
  • As described above, according to the method of arranging the model for the attribute representation image on the 3D model, a positional relation with the 3D model can be regulated relatively easily, and there is a merit that the aforementioned constraint condition can be easily maintained. Moreover, since the projection processing is applied also to the model for the attribute representation image, there is a merit that perspective similar to the 3D model of the feature can be given to the attribute display image.
  • Even if such constraint condition is imposed, as described above, it is possible to employ a method of directly drawing the attribute representation image as a 2D image or of drawing in layers and superimposing. On the other hand, it is also possible to employ a method of arranging images on the 3D model of the feature and then, performing projection, including the attribute representation images to which the aforementioned constraint condition is not imposed.
  • Moreover, the model for displaying the attribute representation image may be three-dimensional or two-dimensional. If the 2D attribute representation image is used, such a method may be employed that a texture composed of the attribute representation image is generated by arranging all the attribute representation images displayed in the 3D map on one plane in accordance with a positional coordinate of the corresponding feature, and this is pasted to the 3D model and then, projected.
  • As a specific example of an aspect to which the aforementioned constraint condition is imposed, the feature is a road, the attribute representation image is a light source image representing light of a streetlamp arranged on the road, and the attribute representation image superimposing unit may display the light source image under a constraint condition that the image is along the road.
  • By configuring as above, the light of the streetlamp can be represented on the road, and reality of the night view of the road can be improved. The arrangement of the light source images can be set in various ways, but by regularly arranging them along the road, reality can be further improved. The arrangement of the light source images may be changed in accordance with the number of lanes of the road and the like. For example, if the number of lanes of the road is small, they can be arranged in one row or if the number is large, they may be arranged in plural rows (two rows, for example). Moreover, a light source image representing a lighted state of a traffic signal may be arranged at a crossing of the road.
  • The present invention does not have to comprise all the aforementioned various characteristics but a part of them can be omitted or combined as appropriate in configuration. Moreover, the present invention can be configured as an invention of a 3D map display method other than the configuration as the aforementioned 3D map display system. Moreover, the present invention can be realized in various modes such as a computer program for realizing them and a recording medium recording the program, a data signal including the program and embodied in a carrier wave and the like. In each of the modes, it is possible to apply the various additional elements illustrated above.
  • When the present invention is configured as the computer program or the recording medium or the like recording the program, it may be configured as an entire program for controlling an operation of the 3D map display system or only a portion performing the function of the present invention may be configured. Moreover, as the recording medium, various computer-readable mediums such as a flexible disk, CD-ROM, DVD-ROM, a magneto-optical disk, an IC card, a ROM cartridge, a punch card, a printed matter on which a code such as barcode is printed, an internal storage device of a computer (a memory such as a RAM and a ROM), and an external storage device can be used.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an explanatory view illustrating an outline configuration of a navigation system in an embodiment.
  • FIGS. 2A through 2C are explanatory views illustrating contents of map data 22.
  • FIG. 3 is an explanatory view illustrating an example of a setting table 24T and attribute representation image data 24.
  • FIG. 4 is a flowchart illustrating a flow of route guidance processing.
  • FIG. 5 is an explanatory view illustrating a display method of a 3D map of this embodiment.
  • FIG. 6 is a flowchart illustrating a flow of 3D map display processing.
  • FIG. 7 is a flowchart illustrating a flow of building attribute representation image drawing processing.
  • FIG. 8 is a flowchart illustrating a flow of road attribute representation image drawing processing.
  • FIG. 9 is an explanatory view illustrating a display example of a 3D map (bird's-eye view) when a display mode is a night-view mode.
  • FIGS. 10A and 10B are explanatory views illustrating a display example of a light source image of a variation.
  • FIG. 11 is a flowchart illustrating a flow of road attribute representation image drawing processing of a variation.
  • FIG. 12 is a flowchart illustrating a flow of attribute representation image drawing processing of the variation.
  • FIG. 13 is a flowchart illustrating a flow of the attribute representation image drawing processing of another variation.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
  • Embodiments of the present invention will be described on the basis of an example in which the 3D map display system of the present invention is applied to a navigation system that gives route guidance from a departure place (current place) to a destination place. In the following, an example of a navigation system is illustrated, but the present invention is not limited to such an example and can be configured as various devices for displaying a 3D map.
  • A. System Configuration:
  • FIG. 1 is an explanatory view illustrating an outline configuration of a navigation system in the embodiment. The navigation system is configured by connecting a server 100 and a terminal 10 having a function as a 3D map display device via a network NE. In addition, it may be configured as a standalone device by incorporating a function provided by the server 100 of this embodiment in the terminal 10 or may be configured as a distributed system provided with many more servers and the like.
  • The server 100 is provided with functional blocks of a map database 20, a transmission/reception unit 101, a database management unit 102, and a route search unit 103, as illustrated. These functional blocks can be configured in a software manner by installing a computer program for realizing the respective functions in the server 100. At least a part of these functional blocks may be configured in a hardware manner.
  • In the map database 20, map data 22, a setting table 24T, attribute representation image data 24, character data 26, and network data 28 are stored.
  • The map data 22 is data for displaying a 3D map during route guidance and includes a 3D model (polygon) and the like as drawing data for three-dimensionally drawing various features such as sea, mountain, river, road, building and the like. The setting table 24T regulates what attribute representation image is to be used to decorate an appearance of a feature. In this example, the attribute representation images include those prepared in the form of image data in advance and those generated on the basis of a function provided in a graphics library when drawing a map. The attribute representation image data 24 stores image data of the attribute representation image which should be prepared in advance. Data structures of the map data 22, the setting table 24T, and the attribute representation image data 24 will be described later. The character data 26 is data representing characters displayed in the map. The network data 28 is data for route search representing roads as a collection of links and nodes.
  • Each functional block of the server 100 provides the following functions. The transmission/reception unit 101 exchanges various commands, data, and the like with the terminal 10 via the network NE. In this embodiment, for example, the commands relating to route search and map display, various types of data stored in the map database 20, and the like are transmitted and received. The database management unit 102 controls the reading of data from the map database 20. The route search unit 103 executes a route search from the departure place to the destination place specified by a user by using the map database 20. A known method such as Dijkstra's algorithm can be applied to the route search.
  • The terminal 10 comprises a CPU, a ROM, a RAM, a hard disk drive and the like. The CPU functions as a transmission/reception unit 12 and a display control unit 13 by reading out and executing an application program stored in the hard disk drive. The display control unit 13 comprises a projection view generation unit 14, an attribute representation image superimposing unit 15, and a character display control unit 16. At least a part of these units may be configured by hardware.
  • A command input unit 11 receives an input of an instruction by a user relating to route search and map display. The transmission/reception unit 12 transmits/receives various commands, data and the like with the server 100 via the network NE. A data holding unit 17 temporarily holds data obtained from the server 100. A positional information obtaining unit 18 obtains information required for route search and route guidance such as a current position and azimuth of the terminal 10 by a sensor such as a GPS (Global Positioning System) and an electromagnetic compass.
  • The projection view generation unit 14 generates a projection view obtained by three-dimensionally drawing features by the perspective projection method by using the map data 22. The attribute representation image superimposing unit 15 displays the attribute representation image superimposed on the projection view by using the attribute representation image data 24 and the like. The character display control unit 16 controls the display of characters representing information relating to the features on the projection view by using the character data 26. The display control unit 13 controls the operations of the projection view generation unit 14, the attribute representation image superimposing unit 15, and the character display control unit 16 and displays the 3D map generated by them on the display device 30 of the terminal 10.
  • In this embodiment, a “day-view mode” which is a display mode displaying a view of a day time and a “night-view mode” which is a display mode displaying a view of a night time are prepared as display modes of the 3D map.
  • B. Map Data:
  • FIGS. 2A through 2C are explanatory views illustrating contents of the map data 22. As illustrated in FIG. 2A, in the map data 22, a feature ID specific to each feature is given, and various types of data illustrated for each feature are managed.
  • A “type” indicates the type of the feature, such as a “building”, a “road”, a “railway”, the “sea”, a “lake”, a “river”, “mountains and forests”, a “field/plain”, and the like. A “name” is the name of the feature. A “3D model” is polygon data for displaying each feature three-dimensionally. This data corresponds to the drawing data in the present invention. A “texture” is an image pasted in accordance with the shape of a feature (3D model) in texture mapping. An “attribute” is data indicating various natures of a feature in accordance with the type of the feature. As illustrated in FIG. 2B, if the type of the feature is a building, for example, the attributes include the detailed type of the building, such as a high-rise building, an office building, a house, and the like (detailed type), the height or the number of floors of the building, the width of the building, and the like. Moreover, as illustrated in FIG. 2C, if the type of the feature is a road, the attributes include the detailed type of the road, such as a highway, a national route, a prefectural route, a general road, a narrow street, and the like (detailed type), the number of lanes of the road, the width of the road, and the like.
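  • As an illustration only, the per-feature records of FIGS. 2A through 2C could be modeled as follows; the patent specifies the data items, not a schema, so all class and field names here are assumptions:

```python
# Sketch of the feature records in the map data 22 (FIGS. 2A-2C).
from dataclasses import dataclass, field

@dataclass
class Feature:
    feature_id: str
    type: str               # "building", "road", "sea", ...
    name: str
    model: object           # polygon data (3D model) used as drawing data
    texture: object = None  # image pasted onto the model in texture mapping
    attributes: dict = field(default_factory=dict)  # varies with the type

# Attributes differ by feature type, as in FIGS. 2B and 2C.
office = Feature("B001", "building", "Example Tower", model=None,
                 attributes={"detailed_type": "office building",
                             "floors": 12, "width_m": 40})
highway = Feature("R001", "road", "Route 1", model=None,
                  attributes={"detailed_type": "highway",
                              "lanes": 4, "width_m": 22})
print(office.attributes["floors"], highway.attributes["lanes"])  # 12 4
```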
  • C. Attribute Representation Image:
  • FIG. 3 is an explanatory view illustrating an example of the setting table 24T and the attribute representation image data 24. The setting table 24T regulates the attribute representation image used for decorating an appearance of a feature. In this embodiment, since the day-view mode and the night-view mode are provided as the map display modes, the attribute representation image is also set for each display mode. The attribute representation image data 24 is a database storing two-dimensional image data for displaying the attribute representation image in association with identification information ID. In the setting table 24T, the attribute representation image is associated with the type/detailed type of the feature as illustrated.
  • In the day-view mode of the road/highway, identification information ID1 is stored. This indicates that the identification information ID1 of the attribute representation image data 24, that is, image data of an automobile is used as the attribute representation image. On the other hand, in the night-view mode, a “light source (orange)” is set. This indicates that, when a map is drawn by using the function of the graphics library, a spherical light source image representing orange-color light of a streetlamp (sodium-vapor lamp) installed on the road is generated and displayed. Similarly, for the national route, the identification information ID1 “automobile” is used similarly to the highway in the day-view mode, but in the night-view mode, a light source in a different color as a “light source (white)” is used. For the prefectural road, in the day-view mode, such setting is made that the attribute representation image is not used. By changing the setting in accordance with the type of the road as above, appearance of each type can be represented.
  • Similarly, the attribute representation images using the image data prepared in the attribute representation image data 24 include, in the day-view mode, the identification information ID2 “train” for a railway, ID3 “ship” for the sea/port, ID4 “conifer forest” and ID5 “broadleaf forest” for mountains and forests, and the like.
  • On the other hand, those generated at drawing time and used in the night-view mode include square and circular window light source images representing the white light of a fluorescent lamp leaking through the window of a building, circular aircraft warning light images representing the red light of an aircraft warning light installed on the rooftop of a high-rise building, a ship light source image representing lights lit on a ship, and the like. Moreover, those used in the day-view mode include a wave light source image representing the brightness of sea waves, white-point images representing sea spray, and the like. Attribute representation images can be prepared for various features other than the above.
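  • A minimal sketch of the lookup implied by the setting table 24T follows; the patent does not prescribe a data structure, so the keys, IDs, and dictionary layout below are illustrative assumptions:

```python
# Sketch of the setting table 24T: the attribute representation image is
# selected by feature type/detailed type and by display mode.
SETTING_TABLE = {
    ("road/highway", "day"):          {"kind": "image", "id": "ID1"},       # automobile
    ("road/highway", "night"):        {"kind": "light", "color": "orange"}, # sodium lamp
    ("road/national route", "day"):   {"kind": "image", "id": "ID1"},
    ("road/national route", "night"): {"kind": "light", "color": "white"},
    ("road/prefectural road", "day"): None,   # no attribute representation image
    ("railway", "day"):               {"kind": "image", "id": "ID2"},       # train
}

def select_attribute_image(type_key, display_mode):
    """Return the attribute representation image setting, or None."""
    return SETTING_TABLE.get((type_key, display_mode))

print(select_attribute_image("road/highway", "night"))
# {'kind': 'light', 'color': 'orange'} -> generate an orange light source
```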
  • D. Route Guidance Processing:
  • By using processing when route search and route guidance are made by the navigation system of the embodiment as an example, display control of the 3D map in this embodiment will be described below. FIG. 4 is a flowchart illustrating a flow of the route guidance processing. Processing contents of the terminal 10 and the server 100 are not described separately, but this processing is executed by both in collaboration.
  • When the processing is started, the navigation system inputs instructions of a departure place, a destination place, and a display mode (Step S10). As the departure place, a current position obtained by the positional information obtaining unit 18 may be used as it is. As the display mode, the “day-view mode” which is a display mode displaying a view of the day time and the “night-view mode” which is a display mode displaying a view of the night time are prepared. The display mode may be automatically switched in accordance with time when the route guidance processing is executed by providing a function of obtaining the time in the navigation system.
  • Subsequently, the navigation system executes route search processing on the basis of specification from the user (Step S12). This processing is executed by the server 100 by using the network data 28 stored in the map database 20 and can be executed by a known method such as Dijkstra's algorithm or the like. The obtained route is transmitted to the terminal 10.
  • Upon receipt of the result of the route search, the terminal 10 executes the route guidance by the following procedure while displaying the 3D map. First, the terminal 10 inputs the current position from a sensor such as a GPS (Step S14) and determines a view point position and a line of sight direction for displaying the 3D map (Step S16). The line of sight direction can be, for example, a direction looking at a future position on the route from the current position toward the destination place. The view point position can be set, for example, a predetermined distance behind the current position, and the height and the angle (an elevation angle, a depression angle) of the view point can be adjusted arbitrarily by the user from values set in advance. Then, the terminal 10 executes the 3D map display processing (Step S18). The 3D map display processing will be described later in detail. The terminal 10 repeatedly executes the processing from Steps S14 to S18 until the destination place is reached (Step S20: YES).
  • E. 3D Map Display Processing:
  • In this embodiment, by drawing various attribute representation images described in FIG. 3, the appearance of the feature is decorated and the 3D map is displayed. The display of the attribute representation image can be made in various ways but first, a method of superimposing an attribute representation image layer on which the attribute representation image is drawn two-dimensionally on the projection view obtained by performing perspective projection of the 3D model will be described.
  • FIG. 5 is an explanatory view illustrating the display method of the 3D map of this embodiment. The relation among the 3D model, the projection view, and the attribute representation image layer is schematically illustrated. The projection view is a two-dimensional image drawn by perspective projection of the 3D model. In a projection view by perspective projection, the 3D model in a short-range view is drawn larger in the lower region, and the 3D model in a long-range view is drawn smaller in the upper region, whereby perspective is represented.
  • The attribute representation image layer is a layer on which the attribute representation image is drawn two-dimensionally and is prepared separately from the projection view. Regarding the display position of each attribute representation image on the attribute representation image layer, for example, a two-dimensional coordinate of each feature in the projection view is obtained, and the display position of the attribute representation image can be determined on the basis of this coordinate value. Alternatively, after the display position of the attribute representation image is acquired in the three-dimensional space on the basis of the 3D model of each feature, the coordinate in the 2D image may be acquired by executing a coordinate conversion similar to perspective projection. The attribute representation image layer is superimposed on the front surface of the projection view. On the attribute representation image layer, similarly to the features in the projection view, the attribute representation images are drawn smaller in the upper region. In this way, perspective can be given also to the attribute representation image.
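  • The coordinate conversion mentioned above can be sketched as a simple pinhole-style perspective projection; the camera model, function names, and focal length below are illustrative assumptions, not the patent's method:

```python
import numpy as np

# Sketch: project a feature's 3D anchor point to 2D layer coordinates
# with the same kind of perspective transform used for the projection view.

def project_point(p_world, view_matrix, focal_length=1.0):
    """Perspective-project a 3D world point to 2D layer coordinates."""
    p = view_matrix @ np.append(p_world, 1.0)   # world -> camera space
    x, y, z = p[:3]
    if z <= 0:
        return None                             # behind the camera
    return (float(focal_length * x / z), float(focal_length * y / z))

# Identity view matrix: camera at the origin looking down +z.
view = np.eye(4)
print(project_point(np.array([2.0, 1.0, 10.0]), view))  # (0.2, 0.1)
```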
  • FIG. 6 is a flowchart illustrating a flow of the 3D map display processing. This processing corresponds to Step S18 in the route guidance processing illustrated in FIG. 4 and is the processing executed by the terminal 10.
  • When the processing is started, the terminal 10 inputs a view point position, a line of sight direction, and a display mode (Step S100). Then, the terminal 10 reads in a 3D model of a feature present in a display target area determined on the basis of the view point position and the line of sight direction and the attribute representation image data 24 corresponding to each feature from the map database 20 (Step S110). Then, the terminal 10 determines whether the display mode of the 3D map is the day-view mode or the night-view mode (Step S120). If the display mode is the night-view mode, the terminal 10 darkens the 3D model and the entire background thereof (Step S130). On the other hand, if the display mode is the day-view mode, the terminal 10 skips Step S130 and proceeds with the processing to Step S140.
  • Then, the terminal 10 performs rendering by the perspective projection method while performing hidden line elimination on the basis of the view point position and the line of sight direction set at Step S 100 , and generates a projection view drawing the features three-dimensionally (Step S 140 ). Then, the terminal 10 obtains a two-dimensional coordinate value in the projection view for each feature and sets the display position of the attribute representation image in the attribute representation image layer on the basis of this coordinate value (Step S 150 ). At this time, for each attribute representation image that corresponds to a feature but is not stored in the map database 20 (see FIG. 3 ), the terminal 10 generates the image in accordance with the display mode by using the function of the graphics library and also sets its display position.
  • Then, the terminal 10 expands or contracts the size of each attribute representation image in accordance with the display position of the feature in the projection view (Step S 160 ). Regarding the size of the attribute representation image, a reference size based on a feature displayed at the center in the vertical direction of the projection view is set in accordance with the map scale; the lower the display position of the feature, the more the terminal 10 enlarges the attribute representation image, and the higher the display position, the more the terminal 10 reduces it.
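  • A minimal sketch of the size adjustment at Step S 160 , assuming a linear scaling curve; the patent fixes only the reference size at the vertical center and the enlarge-below/shrink-above behavior, so the curve and its strength are assumptions:

```python
# Sketch of Step S160: scale an attribute representation image by its
# vertical position in the projection view.

def scaled_size(reference_size, y, view_height, strength=0.5):
    """y = 0 is the top of the view (long-range, smaller);
    y = view_height is the bottom (short-range, larger)."""
    t = y / view_height - 0.5        # -0.5 at the top, +0.5 at the bottom
    return reference_size * (1.0 + strength * 2.0 * t)

print(scaled_size(10, 0, 600))    # 5.0  -> top of the view, reduced
print(scaled_size(10, 300, 600))  # 10.0 -> vertical center, reference size
print(scaled_size(10, 600, 600))  # 15.0 -> bottom of the view, enlarged
```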
  • Then, the terminal 10 executes attribute representation image drawing processing (Step S170). The attribute representation image drawing processing is processing of two-dimensionally drawing the attribute representation image on the attribute representation image layer. As illustrated in FIG. 3, there are various types of attribute representation images, but since the drawing method is different depending on the type, specific processing contents will be described later.
  • When the attribute representation image drawing processing is finished, the terminal 10 reads in character data which is a display target and displays each character superimposed on the map (Step S180). The terminal 10 finishes the 3D map display processing by the processing described above.
  • FIG. 7 is a flowchart illustrating a flow of building attribute representation image drawing processing. This processing corresponds to a part of Step S170 (attribute representation image drawing processing) in the 3D map display processing illustrated in FIG. 6 and is sequentially executed for a building present in a display target area if the display mode is the night-view mode.
  • When the processing is started, the terminal 10 selects a building to be processed (Step S 200 ). Then, the terminal 10 sets the display mode of the window light source image, which is the attribute representation image, on the basis of the attribute information of the building (Step S 210 ). The terminal 10 sets M window light source images for a building with N floors. In this embodiment, as illustrated in the frame at Step S 210 , the number of window light source images is set in steps, such as one for a one- to five-storied building, two for a six- to ten-storied building, and three for an eleven- to fifteen-storied building, and moreover, an upper limit value is set on the number of window light source images. The number of window light source images may be the same as the number of floors of the building, but by setting it in steps as above, the number of window light source images displayed in the map can be prevented from becoming excessive. Moreover, by providing the upper limit value and keeping the number of window light source images at the upper limit value regardless of the number of floors for a building with a certain number of floors or more, the number of window light source images for one feature can be prevented from becoming excessive.
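  • The stepwise rule above (one window light source image per five floors, with an upper limit) can be sketched as follows; the cap value of 10 is an illustrative assumption, since the patent only says that an upper limit is set:

```python
import math

# Sketch of Step S210: number of window light source images for a
# building with the given number of floors, changed in steps of five
# and clamped at an upper limit.

def window_light_count(floors, per_step=5, cap=10):
    return min(math.ceil(floors / per_step), cap)

print(window_light_count(3))    # 1  (1- to 5-storied building)
print(window_light_count(8))    # 2  (6- to 10-storied building)
print(window_light_count(13))   # 3  (11- to 15-storied building)
print(window_light_count(120))  # 10 (clamped at the upper limit)
```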
  • Then, the terminal 10 draws the M window light source images at the corresponding positions (see Step S 150 in FIG. 6 ) of the projection view of the building in the attribute representation image layer (see FIG. 5 ) (Step S 220 ). In this embodiment, as illustrated in the frame at Step S 220 , the window light source image is a circular image having a radius r, and if there are a plurality of window light source images, the terminal 10 draws them abutting one another, juxtaposed in the height direction of the building. A part of the window light source images may protrude from the shape of the building.
  • The position of light source [1] is the display position previously obtained at Step S 150 in FIG. 6 . Light source [M], the M-th window light source image, is drawn with a vertical offset from the position of light source [1], the first window light source image. The offset amount at this time is calculated by the following equation (1):

  • Offset[M]=(−1)^M×2r×ROUNDUP((M−1)/2, 0)  (1)
  • where ROUNDUP(X, 0) is a function that rounds the numerical value X up to the nearest integer (i.e., up to zero decimal places).
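  • A small check of the offsets produced by equation (1) as reconstructed above; the exponent M is inferred from the alternating up/down arrangement of the lights, and the radius r = 1.0 is illustrative:

```python
import math

# Sketch of equation (1): lights [2], [3], [4], ... alternate above and
# below light [1] as abutting circles of radius r.

def offset(m, r):
    """Vertical offset of the m-th window light source image (1-based)."""
    k = math.ceil((m - 1) / 2)                 # 0, 1, 1, 2, 2, ...
    return ((-1) ** m * 2 * r * k) + 0.0       # +0.0 normalizes -0.0

print([offset(m, r=1.0) for m in range(1, 6)])
# [0.0, 2.0, -2.0, 4.0, -4.0] -> light [1] at the base position, the
# following lights stacked alternately on either side of it.
```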
  • Moreover, though not shown, if the light source image of an aircraft warning light is set for the building to be processed, the terminal 10 draws it at a position corresponding to a corner of the rooftop of the building, for example.
  • By means of the processing above, the terminal 10 finishes the building attribute representation image drawing processing.
  • The arrangement of the light source images can take various modes other than the above. The light sources may be spaced away from each other, or if there are a predetermined number or more of them, they may be separated into two or more rows and displayed. Depending on the position where light source [1] is displayed, the light sources [M] may be sequentially arrayed upward or downward from light source [1]. The light sources may also be arranged at random in a region around the display position of light source [1].
  • FIG. 8 is a flowchart illustrating a flow of the road attribute representation image drawing processing. This processing corresponds to a part of Step S170 (attribute representation image drawing processing) in the 3D map display processing illustrated in FIG. 6 and is executed sequentially for the road present within a display target area.
  • When the processing is started, the terminal 10 selects a road to be processed (Step S300). Then, the terminal 10 sets the attribute representation image to be drawn (Step S310) on the basis of the attribute information and the display mode of the road. As illustrated in the frame at Step S310, if the road to be processed is a highway, and if the display mode is the day-view mode, for example, the terminal 10 selects an image of an automobile as the attribute representation image. If the road to be processed is a highway, and if the display mode is the night-view mode, for example, the terminal 10 selects the light source image in orange representing a streetlamp as the attribute representation image. Then, the terminal 10 draws the selected attribute representation image in a shape of the road in the attribute representation image layer (see FIG. 5) (Step S320).
  • As illustrated on the left side in the frame at Step S 320 , if the display mode is the night-view mode, the terminal 10 draws a plurality of light source images representing the light of the streetlamps along the road so that they appear arranged at equal intervals. In this embodiment, the road is assumed to be drawn not as a polygon but as a line with a line width. Thus, the terminal 10 obtains the coordinates of the passage points P1, P2, and P3 of the line data after projection as display positions when the road is rendered (see Step S 150 in FIG. 6 ). Then, in the processing at Step S 320 , the line of the road on the attribute representation image layer is acquired on the basis of the coordinates of these passage points P1 to P3, and the light source images are arranged at predetermined intervals d along this line. The interval d can be set arbitrarily on the basis of the scale of the map display and the like. In the perspective-projected image, considering the fact that the scale of distance is reduced in the upper part, that is, in the long-range view, the interval d may be made progressively shorter toward the upper part.
  • At this time, the terminal 10 changes the array of the light source images in accordance with the type of the road and the number of lanes. For a highway with many lanes, for example, the terminal 10 arrays the light source images in two rows, while for a narrow street with fewer lanes, the terminal 10 arrays them in one row. When the light source images are displayed in two or more rows, it is only necessary to arrange them at positions offset from the road line in the width direction of the road. The offset amount at this time is preferably set on the basis of the line width of the road so that the images do not deviate greatly from the road, but they do not necessarily have to be contained within the road. By representing the light sources in such a mode, they can be drawn under the constraint condition of arrangement along the road line.
  • If the display mode is the day-view mode, as illustrated on the right side in the frame at Step S 320 , the terminal 10 performs drawing so that images of automobiles are arranged at random on the road. The method of specifying the road line is similar to that in the night-view mode. Regarding the arrangement positions of the automobile images, the positions on the road line may be determined by using random numbers, or the intervals between the automobiles may be determined by using random numbers. The automobile images may also be arranged in two or more rows in accordance with the type of the road or the number of lanes. However, in the case of the automobile image, the constraint condition is stricter than for the light source, and the offset amount needs to be determined so that the image does not protrude beyond the line width of the road.
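  • A sketch of this placement logic under illustrative assumptions: lamps at an interval d along the projected polyline P1 to P3, an optional two-row perpendicular offset (computed here from the overall road direction for simplicity; a per-segment normal would follow curves more faithfully), and random automobile positions for the day-view mode:

```python
import math
import random

def lamps_along(line, d):
    """Positions spaced at interval d along a projected road polyline."""
    positions, leftover = [], 0.0
    for (x0, y0), (x1, y1) in zip(line, line[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        t = leftover
        while t <= seg:
            f = t / seg
            positions.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
            t += d
        leftover = t - seg         # carry the spacing into the next segment
    return positions

def two_rows(positions, line, road_width):
    """Offset the lamp positions to either side of the road line."""
    (x0, y0), (x1, y1) = line[0], line[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    nx, ny = -(y1 - y0) / length, (x1 - x0) / length   # unit normal
    off = road_width / 4                               # stay near the line
    return ([(x + nx * off, y + ny * off) for x, y in positions] +
            [(x - nx * off, y - ny * off) for x, y in positions])

def cars_at_random(line, n, rng=random.Random(0)):
    """Day-view mode: n automobile images at random points on the line."""
    candidates = lamps_along(line, d=1.0)
    return rng.sample(candidates, min(n, len(candidates)))

road = [(0, 0), (50, 0), (80, 30)]        # passage points P1, P2, P3
print(len(lamps_along(road, d=10)))       # 10 lamp positions along the road
print(len(two_rows(lamps_along(road, d=10), road, road_width=8)))  # 20
```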
  • By means of the processing above, the terminal 10 finishes the road attribute representation image drawing processing.
  • The attribute representation image drawing processing for the features other than the building and the road is substantially the same as the aforementioned road attribute representation image drawing processing. That is, the terminal 10 selects the feature to be processed, sets the attribute representation image to be drawn on the basis of the attribute information and the display mode, and draws the attribute representation image at the position corresponding to the projection view in the attribute representation image layer. However, the terminal 10 switches the condition on the drawing position of the attribute representation image in accordance with whether the attribute representation image is an image physically associated with the feature or an image not physically associated with the feature.
  • The attribute representation images not physically associated with a feature include, for example, an image representing sea spray on a coastline. In this case, the terminal 10 draws white points, lines, and the like representing the sea spray at random on the polygon representing the sea, allowing them to protrude from the boundary of the polygon in the vicinity of the coastline. Moreover, the attribute representation images physically associated with a feature include, for example, images representing a ship floating on the sea and plants growing in fields and on mountains. In this case, the terminal 10 draws the attribute representation images under conditions such that, for example, the image does not protrude from the polygon representing the feature, or the image is on the boundary or within a predetermined distance range from the boundary.
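  • The placement conditions above can be sketched with a standard ray-casting point-in-polygon test; the polygon and the placements below are illustrative:

```python
# Sketch of the two placement conditions: images physically associated
# with a feature (a ship, plants) must fall inside the feature polygon,
# while unassociated images (sea spray) may also protrude past it.

def inside(point, polygon):
    """Ray-casting test: is a 2D point inside a polygon?"""
    x, y = point
    hit = False
    n = len(polygon)
    for i in range(n):
        (x0, y0), (x1, y1) = polygon[i], polygon[(i + 1) % n]
        if (y0 > y) != (y1 > y):                       # edge crosses y
            if x < x0 + (y - y0) * (x1 - x0) / (y1 - y0):
                hit = not hit
    return hit

sea = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(inside((5, 5), sea))   # True  -> a ship image may be drawn here
print(inside((12, 5), sea))  # False -> rejected for a ship; sea spray
                             # near the coastline could still be allowed
```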
  • As described above, by switching the condition on the drawing position of the attribute representation image in accordance with the feature, diversified attribute representation images can be displayed for various features, and reality of the 3D map can be improved.
  • FIG. 9 is an explanatory view illustrating a display example of a 3D map (bird's-eye view) when the display mode is the night-view mode. This 3D map is generated by the aforementioned 3D map display processing (night-view mode). In an actual map, the sky is drawn in navy blue, but in FIG. 9 , in order to clearly represent the boundary between the sky and the mountains, the sky is drawn in white. Moreover, characters in the map are omitted.
  • On a display screen WD of the display device 30, as illustrated, the light source images representing light leaking through the windows of the building and the like are displayed in a concentrated manner in the vicinity of a region A indicated by surrounding with a broken line, and the night view of a city in which a large number of buildings stand is represented with reality. Moreover, on the display screen WD, the light source images representing light of the streetlamps on the road and the like are displayed linearly, and the night view of the plurality of roads such as a road RD and the like is represented with reality.
  • According to the navigation system of this embodiment described above, by displaying the attribute representation image superimposed on the projection view in the 3D map, the appearance attribute of a feature can be represented spuriously and simply. Therefore, reality of the 3D map can be improved with a relatively light load.
  • F. Variations:
  • Some embodiments of the present invention have been described above, but the present invention is not limited to these embodiments and can be practiced in various modes within a range not departing from the gist thereof. The following variations are possible, for example.
  • F1. Variation 1:
  • In the embodiment, if the display mode of the 3D map is the night-view mode, light source images (window light source images) are arranged in a number according to the number of floors of the building, but the present invention is not limited to that. FIGS. 10A and 10B are explanatory views illustrating display modes of a light source image of a variation. As illustrated in FIG. 10A, the higher the building is, the larger the size of the circular light source image may be made. Moreover, as illustrated in FIG. 10B, the higher the building is, the more the light source image may be deformed into a vertically long oval. By means of such display modes, too, the height of the building can be represented spuriously.
  • F2. Variation 2:
  • In the embodiment, in the road attribute representation image drawing processing, the light source image representing the light of the streetlamp is drawn in the shape of the road in the attribute representation image layer, but the present invention is not limited to that. FIG. 11 is a flowchart illustrating a flow of the road attribute representation image drawing processing of a variation. This processing is executed sequentially for the road present in the display target area immediately after Step S130 in FIG. 6 if the display mode of the 3D map is the night-view mode.
  • When the processing is started, the terminal 10 selects the road to be processed (Step S 400 ). Then, the terminal 10 sets a spherical light source model for displaying the light source image as the attribute representation image representing the light of the streetlamp on the basis of the attribute information of the road (Step S 410 ). If the road to be processed is a highway, the terminal 10 sets the spherical model in orange. If the road to be processed is a national route or a prefectural route, the terminal 10 sets the spherical model in white. The diameter of the spherical model can be set arbitrarily within the road width; it can be 1/10 of the road width, for example.
  • Then, the terminal 10 arranges the set light source models in the 3D space along the road (Step S 420 ). In this embodiment, as illustrated in the frame at Step S 420 , the spherical light source models are arranged at equal intervals d along the road at a height h above the road. Then, the terminal 10 renders the light source models and the 3D models of the features by the perspective projection method on the basis of the view point position and the line of sight direction set at Step S 100 in FIG. 6 and generates a projection view displayed with the light source images superimposed (Step S 430 ).
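  • A minimal sketch of Steps S 410 to S 420 , assuming lamp positions already spaced at the interval d (see the placement sketch earlier); the height h = 8.0 and the color table are illustrative, while the 1/10 diameter ratio is the example given above:

```python
# Sketch of the variation in FIG. 11: spherical light source models are
# generated in 3D space along the road, lifted above it, then rendered by
# perspective projection together with the feature 3D models (Step S430),
# which gives the lamp lights the same perspective as the scene.

def streetlamp_models(positions, road_type, road_width, h=8.0):
    """Spherical light source models at height h above the road."""
    color = {"highway": "orange"}.get(road_type, "white")
    return [{"center": (x, y, h),
             "diameter": road_width / 10,   # the 1/10 ratio given above
             "color": color}
            for x, y in positions]

print(streetlamp_models([(0, 0), (20, 0)], "highway", road_width=22))
# two orange spheres of diameter 2.2, centered 8.0 above the road line
```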
  • By means of the processing above, the terminal 10 finishes the road attribute representation image drawing processing.
  • According to the road attribute representation image drawing processing of this variation, since the light source models representing the light of the streetlamps are themselves perspective-projected, the light source images can be given the same perspective as the 3D models of the features.
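A minimal Python sketch of Steps S410-S420 follows. The spacing d = 50 m, the height h = 8 m, and the exact RGB values are assumptions; the variation specifies only equal intervals d, a height h, orange for highways, white for national and prefectural routes, and a diameter that may be 1/10 of the road width.

```python
import math

# Minimal sketch of Steps S410-S420: set a spherical light source model
# per road class and arrange copies along the road polyline at equal
# intervals d and height h. d, h, and the RGB values are assumptions.

ROAD_COLORS = {
    "highway": (1.0, 0.6, 0.0),      # orange
    "national": (1.0, 1.0, 1.0),     # white
    "prefectural": (1.0, 1.0, 1.0),  # white
}

def place_streetlamp_models(polyline, road_type, road_width, d=50.0, h=8.0):
    """Return light source models spaced d apart along the road, at height h."""
    color = ROAD_COLORS.get(road_type, (1.0, 1.0, 1.0))
    radius = road_width / 10.0 / 2.0   # diameter = 1/10 of the road width
    models, carry = [], 0.0
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        t = carry
        while t < seg:
            f = t / seg
            models.append({"pos": (x0 + f * (x1 - x0), y0 + f * (y1 - y0), h),
                           "radius": radius, "color": color})
            t += d
        carry = t - seg                # keep the spacing across vertices
    return models
```

The returned models would then be rendered together with the feature models at Step S430.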
  • F3. Variation 3:
  • In the embodiment, in the 3D map display processing, the attribute representation image is drawn on the attribute representation image layer and the layer is superimposed on the projection view, but the present invention is not limited to that. FIG. 12 is a flowchart illustrating the flow of the attribute representation image drawing processing of a variation. This processing is executed immediately after Step S130 in FIG. 6 if the display mode of the 3D map is the night-view mode.
  • When the processing is started, the terminal 10 selects the feature to be processed (Step S500). Then, the terminal 10 sets the corresponding attribute representation image (light source image) on the basis of the attribute information of the feature (Step S510). Then, the terminal 10 arranges the attribute representation image on one plane in accordance with the position coordinates of the feature (Step S520). This state is schematically illustrated in the frame at Step S520. In the illustrated example, for a building, the terminal 10 arranges a circular light source image, sized according to the height of the building, at a position corresponding to the position coordinates of the building. For a road, the terminal 10 arranges circular light source images representing the light of streetlamps at equal intervals along the road, at positions corresponding to the position coordinates of the road. The plane on which the attribute representation images of a plurality of features are arranged in this way is called an attribute representation image texture.
  • Then, the terminal 10 determines whether the attribute representation images have been arranged for all the features to be processed (Step S530). If not all of them have been arranged yet (Step S530: NO), the terminal 10 returns the processing to Step S500. On the other hand, if all of them have been arranged (Step S530: YES), the terminal 10 pastes the attribute representation image texture onto the 3D model (Step S540). Then, the terminal 10 renders the 3D model by the perspective projection method on the basis of the view point position and the line of sight direction set at Step S100 in FIG. 6, and generates a projection view in which the light source images appear superimposed (Step S550).
  • By means of the processing above, the terminal 10 finishes the attribute representation image drawing processing.
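The texture construction of Steps S500-S530 can be sketched as follows. The texture resolution, the world-to-texel mapping, and the warm light color are assumptions; the flowchart specifies only that one light source image per feature is drawn onto a single plane, which is then pasted onto the 3D model and rendered once.

```python
import numpy as np

# Minimal sketch of Steps S500-S530: rasterize one circular light source
# image per feature onto one plane (the attribute representation image
# texture). Texture size, coordinate mapping, and color are assumptions.

def build_attribute_texture(features, extent, tex_size=1024):
    xmin, ymin, xmax, ymax = extent
    tex = np.zeros((tex_size, tex_size, 3), dtype=np.float32)
    sx = (tex_size - 1) / (xmax - xmin)
    sy = (tex_size - 1) / (ymax - ymin)
    for f in features:
        u = int((f["x"] - xmin) * sx)                   # position -> texel
        v = int((f["y"] - ymin) * sy)
        r = max(1, int(f.get("height_m", 10.0) // 10))  # size from height
        for dv in range(-r, r + 1):                     # draw a filled disk
            for du in range(-r, r + 1):
                if (du * du + dv * dv <= r * r
                        and 0 <= u + du < tex_size
                        and 0 <= v + dv < tex_size):
                    tex[v + dv, u + du] = (1.0, 0.9, 0.6)
    return tex
```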
  • F4. Variation 4:
  • FIG. 13 is a flowchart illustrating the flow of the attribute representation image drawing processing of another variation. The attribute representation image drawing processing of this variation is executed when a wide-area view is displayed at such a scale that individual buildings cannot be visually distinguished. Its flow is substantially the same as that of the variation illustrated in FIG. 12. This processing is also executed immediately after Step S130 in FIG. 6 when the display mode of the 3D map is the night-view mode.
  • When the processing is started, the terminal 10 selects a feature to be processed (Step S600). Then, the terminal 10 sets the corresponding attribute representation image (light source image) on the basis of the attribute information of the feature (Step S610). Then, the terminal 10 arranges the attribute representation image on one plane in accordance with the position coordinates of the feature (Step S620). This state is schematically illustrated in the frame at Step S620. In the illustrated example, for a building, the terminal 10 arranges a circular light source image, sized according to the height of the building, at a position corresponding to the position coordinates of the building. For a road, the terminal 10 arranges circular light source images representing the light of streetlamps at equal intervals along the road, at positions corresponding to the position coordinates of the road. In this drawing, the 3D model of the building is drawn with a broken line to indicate that it is not rendered.
  • Then, the terminal 10 determines whether the attribute representation images have been arranged for all the features to be processed (Step S630). If not all of them have been arranged yet (Step S630: NO), the terminal 10 returns the processing to Step S600. On the other hand, if all of them have been arranged (Step S630: YES), the terminal 10 pastes the attribute representation image texture onto the 3D model (Step S640). At this time, the terminal 10 pastes the texture after the 3D models of the buildings have been removed.
  • Then, the terminal 10 renders the 3D models of the features other than the buildings by the perspective projection method on the basis of the view point position and the line of sight direction set at Step S100 of FIG. 6, and generates a projection view in which the light source images appear superimposed (Step S650). In this projection view, the 3D models of the buildings are not projected, but the light source images corresponding to the buildings are.
  • By means of the processing above, the terminal 10 finishes the attribute representation image drawing processing.
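Reusing the texture builder sketched above for Variation 3, the wide-area pass of FIG. 13 might look like the following. Here paste_texture and render_perspective are hypothetical stand-ins for the terminal's rendering pipeline, not APIs from the specification; the key point is that the building models are dropped at Step S640, so only the light source images represent the buildings.

```python
# Minimal sketch of FIG. 13. `paste_texture` and `render_perspective`
# are hypothetical helpers standing in for the terminal's renderer.

def draw_wide_area_night_view(features, models, extent, camera):
    tex = build_attribute_texture(features, extent)             # S600-S630
    non_buildings = [m for m in models if m["kind"] != "building"]
    paste_texture(non_buildings, tex)                           # S640: buildings removed
    return render_perspective(non_buildings, camera)            # S650
```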
  • F5. Variation 5:
  • Not all of the various processing described in the embodiment and the variations needs to be provided; a part of it may be omitted or replaced by other processing.
  • F6. Variation 6:
  • The embodiment illustrates an example in which the 3D map display system of the present invention is applied to a navigation system, but the system may also be configured as a device for displaying a 3D map, independently of the route search/route guidance function.
  • F7. Variation 7:
  • Part of the processing executed by software in the embodiment may be executed by hardware, and vice versa.
  • INDUSTRIAL APPLICABILITY
  • The present invention can be used for technologies for displaying a 3D map representing a feature three-dimensionally.

Claims (10)

What is claimed is:
1. A three-dimensional map display system for displaying a three-dimensional map representing features thereon three-dimensionally, the system comprising:
a map database for storing drawing data including a three-dimensional model of a feature, and appearance attribute information representing an appearance attribute of the feature, by associating the drawing data and the appearance attribute information with each other;
a projection view generation unit configured to generate a projection view by projecting the drawing data; and
an attribute representation image superimposing unit configured to display an attribute representation image by superimposing the attribute representation image on the projection view based on the appearance attribute information, the attribute representation image including a first attribute representation image representing an appearance for the feature according to the appearance attribute information, the first attribute representation image not being bound by a shape of the feature.
2. The three-dimensional map display system according to claim 1, wherein
the attribute representation image includes the first attribute representation image representing the appearance of an object without physical association with the feature; and
the attribute representation image superimposing unit displays the first attribute representation image so as to protrude from the shape of the feature.
3. The three-dimensional map display system according to claim 1, wherein
the attribute representation image is a two-dimensional image.
4. The three-dimensional map display system according to claim 1, wherein
the feature includes a building;
the appearance attribute information includes information representing a light emission state of the building at night;
the attribute representation image includes a light source image representing light emission of the building at night; and
the attribute representation image superimposing unit displays the attribute representation image in accordance with the light emission state.
5. The three-dimensional map display system according to claim 4, wherein
the appearance attribute information includes information representing a height or a number of floors of the building; and
the attribute representation image superimposing unit displays the attribute representation image with a number or size according to the height or the number of floors of the building.
6. The three-dimensional map display system according to claim 1, wherein
the attribute representation image includes a second attribute representation image representing an appearance of an object physically associated with the feature; and
the attribute representation image superimposing unit displays the second attribute representation image under a constraint condition in accordance with the shape of the feature.
7. The three-dimensional map display system according to claim 6, wherein
the attribute representation image superimposing unit arranges a model for displaying the second attribute representation image on the three-dimensional model prior to generation of the projection view.
8. The three-dimensional map display system according to claim 6, wherein
the feature includes a road;
the attribute representation image is a light source image representing light of a streetlamp arranged on the road; and
the attribute representation image superimposing unit displays the light source image under a constraint condition such that the image is disposed along the road.
9. A three-dimensional map display method for displaying a three-dimensional map representing features thereon three-dimensionally, executed by a computer, the method comprising:
a map database referring step for referring to a map database, the map database storing drawing data including a three-dimensional model of a feature and appearance attribute information representing an appearance attribute of the feature by associating the drawing data and the appearance attribute information with each other;
a projection view generating step for generating a projection view obtained by projecting the drawing data; and
an attribute representation image superimposing step for displaying an attribute representation image by superimposing the attribute representation image on the projection view based on the appearance attribute information, the attribute representation image including a first attribute representation image representing an appearance for the feature according to the appearance attribute information, the first attribute representation image not being bound by a shape of the feature.
10. A computer-readable recording medium recording a computer program for displaying a three-dimensional map representing features thereon three-dimensionally, the computer program causing a computer to execute:
a map database referring function of referring to a map database, the map database storing drawing data including a three-dimensional model of a feature and appearance attribute information representing an appearance attribute of the feature by associating the drawing data and the appearance attribute information with each other;
a projection view generating function of generating a projection view obtained by projecting the drawing data; and
an attribute representation image superimposing function of displaying an attribute representation image by superimposing the attribute representation image on the projection view based on the appearance attribute information, the attribute representation image including a first attribute representation image representing an appearance for the feature according to the appearance attribute information, the first attribute representation image not being bound by a shape of the feature.
US14/964,492 2013-06-11 2015-12-09 3d map display system Abandoned US20160098859A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013122704A JP5959479B2 (en) 2013-06-11 2013-06-11 3D map display system
JP2013-122704 2013-06-11
PCT/JP2014/064661 WO2014199859A1 (en) 2013-06-11 2014-06-03 3d map display system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/064661 Continuation WO2014199859A1 (en) 2013-06-11 2014-06-03 3d map display system

Publications (1)

Publication Number Publication Date
US20160098859A1 true US20160098859A1 (en) 2016-04-07

Family ID=52022157

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/964,492 Abandoned US20160098859A1 (en) 2013-06-11 2015-12-09 3d map display system

Country Status (6)

Country Link
US (1) US20160098859A1 (en)
EP (1) EP3009988A4 (en)
JP (1) JP5959479B2 (en)
KR (1) KR20160019417A (en)
CN (1) CN105283734A (en)
WO (1) WO2014199859A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110148219A (en) * 2019-04-03 2019-08-20 南昌云虫科技有限公司 The creation method of 3D model
US11199940B1 (en) * 2020-04-21 2021-12-14 Corel Corporation Three-dimensional operations based on planar projections in graphic user interfaces

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110059151B (en) * 2019-04-26 2022-10-25 北京百度网讯科技有限公司 Map rendering method, map rendering device, map server, and storage medium
US20210180960A1 (en) * 2019-12-17 2021-06-17 GM Global Technology Operations LLC Road attribute detection and classification for map augmentation
CN113037829A (en) * 2021-03-03 2021-06-25 读书郎教育科技有限公司 System and method for precisely positioning residential district

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6362817B1 (en) * 1998-05-18 2002-03-26 In3D Corporation System for creating and viewing 3D environments using symbolic descriptors
US20060132482A1 (en) * 2004-11-12 2006-06-22 Oh Byong M Method for inter-scene transitions
US20070105626A1 (en) * 2005-08-19 2007-05-10 Nintendo Software Technology Corporation Touch screen inputs for a video game system
US20070172120A1 (en) * 2006-01-24 2007-07-26 Nokia Corporation Compression of images for computer graphics
US7307637B1 (en) * 2004-09-27 2007-12-11 White Rabbit 3D Llc Method and apparatus for identifying pixel position and geometry in 3D systems
US20080150956A1 (en) * 2004-08-20 2008-06-26 Shima Seiki Manufacturing, Ltd. Mapping Device, Mapping Method and Program Thereof
US20130057550A1 (en) * 2010-03-11 2013-03-07 Geo Technical Laboratory Co., Ltd. Three-dimensional map drawing system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3015353B2 (en) * 1997-12-05 2000-03-06 株式会社ウォール Three-dimensional city map database generating device and recording medium recording a program therefor
JP3792437B2 (en) * 1999-06-08 2006-07-05 三菱電機株式会社 Car information system
JP2002267464A (en) * 2001-03-08 2002-09-18 Alpine Electronics Inc Navigation system
JP2002287616A (en) * 2001-03-28 2002-10-04 Mitsubishi Electric Corp Simulation map providing system and simulation map providing method
JP2002298162A (en) * 2001-03-29 2002-10-11 Mitsubishi Electric Corp Three-dimensional view display device and three- dimensional view generating device
JP2003162217A (en) * 2001-11-26 2003-06-06 Nec Corp Map information display system, portable radio terminal and server
KR100520708B1 (en) * 2003-10-20 2005-10-14 엘지전자 주식회사 Method for displaying three dimensional map
EP1750238A4 (en) * 2004-03-31 2007-06-13 Pioneer Corp Map creation device and navigation device
JP4436186B2 (en) * 2004-05-12 2010-03-24 アルパイン株式会社 Navigation device and map display method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mark Segal, Carl Korobkin, Rolf van Widenfelt, Jim Foran, and Paul Haeberli. 1992. Fast shadows and lighting effects using texture mapping. SIGGRAPH Comput. Graph. 26, 2 (July 1992), 249-252. *


Also Published As

Publication number Publication date
EP3009988A4 (en) 2017-03-01
JP2014241023A (en) 2014-12-25
WO2014199859A1 (en) 2014-12-18
JP5959479B2 (en) 2016-08-02
KR20160019417A (en) 2016-02-19
EP3009988A1 (en) 2016-04-20
CN105283734A (en) 2016-01-27

Similar Documents

Publication Publication Date Title
CN102183262B (en) The 3D route guidance that individualized and situation is responded to
US20160098859A1 (en) 3d map display system
CN103793452B (en) Map browser and method
US9500496B2 (en) Apparatus, method and computer program for spatially representing a digital map section
CN101162556B (en) Insertion of static elements in digital maps
US20130057550A1 (en) Three-dimensional map drawing system
JP4842677B2 (en) Topographic model forming system and topographic model manufacturing method
JP2005326154A (en) Navigation system and map display method
CN106127853A (en) A kind of unmanned plane Analysis of detectable region method
CN102692228A (en) Landmark icons in digital maps
JPH10207351A (en) Navigation system and medium which stores navigation program using the system
CN105823475A (en) Method of three-dimensional representation of a scene
JP6022386B2 (en) 3D map display device, 3D map display method, and computer program
US20090019382A1 (en) Systems and methods for side angle radar training and simulation
JP6764990B1 (en) Display media, processing equipment and processing programs
US10416548B1 (en) Individually angled mirror array system specialty effects
JP7031043B1 (en) Display media, processing equipment, programs and computer-readable recording media on which the programs are recorded
JP6899476B1 (en) Display media, processing equipment and programs
JP5997580B2 (en) 3D natural feature drawing system
JP2005338748A (en) Navigation apparatus and map display method
CN115981580A (en) Display method and device, storage medium, electronic equipment and vehicle
JP2007170824A (en) Vehicle-mounted map display device
JP2016126668A (en) Three-dimensional map display system
JP2007171229A (en) Map display apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: GEO TECHNICAL LABORATORY CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KISHIKAWA, KIYONARI;TESHIMA, EIJI;ARAMAKI, MASATOSHI;AND OTHERS;SIGNING DATES FROM 20151203 TO 20151207;REEL/FRAME:037265/0062

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION