US20100138153A1 - Navigation device and image management method - Google Patents

Navigation device and image management method

Info

Publication number
US20100138153A1
US20100138153A1 (application US12/596,722)
Authority
US
United States
Prior art keywords
mobile body
image
unit
intersection
navigation device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/596,722
Other versions
US8374779B2 (en)
Inventor
Yuichi Abe
Ryo Watanabe
Kazuhiro Yamane
Tomoyuki Yoshimura
Tadashi Sakai
Toshiyuki Irie
Tomoaki Maeta
Kazuya Murakami
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sanyo Electric Co Ltd
Original Assignee
Sanyo Consumer Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2007199410A (patent JP4900599B2)
Priority claimed from JP2007248510A (patent JP4841527B2)
Application filed by Sanyo Consumer Electronics Co Ltd filed Critical Sanyo Consumer Electronics Co Ltd
Assigned to SANYO ELECTRIC CO., LTD. and SANYO CONSUMER ELECTRONICS CO., LTD. Assignment of assignors' interest (see document for details). Assignors: ABE, YUICHI; IRIE, TOSHIYUKI; MAETA, TOMOAKI; MURAKAMI, KAZUYA; SAKAI, TADASHI; WATANABE, RYO; YAMANE, KAZUHIRO; YOSHIMURA, TOMOYUKI
Publication of US20100138153A1 publication Critical patent/US20100138153A1/en
Application granted granted Critical
Publication of US8374779B2 publication Critical patent/US8374779B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G01C 21/3647: Guidance involving output of stored or live camera images or video streams
    • G01C 21/3602: Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G01C 21/3667: Display of a road map
    • G09B 29/10: Map spot or coordinate position indicators; Map reading aids

Definitions

  • The present invention relates to a navigation device that displays an image.
  • In particular, the present invention relates to a navigation device that stores a captured image of an area to be entered after turning at an intersection and displays the captured image before turning at the intersection.
  • Conventionally, navigation devices that are designed as standard equipment for mobile bodies and portable navigation devices that are designed both for use in any mobile bodies and for use by pedestrians have been commercially available.
  • In addition, mobile-phone terminals having navigation functions have also been commercially available.
  • Such navigation devices provide guidance, by various methods, as to which direction to take at an intersection.
  • In a commonly used method, a route from the current position to a destination is set in advance and, for example, an enlarged image of an intersection is displayed and a sound notification such as “Turn right at the intersection about 50 meters ahead” is given on reaching a predetermined distance from an intersection to turn at.
  • Patent Document 1 discloses another method in which an image of the intersection ahead is captured in response to key operation, the captured image is stored together with the coordinates of the intersection, and the captured image is displayed on the next occasion of approaching the intersection.
  • Patent Document 2 discloses still another method in which an image of the intersection ahead is captured not in response to key operation but in response to operation of a direction indicator, and the captured image is displayed on the next occasion of approaching the intersection.
  • Patent Document 1: JP-A-H09-014976
  • Patent Document 2: JP-A-2006-221362
  • The user also needs to actually make a right or left turn at an intersection in the case where the user has passed through the intersection before, and is therefore sure of recognizing the shop on actually making the turn and seeing the view there, but does not remember whether to turn right or left.
  • The present invention has been made in view of the above-described problems, and an object of the present invention is to provide a navigation device capable of displaying a scene of the area to be entered after turning at an intersection, before the turn is actually made.
  • According to a first aspect of the present invention, a navigation device includes: a first storage unit storing a map; a display unit; and a control unit displaying the map on the display unit, and the navigation device is characterized in that it further comprises a second storage unit storing a captured image of an area to be entered after a turn is made at an intersection, and in that the control unit displays the captured image stored in the second storage unit on the display unit before a mobile body reaches the intersection.
  • According to a second aspect of the present invention, the navigation device according to the above-described first aspect further includes a first image capturing unit capturing an image of the area ahead of the mobile body in the traveling direction of the mobile body, and the navigation device is characterized in that, when the mobile body has made a turn at the intersection, the control unit captures, by the first image capturing unit, an image of the area ahead of the mobile body in the moving direction after the turn to obtain a captured image of the area, and stores the captured image captured by the first image capturing unit in the second storage unit.
  • According to a third aspect of the present invention, the navigation device according to the above-described second aspect further includes a direction sensor detecting the moving direction of the mobile body, and the navigation device is characterized in that the control unit detects, according to the detection result of the direction sensor, the timing for capturing the image.
  • According to a fourth aspect of the present invention, the navigation device according to any one of the above-described first to third aspects further includes a second image capturing unit capturing an image of the area behind the mobile body in the moving direction of the mobile body, and the navigation device is characterized in that the control unit captures, by the second image capturing unit, an image of the area behind the mobile body in the moving direction before the mobile body reaches the intersection to obtain a captured image of the area, and stores the captured image captured by the second image capturing unit in the second storage unit.
  • According to a fifth aspect of the present invention, the navigation device according to the above-described fourth aspect further includes a detecting unit detecting the ON/OFF states of a direction indicator provided in the mobile body, and the navigation device is characterized in that the control unit detects, according to the detection result of the detecting unit, the timing for capturing the image of the area behind the mobile body in the moving direction before the mobile body reaches the intersection.
  • According to a sixth aspect of the present invention, the navigation device according to the above-described fourth or fifth aspect is characterized in that, when the mobile body has not made a turn at the intersection but has gone straight through it, the control unit deletes the captured image of the area behind the mobile body captured by the second image capturing unit and stored in the second storage unit.
  • According to a seventh aspect of the present invention, the navigation device according to any one of the above-described fourth to sixth aspects is characterized in that the control unit switches between ON and OFF modes to activate or deactivate the image capturing operation performed by the second image capturing unit, and an image of the area behind the mobile body in the moving direction is captured only in the ON mode.
  • According to an eighth aspect of the present invention, the navigation device according to any one of the above-described first to seventh aspects is characterized in that, when the mobile body has made a turn at the intersection, the control unit continues to display on the display unit the captured image stored in the second storage unit until the mobile body reaches a predetermined point after making the turn at the intersection.
  • According to a ninth aspect of the present invention, the navigation device according to any one of the above-described first to eighth aspects further includes a play key, and the navigation device is characterized in that the control unit displays, on the display unit, the captured image stored in the second storage unit in response to the play key being operated.
  • According to a tenth aspect of the present invention, the navigation device according to the above-described second or third aspect is characterized in that, when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, the direction of the turn made at the intersection in association with the captured image, and displays, on the display unit, the direction of the turn together with the captured image.
  • According to an eleventh aspect of the present invention, the navigation device according to any one of the above-described fourth to seventh aspects is characterized in that, when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, in association with the captured image, the direction that is directly opposite to the moving direction in which the mobile body travels after making the turn, and displays, on the display unit, that direction together with the captured image.
  • According to a twelfth aspect of the present invention, a navigation device for a mobile body includes: a current position detecting unit detecting a current position; an image capturing unit; an image file storage unit storing data of an image captured by the image capturing unit; a map storage unit storing map information; and a display unit displaying a map image. The navigation device is characterized in that it further comprises: a moving state detecting unit detecting whether the mobile body is in a moving state or in a stopping state; a character string extraction unit extracting, from the map information, at least one of character strings representing a name of a place, a facility, a road, and an intersection at or near the current position; and a file name determination unit determining a file name of data of an image captured by the image capturing unit. It is further characterized in that, when the image capturing unit captures an image, a file name of the data of the captured image is determined by combining a moving state detected by the moving state detecting unit, a character string extracted by the character string extraction unit, and a predetermined character string prepared beforehand.
  • According to a thirteenth aspect of the present invention, the navigation device according to the above-described twelfth aspect is characterized in that the predetermined character string contains either a character string indicating that the mobile body is stopping or a character string indicating that the mobile body is moving.
  • According to a fourteenth aspect of the present invention, the navigation device according to the above-described thirteenth aspect is characterized in that the predetermined character string contains a predetermined character string indicating that the character string extracted by the character string extraction unit from the map information is a character string related to a predetermined area near the current position.
  • According to another aspect of the present invention, an image management method includes the steps of: generating a captured image of an area near a mobile body by an image capturing unit; detecting whether the mobile body is in a moving state or in a stopping state; obtaining current position information by extracting, from map information, at least one of character strings representing a name of a place, a facility, a road, and an intersection at or near the current position; and determining a name of the captured image by combining a predetermined character string prepared beforehand, the moving state, and the current position information.
  • With the navigation device of the first aspect of the present invention, before turning at an intersection, a user can see a captured image of the area ahead in the moving direction after the turn, and can thus avoid making a wrong turn at the intersection.
  • Moreover, the user can see the captured image while waiting at a traffic light.
  • With the navigation device of the second aspect of the present invention, a database of captured images of the areas ahead of a mobile body in its moving direction after making turns at intersections can be built by the navigation device alone.
  • With the navigation device of the third aspect of the present invention, it is possible to detect a turn made at an intersection with higher accuracy than with a GPS, in which signals are received intermittently.
  • With the navigation device of the fourth aspect of the present invention, a captured image of the area behind a mobile body in its moving direction before it reaches an intersection can be captured on an outward trip, and the captured image can be displayed on a return trip.
  • With the navigation device of the sixth aspect of the present invention, it is possible to prevent unnecessary storage of captured images related to an intersection through which a mobile body has gone straight.
  • With the navigation device of the eighth aspect of the present invention, it is possible to compare the current view of an area with the captured image of the area stored in the navigation device.
  • Thus, the user can confirm that the captured image stored in the second storage unit shows the view the user saw when making a turn at the intersection before.
  • The user can do so just by turning in the direction in which the user actually sees the same view as that shown in the captured image.
  • With the navigation device of the tenth aspect of the present invention, since the user, before turning at an intersection, can see not only the captured image of the area ahead in the moving direction after the turn but also the direction of the turn shown together with the captured image, the user can make sure in which direction to turn at the intersection.
  • With the navigation device of the eleventh aspect of the present invention, since the user, before turning at an intersection, can see captured images for both a right turn and a left turn at the intersection, the user can make sure in which direction to turn at the intersection.
  • With the navigation device of the twelfth aspect of the present invention, a file name of the data of a captured image is determined by combining a predetermined character string with a character string representing a name of a place, an intersection, or a road at the image capturing point, or a name of a place or a facility near the image capturing point.
  • The predetermined character strings are set beforehand according to whether the mobile body is in a moving state or in a stopping state, and according to whether the extracted name is that of a location, an intersection, or a road, or that of a place, a facility, or the like near the image capturing point.
  • Thus a file name implies the image capturing place and the state of the mobile body, and this makes it easy to organize a large amount of image data or to find desired image data later.
  • With the image management method of the present invention, an image data management method for the navigation device of the above-described twelfth aspect can be provided.
  • Since the file name is determined by combining a character string representing the name of the image capturing place, or a name related to it, with a predetermined character string selected according to the state of the mobile body, it is easy to organize a large amount of image data or to find desired image data later.
  • FIG. 1 is a block diagram to show a relevant part of a navigation device according to Embodiment 1 of the present invention.
  • FIG. 2 is a flow chart to show operation of a control circuit of the navigation device according to Embodiment 1.
  • FIG. 3 is a diagram to show a memory map of a stored image.
  • FIG. 4 is a diagram to show an example of display of a view.
  • FIG. 5 is a block diagram to show a relevant part of a navigation device according to Embodiment 2 of the present invention.
  • FIG. 6 is a flow chart to show operation of a control circuit of the navigation device according to Embodiment 2.
  • FIG. 7 is a block diagram to show a relevant part of a navigation device according to Embodiment 3 of the present invention that is for use in mobile bodies and provided with an image capturing unit.
  • FIG. 8 is a diagram to show examples of combining, according to the moving state of a mobile body, a character string extracted with respect to an image capturing point with a predetermined character string.
  • FIG. 9 is a flow chart to show a procedure of image data management performed in the navigation device according to Embodiment 3 that is for use in mobile bodies and provided with an image capturing unit.
  • FIG. 1 is a block diagram to show a relevant part of a navigation device according to Embodiment 1 of the present invention.
  • A GPS (global positioning system) positioning unit 11 receives radio waves carrying location information from a plurality of GPS satellites, calculates the current position, and feeds the data of the current position obtained as a result of the calculation to a control circuit 10, which will be described later.
  • A direction sensor 12 detects the direction based on terrestrial magnetism and feeds the detection result to the control circuit 10.
  • A speed sensor 13 detects the moving speed based on tire rotation and acceleration, and feeds the detection result to the control circuit 10.
  • An HDD recording/playback unit 14 has an HDD (hard disk drive), and performs recording/playback of map information 141 and an image (captured image) 142 .
  • The map information 141 includes data of gas stations and landmarks, as well as map data such as that of roads and intersections.
  • A front camera 15 captures an image of the area ahead of the mobile body and feeds the captured image of the area to the control circuit 10.
  • A rear camera 16 captures an image of the area behind the mobile body and feeds the captured image of the area to the control circuit 10.
  • The front and rear cameras 15 and 16 both capture actual still images of views, but they may also be provided with a moving image capturing function.
  • A sound synthesis circuit 17 generates speech for a character string specified by the control circuit 10, and feeds the resulting sound to the control circuit 10.
  • A display unit 18 displays a map or an image on which the current position is superposed.
  • A speaker 19, under control of the control circuit 10, outputs the sound generated by the sound synthesis circuit 17.
  • An operating unit 20 is provided with not only an image capture key 201 for turning the image capture mode ON and OFF but also various keys (not shown) via which to operate the navigation device.
  • The control circuit 10 controls these units according to a program stored in a ROM 21.
  • A RAM 22 stores information necessary for the control circuit 10 to perform its operation.
  • The control circuit 10 retrieves a map stored in the HDD recording/playback unit 14 and displays it on the display unit 18.
  • The control circuit 10 then receives data of the current position from the GPS positioning unit 11. It takes several seconds for the control circuit 10 to start the first display of the current position after it receives the data of the current position, but thereafter the display of the current position is updated every second.
  • The control circuit 10 superposes the received current position on the map displayed on the display unit 18.
  • Specifically, the control circuit 10 displays the current position received from the GPS positioning unit 11 on the map retrieved from the HDD recording/playback unit 14 such that the current position is laid on the road in the map at the point closest to the current position. This is called map matching; a small sketch of it is given below.
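The following is a minimal sketch of the map-matching step just described; the polyline road representation and the function names are assumptions made for this illustration, not the patent's implementation.

```python
# Minimal map-matching sketch: snap a GPS fix to the closest point on any
# road segment. Roads are assumed to be polylines of (x, y) points in a
# local metric frame; this is an illustration, not the patent's code.
import math

def closest_point_on_segment(p, a, b):
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return (ax + t * dx, ay + t * dy)

def map_match(position, roads):
    """Return the point on the road network closest to the GPS position."""
    best, best_dist = None, float("inf")
    for road in roads:                       # each road: list of polyline points
        for a, b in zip(road, road[1:]):
            q = closest_point_on_segment(position, a, b)
            d = math.dist(position, q)
            if d < best_dist:
                best, best_dist = q, d
    return best

# A fix slightly off a straight east-west road snaps onto it:
print(map_match((10.0, 3.0), [[(0.0, 0.0), (100.0, 0.0)]]))  # (10.0, 0.0)
```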
  • For route guidance to be performed, a destination needs to be set first.
  • To set a destination, the map is enlarged, reduced, and scrolled via the operating unit 20, a cursor is set on the point indicating the destination, and the point is registered as the destination.
  • Alternatively, the destination can be set through a search by keyword or by destination type.
  • When information such as an address or a telephone number is known, the destination can also be set by entering such information.
  • The control circuit 10 then finds a route from the current position to the destination that is optimal based on predetermined conditions.
  • The conditions, which are set by the user, include various requirements such as the shortest route, the shortest time, and no toll roads.
  • When the route is set, the control circuit 10 starts route guidance. For example, the control circuit 10 displays the route in a color distinct from the other roads, and also displays the distance from the current position to the destination and an estimated arrival time. When the mobile body comes to a position a predetermined distance from an intersection to turn at, the control circuit 10 outputs through the speaker 19 a sound notification such as “Turn right at the intersection about 50 meters ahead” generated by the sound synthesis circuit 17.
  • FIG. 2 is a flow chart to show image capturing and display processes performed by the control circuit 10 of Embodiment 1.
  • The control circuit 10 stores a variable A in a register disposed inside it.
  • When A is 0, the image capture mode is OFF, and when A is 1, the image capture mode is ON.
  • The initial value of the variable A is 0 (that is, the image capture mode is OFF) (step S1).
  • Every time the image capture key 201 is pressed (Yes in step S2), the control circuit 10 accordingly changes the value of the variable A (steps S3 to S5), thereby turning the image capture mode ON or OFF.
  • While route guidance is being performed (Yes in step S6), when the mobile body comes to a position that is 10 meters from the intersection to turn at (Yes in step S7), the control circuit 10 captures a still image (actual image) of the view behind the mobile body with the rear camera 16, and stores the resulting image in the HDD recording/playback unit 14 using an image management method that will be described later (step S8).
  • The captured image of the view behind the mobile body is displayed when the mobile body later travels in the direction opposite to the current moving direction.
  • The entry and exit directions, which are the directions used in image display, are accordingly stored reversed, corresponding to travel in the direction opposite to the current moving direction.
  • For example, an image of the view behind the mobile body captured in making a right turn from south to east at an intersection is the same as an image of the view ahead of the mobile body captured in making a left turn from east to south at the same intersection, as in the sketch below.
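The bookkeeping behind this example can be sketched as follows; storing the directions as the sides of the intersection ("from south, to east") is an assumption made for this illustration.

```python
# Tiny sketch of the entry/exit direction bookkeeping: on the return trip,
# the same corner is traversed with the entry and exit sides swapped.

def reversed_turn(entry_side: str, exit_side: str) -> tuple[str, str]:
    """Entry/exit pair seen when the same turn is traveled in reverse."""
    return exit_side, entry_side

# A right turn from south to east, traveled in reverse, is a left turn from
# east to south, matching the example in the text.
assert reversed_turn("south", "east") == ("east", "south")
```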
  • An image with an older capturing date and time is updated with an image with a newer capturing date and time.
  • While route guidance is being performed (Yes in step S6), when the mobile body has finished making a turn at an intersection following the route (Yes in step S9), the control circuit 10 captures a still image of the view ahead of the mobile body with the front camera 15, and stores the captured image in association with the position information of the intersection in the HDD recording/playback unit 14 (step S10).
  • The navigation device 1 judges, from the detection result of the direction sensor 12, that the mobile body has made a turn at the intersection and has reached an image capturing point.
  • The captured image of the view ahead of the mobile body is displayed the next time the mobile body comes to the intersection.
  • As shown in FIG. 3, in storing an image, the image information of the view, the image capturing date and time, and the entry and exit directions are stored in association with the position information of the intersection.
  • Completion of a turn is judged based on the current position, which the direction sensor 12 and the speed sensor 13 obtain by interpolating the intermittent position detection of the GPS positioning unit 11; instead, completion of a turn may be judged based on a change in the moving direction detected by the direction sensor 12. This makes it possible to detect a turn made at an intersection with higher accuracy than with a GPS, in which signals are received intermittently.
  • On the other hand, when no route guidance is being performed (No in step S6), if the variable A is 1 (Yes in step S11), then, when the mobile body comes to a position that is 10 meters from the next intersection while moving (Yes in step S12), the control circuit 10 captures an image of the area behind the mobile body and temporarily stores the captured image in association with the position information of the intersection in the RAM 22 (step S13). If the mobile body does not make a turn at the intersection where the image of the area behind it was captured but goes straight through (Yes in step S14), the control circuit 10 deletes the temporarily stored image without storing it in the HDD recording/playback unit 14 (step S15). This helps prevent unnecessary storage of records.
  • When, instead, the mobile body completes a turn at the intersection where the image of the area behind it was captured (Yes in step S16), the control circuit 10 stores that image in the HDD recording/playback unit 14 by a management method that will be described later (step S17). Also, the control circuit 10 captures a still image of the view ahead of the mobile body with the front camera 15, and stores the captured image by the same management method (step S18). Then, the control circuit 10 sets the variable A back to 0 (step S19) to turn the image capture mode OFF, so as to allow the user to newly set an intersection where the user wishes to have an image captured.
  • The control circuit 10 continues to display an image of a view stored in the HDD recording/playback unit 14 from the time the mobile body reaches a position 30 meters from the intersection where the image was captured until the mobile body reaches the intersection (steps S20 to S23). A sketch of this capture-and-display loop is given below.
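The following is a minimal sketch of this Embodiment 1 loop; the camera, sensor, and storage interfaces (nav.rear_camera, nav.hdd, and so on) are hypothetical stand-ins, and only the distances follow the text.

```python
# Sketch of the FIG. 2 capture/display loop under assumed interfaces.
CAPTURE_DIST_M = 10   # rear capture before the intersection (steps S7-S8)
DISPLAY_DIST_M = 30   # start displaying the stored view (steps S20-S23)

def on_position_update(nav, pos):
    ix = nav.next_turn_intersection()           # set by route guidance
    if nav.guidance_active and ix is not None:
        if nav.distance_to(ix) <= CAPTURE_DIST_M and not ix.rear_captured:
            img = nav.rear_camera.capture_still()
            nav.hdd.store(ix.position, img)     # steps S7-S8, directions reversed
            ix.rear_captured = True
        if nav.turn_completed(ix):              # judged via the direction sensor
            img = nav.front_camera.capture_still()
            nav.hdd.store(ix.position, img)     # steps S9-S10
    stored = nav.hdd.lookup_near(pos)
    if stored and nav.distance_to(stored.intersection) <= DISPLAY_DIST_M:
        nav.display.show(stored.image)          # steps S20-S23
```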
  • FIG. 4 shows an example of how an image is displayed.
  • When the mobile body reaches a position 30 meters from an intersection for which an image is stored in the HDD recording/playback unit 14, the image starts to be displayed.
  • The display includes two images, left and right: an image for a left turn is displayed at the top left with the characters “left turn” superposed on it, and an image for a right turn is displayed at the top right with the characters “right turn” superposed on it.
  • When a left image alone is displayed, it means that the user has previously traveled in the left-turn direction.
  • This effect can also be obtained by indicating the turning direction without displaying the characters representing right or left, that is, just by the placement of the images, for example, by displaying the image for a right turn on the right side and the image for a left turn on the left side. Alternatively, right and left may be announced by sound.
  • As described above, before a mobile body makes a turn at an intersection, an image of the view behind it is captured and stored (steps S8 and S21; steps S17 and S21), and after the mobile body makes the turn, an image of the view ahead is captured and stored (steps S10 and S21; steps S18 and S21). The stored views (captured images of the vicinity of the intersection) are then displayed before the mobile body makes a turn at the intersection the next time.
  • Note that an image provided by a third party via a DVD-RAM or the like may also be used as a captured image of a view near an intersection.
  • However, the provision of the front and rear cameras 15 and 16 allows the user of the mobile body to build a database of views as in Embodiment 1, and this makes it possible to store views (captured images) that are more suitable for the user.
  • Moreover, a view behind a mobile body is stored before the mobile body makes a turn at an intersection (steps S8 and S21; steps S17 and S21), and the stored view can be displayed, before the mobile body actually turns at the intersection, as a view of the area to be entered after the turn.
  • In other words, the image of the view behind the mobile body captured on an outward trip can be used on a return trip.
  • Image capturing is switched between its ON and OFF modes via the image capture key 201, and the capturing of an image of the area behind the mobile body is performed only in the ON mode (Yes in step S11). This allows the user to specify the intersection where an image of the view behind the mobile body should be captured, and thus unnecessary capturing can be prevented.
  • During route guidance (Yes in step S6), the above-described capturing of an image of the view behind the mobile body is performed with respect to an intersection specified in the route guidance as one to turn at. This makes it possible to show the user, as additional information in the next route guidance, a view of the area the user will enter after making a turn at the intersection.
  • FIG. 5 is a block diagram to show Embodiment 2. Components similar to those in FIG. 1 to show Embodiment 1 are identified by the same reference numbers, and descriptions thereof will be omitted.
  • The structure of Embodiment 2 shown in FIG. 5 further includes a direction indicator detecting unit 30 and a play key 202 that is provided in the operating unit 20.
  • The direction indicator detecting unit 30 detects right/left turn operation of the direction indicator and feeds its detection result to the control circuit 10.
  • The play key 202 is a key for displaying, whenever the user wishes, a view of the area to be entered after making a turn at the next intersection.
  • FIG. 6 is a flow chart to show image capturing/display processing performed by the control circuit 10 of Embodiment 2.
  • The processing in steps S1 to S4 in FIG. 6 is the same as that in steps S1 to S4 in FIG. 2 showing Embodiment 1, and the description thereof will be omitted.
  • When route guidance is being performed (Yes in step S31) or when the variable A is 1 (the image capture mode is ON) (Yes in step S32), if the direction indicator is turned from OFF to ON (Yes in step S33), the control circuit 10 captures a still image (captured image) of the view behind the mobile body with the rear camera 16 and stores the captured image in the HDD recording/playback unit 14 by a management method that will be described later (step S34).
  • Here, whether the turn is to be made to the right or to the left may be judged according to the direction indicator, instead of according to the GPS positioning unit 11 or the direction sensor 12.
  • Likewise, when route guidance is being performed (Yes in step S31) or when the variable A is 1 (that is, the image capture mode is ON) (Yes in step S32), if the direction indicator is turned from ON to OFF (Yes in step S35), the control circuit 10 captures a still image of the view ahead of the mobile body with the front camera 15 and stores the captured image in the HDD recording/playback unit 14 by a management method that will be described later (step S36).
  • Here too, whether the turn has been made to the right or to the left can be judged according to the direction indicator, instead of according to the GPS positioning unit 11 or the direction sensor 12.
  • If the variable A is 1 (Yes in step S37), the variable A is set back to 0 (step S38) to turn the image capture mode OFF, so as to allow the user to newly set an intersection where the user wishes to have an image captured.
  • The control circuit 10 then continues to play the stored image (steps S39 to S42). At this time, whether the image is related to a right turn or a left turn is also displayed. If the play key 202 is operated (Yes in step S43), the control circuit 10 starts the display operation in response and continues it until the mobile body has moved 20 meters after turning at the intersection (steps S39 to S42). At this time as well, whether the image is related to a right turn or a left turn is displayed.
  • As described above, in Embodiment 2, the direction indicator detecting unit 30 is provided to detect whether the direction indicator is ON or OFF, and the timing for capturing an image of the area behind the mobile body before it makes a turn at an intersection is detected based on the detection result of the direction indicator detecting unit 30 (step S33).
  • With the navigation device of Embodiment 1, in capturing an image of the view not ahead of but behind a mobile body, since the capturing needs to be performed before the mobile body starts making the turn and it is impossible to know beforehand whether the mobile body is going to turn at a given intersection, the capturing needs to be performed at every intersection.
  • In Embodiment 2, by contrast, unnecessary image capturing can be prevented by detecting the operation of the direction indicator, as in the sketch below.
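A minimal sketch of this Embodiment 2 trigger logic follows; the camera and storage interfaces are assumed stand-ins, and only the edge conditions follow the text.

```python
# Sketch of FIG. 6: the rear view is captured on the direction indicator's
# OFF-to-ON edge (steps S33-S34), the front view on its ON-to-OFF edge
# (steps S35-S36), so only intersections actually turned at are recorded.

class IndicatorTrigger:
    def __init__(self, nav):
        self.nav = nav
        self.prev_on = False  # last known state of the direction indicator

    def on_indicator_state(self, now_on: bool):
        active = self.nav.guidance_active or self.nav.capture_mode_on
        if active and now_on and not self.prev_on:      # OFF -> ON: turn begins
            img = self.nav.rear_camera.capture_still()
            self.nav.hdd.store(self.nav.current_position(), img)
        elif active and self.prev_on and not now_on:    # ON -> OFF: turn done
            img = self.nav.front_camera.capture_still()
            self.nav.hdd.store(self.nav.current_position(), img)
        self.prev_on = now_on
```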
  • FIG. 7 is a block diagram to show the structure of a navigation device 1 of Embodiment 3 of the present invention that is for use in a mobile body and provided with an image capturing unit.
  • The navigation device 1 is provided with a control unit 110, a current position detecting unit 111, a map storage unit 112, a route search unit 113, an image capturing unit 114 that is composed of a CCD camera or the like, a character string extraction unit 115, an input unit 116, a display unit 117, an image file storage unit 118, a moving state detecting unit 119, and a file name determination unit 120.
  • The control unit 110 is composed of a processor comprising a CPU, a RAM, and a ROM, and controls the operation of each unit of the navigation device 1 according to a control program stored in the ROM.
  • The current position detecting unit 111 is composed of, for example, a GPS receiver; it receives radio waves containing time information from a plurality of GPS satellites revolving around the earth and calculates current position information based on the received radio waves.
  • Alternatively, the current position detecting unit 111 may use a distance sensor and a direction sensor. In that case, the distance and direction traveled by the mobile body are each detected, and the detected values are accumulated with respect to a standard position, whereby the current position is calculated, as in the dead-reckoning sketch below.
  • This current position detection method, adopted together with GPS reception, is particularly advantageous in tunnels, where GPS radio waves cannot be received, and in areas of high-rise buildings, where errors are apt to occur.
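A minimal dead-reckoning sketch of this accumulation follows; the per-step samples and the flat local frame are assumptions made for the illustration.

```python
# Accumulate (distance, heading) samples from a known standard position.
import math

def dead_reckon(start_xy, steps):
    """start_xy: (x, y) of the standard position in meters.
    steps: iterable of (distance_m, heading_deg); heading 0 = north, 90 = east."""
    x, y = start_xy
    for distance_m, heading_deg in steps:
        rad = math.radians(heading_deg)
        x += distance_m * math.sin(rad)  # east component
        y += distance_m * math.cos(rad)  # north component
    return x, y

# 100 m north, then 50 m east of the standard position:
print(dead_reckon((0.0, 0.0), [(100, 0), (50, 90)]))  # approx. (50.0, 100.0)
```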
  • The map storage unit 112 stores road data containing road node data and road link data.
  • A node is a connection point of roads, such as an intersection or a branch point, and a link is a route between nodes.
  • The road node data contains, for example, data of reference numerals given to road nodes, position coordinates of road nodes, the number of links connected to each road node, names of intersections, and the like.
  • The road link data contains, for example, data of the reference numerals of the road nodes that are the start point and the end point of each link, road types, lengths of links (link costs), time distances, the number of lanes each link has, widths of the road, and the like.
  • The road link data also contains data of link attributes such as a bridge, a tunnel, a crossing, and a tollgate.
  • The road type data is information indicating whether a link is a freeway or a toll road, whether it is a national road or a prefectural road, and the like. A sketch of these data structures is given below.
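The road data just described can be sketched as records like the following; the field names are assumptions, since the text does not specify the storage format of the map storage unit 112.

```python
# Assumed record layout for the road node and road link data described above.
from dataclasses import dataclass, field

@dataclass
class RoadNode:
    node_id: int                        # reference numeral given to the node
    position: tuple[float, float]       # position coordinates
    connected_links: list[int] = field(default_factory=list)
    intersection_name: str | None = None

@dataclass
class RoadLink:
    link_id: int
    start_node: int                     # reference numeral of the start node
    end_node: int                       # reference numeral of the end node
    road_type: str                      # e.g. "freeway", "toll", "national"
    length_m: float                     # link length (link cost)
    time_distance_s: float
    lanes: int
    width_m: float
    attributes: set[str] = field(default_factory=set)  # e.g. {"bridge", "tunnel"}
```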
  • The map storage unit 112 further stores background data containing, for example, water system data such as data of coastlines, lakes, and river shapes, administrative border data, and facility data containing the position, shape, and name of each facility.
  • The map storage unit 112 may store, in addition to the road data and the background data, map image data stored in a vector form for the purpose of achieving easy-to-read map display.
  • In map display, a map of a predetermined range including the current position of the navigation device 1 is extracted from the map storage unit 112, and the above-mentioned road data, background data, and map image data are displayed together with the map on the display unit 117, with a current position mark indicating the current position and an image of the guidance route superposed thereon.
  • The route search unit 113 refers to the road data stored in the map storage unit 112 and searches for an optimal route from the starting point to the destination.
  • This search for the optimal route is performed as follows. First, links and nodes between the road node corresponding to the current position or the starting point specified by the user and the road node corresponding to the destination specified by the user are searched for by various methods such as the Dijkstra method. Then, the lengths of the links (link costs), time distances, and the like are accumulated to obtain a total link length and a total time distance. A route of the shortest total link length, the shortest total time distance, or the like is then selected as the guidance route, and the road nodes and links along the route are provided as guidance route data. A sketch of such a search over the road data is given below.
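The following is a minimal sketch of such a search, using the Dijkstra method over the link records sketched above with link length as the cost; treating every link as bidirectional is an assumption made for the illustration.

```python
# Dijkstra search over road links, accumulating link lengths (link costs);
# the route search unit 113 could equally accumulate time distances.
import heapq

def dijkstra_route(links, start_id, goal_id):
    """links: iterable of RoadLink; returns the node-id path of least total length."""
    adjacency = {}                                  # node_id -> [(cost, neighbor)]
    for link in links:
        adjacency.setdefault(link.start_node, []).append((link.length_m, link.end_node))
        adjacency.setdefault(link.end_node, []).append((link.length_m, link.start_node))
    best = {start_id: 0.0}
    prev = {}
    heap = [(0.0, start_id)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == goal_id:
            break
        if cost > best.get(node, float("inf")):
            continue                                # stale queue entry
        for edge_cost, neighbor in adjacency.get(node, []):
            new_cost = cost + edge_cost
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                prev[neighbor] = node
                heapq.heappush(heap, (new_cost, neighbor))
    if goal_id not in best:
        return None                                 # no route found
    path, node = [goal_id], goal_id
    while node != start_id:
        node = prev[node]
        path.append(node)
    return path[::-1]
```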
  • The image capturing unit 114 is composed of a CCD camera or the like, and captures desired images such as images of nearby areas in response to an image capture button being operated.
  • The image capture button is included in the input unit 116.
  • The file name determination unit 120 determines a file name for the data of a captured image, and the data together with the file name is stored in the image file storage unit 118.
  • The user can retrieve a desired image file from the image file storage unit 118 and transfer it to a portable storage medium, which the user can later connect to a personal computer or a printing device to process or output the data.
  • The file name determination unit 120 uses the capturing date and time of an image captured by the image capturing unit 114 as a default value in determining the file name of the data of the captured image.
  • Also, a serial number indicating the order in which images are captured may be used in the file name. The serial number is reset at the start of each new image capturing date and is incremented with each image captured.
  • The moving state detecting unit 119 detects the moving state of the mobile body, that is, whether the mobile body is in a moving state or in a stopping state, based on outputs from sensors provided in the mobile body, such as an acceleration sensor, a steering angle sensor, an ignition sensor, and the like.
  • The character string extraction unit 115 extracts a character string from the map data based on the current position detected by the current position detecting unit 111 and the map data.
  • The character string extracted by the character string extraction unit 115 specifies the position at which the mobile body is located or an area around that position, for example, an area within a 100-meter radius of the position, and represents a name of a place, a road, an intersection, a facility, or the like near the current position of the mobile body.
  • With the navigation device 1 of the present invention, data of an image captured by the image capturing unit 114 can thus be given a file name based on the capturing place of the image and the moving state of the mobile body at the time of the capture, by using the moving state detected by the moving state detecting unit 119 and a character string representing a name of a place, a road, or the like extracted by the character string extraction unit 115.
  • Specifically, the file name determination unit 120 is fed with data of the state of the mobile body detected by the moving state detecting unit 119 and data of a character string representing a name of a place, a bridge, or the like extracted by the character string extraction unit 115.
  • The file name determination unit 120 then determines a file name for the image data in the following manner based on these inputs, and stores the image data in the image file storage unit 118.
  • The file name determination unit 120 selects, from among predetermined character strings, according to whether the mobile body is in a moving state or in a stopping state, a character string to be included in the file name. For example, when the mobile body is not moving, a predetermined character string such as “at” or “near” is used: “at” is used when the mobile body is stopping and the character string extraction unit 115 extracts a facility name that specifies the position; “near” is used when the mobile body is stopping and the character string extraction unit 115 extracts a name of a place or a facility near the position.
  • When the mobile body is moving, a predetermined character string such as “moving along” or “moving near” is used: “moving along” is used when the mobile body is moving and the character string extraction unit 115 extracts the name of the road along which the mobile body is moving; “moving near” is used when the mobile body is moving and the character string extraction unit 115 extracts a name of a place, a facility, or the like near the position.
  • A file name of the data of an image captured by the image capturing unit 114, determined as described above by combining a character string representing the name of the capturing point with a predetermined character string selected according to the state of the mobile body, makes it easy to later organize a large amount of image data, or to find a desired piece of image data among it, while recalling memories of a trip.
  • FIG. 8 is a diagram to show the relation among the state of the mobile body, the character string extraction result, and the predetermined character strings in file name determination.
  • The predetermined character string is selected and added to the file name as shown in FIG. 8, according to the state of the mobile body, that is, whether it is in a moving state or in a stopping state, and according to the character string extracted by the character string extraction unit 115, which represents a name of a place, a facility, a road, or an intersection that specifies the image capturing point, or a name of a place or a facility near the image capturing point.
  • The file name determination unit 120 determines the combination of a character string and a predetermined character string as shown in FIG. 8, and gives the determined combination to the data of the captured image as its file name. A sketch of this combination rule is given below.
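The FIG. 8 combination rule can be sketched as a small lookup; the state and extraction-kind labels are assumptions made for this illustration, while the English character strings follow the text.

```python
# Predetermined character string chosen from the moving/stopping state and
# the kind of name extracted, then joined with the extracted character string.
PREDETERMINED = {
    ("stopping", "at_position"): "at",           # facility name specifying the spot
    ("stopping", "near_position"): "near",       # place/facility near the spot
    ("moving", "road"): "moving along",          # road being traveled
    ("moving", "near_position"): "moving near",  # place/facility nearby
}

def make_file_name(state: str, extraction_kind: str, extracted: str) -> str:
    """e.g. make_file_name("moving", "road", "Route 1") -> "moving along Route 1"."""
    return f"{PREDETERMINED[(state, extraction_kind)]} {extracted}"
```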
  • When the character string extraction unit 115 extracts no character string, the file name determination unit 120 adds information of the current position (latitude, longitude, and the like) to the image capturing date and time, which is the default value, and gives the result to the captured image data as its file name.
  • A serial number of the image may also be added to the file name in addition to the above-mentioned character string and predetermined character string.
  • FIG. 9 is a flow chart to show the procedure of image data management performed in the navigation device 1 according to the above-described embodiments.
  • The control unit 110 detects activation of the camera serving as the image capturing unit 114 (step S101) and, in step S102, detects whether image capturing has been performed by monitoring actuation of a shutter, a flash, or the image capture button (of the input unit 116).
  • When image capturing is detected (Yes in step S102), the captured image data is stored in a temporary storage unit (not shown) such as a RAM (step S103).
  • In step S104, the moving state detecting unit 119 judges whether the mobile body is in a moving state or in a stopping state.
  • When the mobile body is in a stopping state, the process proceeds to step S105, in which information of the current position detected by the current position detecting unit 111 is obtained.
  • When the mobile body is in a moving state, the process proceeds to step S109, in which information of the current position detected by the current position detecting unit 111 is obtained.
  • In the stopping state, the character string extraction unit 115 refers to the map data including the current position to extract a character string representing a name of a place, a facility, an intersection, a road, or the like at the current position of the mobile body, that is, at the position of the image capturing performed by the image capturing unit 114, and sends the extracted character string to the file name determination unit 120.
  • When the character string extraction unit 115 fails to extract a character string representing a name of a place, a facility, an intersection, a road, or the like at the current position of the mobile body, it extracts a character string representing such a name found in an area within a predetermined distance from the current position, and sends the extracted character string to the file name determination unit 120.
  • In step S106, it is judged whether or not the character string extraction unit 115 has performed the above-described extraction of a character string.
  • When no character string has been extracted, the file name determination unit 120 is informed to that effect, and it then determines the file name of the image data temporarily stored in step S103 by using, as the default value, information of the image capturing date and time (a serial number may be added) and information of the current position, and the image data is stored in the image file storage unit 118 (step S107).
  • When a character string has been extracted, the file name determination unit 120 determines the file name of the image data temporarily stored in step S103 by combining the character string extracted by the character string extraction unit 115 with one of the predetermined character strings “at” and “near” selected according to the extracted character string, and the image data is stored in the image file storage unit 118 (step S108).
  • In the moving state, the character string extraction unit 115 refers to the map data including the current position to extract a character string representing a name of a road or the like that specifies the current position of the mobile body, that is, the position at which the image capturing was performed by the image capturing unit 114, and sends the extracted character string to the file name determination unit 120.
  • When the character string extraction unit 115 fails to extract a character string representing a name of a road, or a name of a place, a facility, an intersection, or the like at the current position of the mobile body, it extracts a character string representing such a name in an area within a predetermined distance from the current position, and sends the extracted character string to the file name determination unit 120.
  • In step S110, it is judged whether or not the character string extracted by the character string extraction unit 115 represents a name of a place, a facility, or the like near the current position.
  • If so, the process proceeds to step S111, in which the file name determination unit 120 determines the file name of the image data by adding the predetermined character string “moving near” shown in FIG. 8 to the character string representing the name of the place or the like extracted by the character string extraction unit 115, and the image data is stored in the image file storage unit 118.
  • Otherwise, in the judgment processing in step S112, it is judged whether or not a character string representing a name of a road has been extracted.
  • If so, the process proceeds to step S113, in which the file name determination unit 120 determines the file name of the image data by adding the predetermined character string “moving along” shown in FIG. 8 to the character string representing the name of the road, and the image data is stored in the image file storage unit 118.
  • When, in the judgment processing performed in step S112, no character string representing a name of a road has been extracted, the process proceeds to step S107, in which the file name determination unit 120 determines the file name of the image data temporarily stored in step S103 by using, as the default value, the image capturing date and time (a serial number may be added) and information of the current position, stores the image data in the image file storage unit 118, and the process is completed. A sketch of this whole decision flow is given below.
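The decision flow of FIG. 9 can be sketched as follows, reusing the lookup idea above; extract() is an assumed stand-in for the character string extraction unit 115, returning the kind of name found and the string, or None when nothing could be extracted at or near the current position.

```python
# Sketch of steps S104-S113: fall back to the date/time default (step S107)
# whenever no usable character string is available.
from datetime import datetime

def determine_file_name(moving, extract, position, serial):
    lat, lon = position
    default = f"{datetime.now():%Y%m%d_%H%M%S}_{serial}_{lat:.5f}_{lon:.5f}"
    found = extract(position)                 # steps S105/S109
    if found is None:
        return default                        # step S107
    kind, text = found
    if not moving:                            # stopping branch, step S108
        prefix = "at" if kind == "at_position" else "near"
    elif kind == "near_position":             # step S111
        prefix = "moving near"
    elif kind == "road":                      # step S113
        prefix = "moving along"
    else:
        return default                        # step S112 -> step S107
    return f"{prefix} {text}"
```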
  • In this way, data of an image captured by the image capturing unit 114 is stored under a file name that includes a predetermined character string selected according to the state of the mobile body and the character string that the character string extraction unit 115 has extracted from the map data based on the current position.
  • In the present invention, display of a stored view is continued until the mobile body reaches a predetermined point after it makes a turn at an intersection. This makes it possible to compare the current view with the stored view.
  • Also, the play key 202 is provided in the present invention, and when the play key 202 is operated, a stored image of a view of an intersection is displayed. This makes it possible to display, before the user actually makes a turn at an intersection, an image of the view to be seen after turning at the intersection, at any timing the user desires.
  • The present invention is also applicable to commercially available mobile-phone terminals equipped with a camera and navigation functions.
  • The present invention is applicable to camera-equipped navigation devices for use in mobile bodies, portable navigation devices provided with a camera, and mobile-phone terminals having navigation and camera functions.

Abstract

A navigation device which stores a captured image of an area ahead of a user in the moving direction after turning at an intersection, and displays the captured image before the turn is made at the intersection, includes: a first storage unit (HDD recording/playback unit 14) storing a map; a display unit (display unit 18); and a control unit (control circuit 10) displaying the map on the display unit. The navigation device further includes a second storage unit (HDD recording/playback unit 14) storing a captured image of an area to be entered after a turn is made at an intersection, and the control unit displays the captured image stored in the second storage unit on the display unit before a mobile body reaches the intersection.

Description

    TECHNICAL FIELD
  • The present invention relates to a navigation device that displays an image. In particular, the present invention relates to a navigation device that stores a captured image of an area to be entered after turning at an intersection and displays the captured image before turning at the intersection.
  • BACKGROUND ART
  • Conventionally, navigation devices that are designed as standard equipment for mobile bodies and portable navigation devices that are designed both for use in any mobile bodies and for use by pedestrians have been commercially available. In addition, mobile-phone terminals having navigation functions have also been commercially available.
  • Such navigation devices provide guidance at an intersection as to in which direction to go by various methods. In a commonly used method, a route from a current position to a destination is previously set and, for example, an enlarged image of an intersection is displayed and a sound notification such as “Turn right at the intersection about 50 meters ahead” is given on reaching a predetermined distance from an intersection to make a turn at.
  • Patent Document 1 discloses another method in which an image of an intersection ahead is captured according to key operation, the captured image of the intersection ahead is stored together with coordinates of the intersection, and the captured image is displayed at a next occasion of approaching the intersection. Patent Document 2 discloses still another method in which an image of an intersection ahead is captured not according to key operation but according to operation of a direction indicator, and the captured image is displayed at a next occasion of approaching the intersection.
  • Patent Document 1: JP-A-H09-014976
  • Patent Document 2: JP-A-2006-221362
  • DISCLOSURE OF THE INVENTION
  • Problems to be Solved by the Invention
  • With the above-mentioned conventional methods, however, when a clear, unobstructed view cannot be obtained in the vicinity of an intersection, it is impossible to check, before turning at the intersection, what an area to be entered after turning at the intersection is like. For example, when a destination cannot be set because a user does not know an address or a telephone number of a shop the user wishes to go to, or when the user forgets whether the shop is on right side or on the left side of a next intersection, the user needs to actually make a right or left turn at the intersection to find it out. The user also needs to actually make a right or left turn at an intersection when the user has once passed through the intersection and thus is sure that the user will recognize the shop if the user actually makes a turn and see the view there, but the user does not remember whether the user should make a right turn or a left turn. Hence, there has been a demand for knowing, before turning at an intersection, what an area to be entered after turning at the intersection is like.
  • The present invention has been made in view of the above-described problems, and an object of the present invention is to provide a navigation device capable of displaying a scene of an area to be entered after a turn before the turn is actually made.
  • Means to Solve the Problem
  • To solve the above described problems, according to a first aspect of the present invention, a navigation device includes: a first storage unit storing a map; a display unit; and a control unit displaying the map on the display unit, and the navigation device is characterized in that the navigation device further comprises a second storage unit storing a captured image of an area to be entered after a turn is made at an intersection, and the control unit displays the captured image stored in the second storage unit on the display unit before a mobile body reaches the intersection.
  • According to a second aspect of the present invention, the navigation device according to the above-described first aspect further includes a first image capturing unit capturing an image of an area ahead of the mobile body in the traveling direction of the mobile body, and the navigation device is characterized in that, when the mobile body has made a turn at the intersection, the control unit captures, by the first image capturing unit, an image of an area ahead of the mobile body in the moving direction of the mobile body after the turn to obtain a captured image of the area, and the control unit stores the captured image captured by the first image capturing unit in the second storage unit.
  • According to a third aspect of the present invention, the navigation device according to the above-described second aspect further includes a direction sensor detecting a moving direction of the mobile body, and the navigation device is characterized in that the control unit detects, according to a detection result of the direction sensor, timing for capturing the image.
  • According to a fourth aspect of the present invention, the navigation device according to any one of the above-described first to third aspects further includes a second image capturing unit capturing an image of an area behind the mobile body in the moving direction of the mobile body, and the navigation device is characterized in that the control unit captures, by the second image capturing unit, an image of an area behind the mobile body in the moving direction of the mobile body before the mobile body reaches the intersection to obtain a captured image of the area, and the control unit stores the captured image captured by the second image capturing unit in the second storage unit.
  • According to a fifth aspect of the present invention, the navigation device according to the above-described fourth aspect further includes a detecting unit detecting ON/OFF states of a direction indicator provided in the mobile body, and the navigation device is characterized in that the control unit detects, according to a detection result of the detecting unit, timing for capturing the image of an area behind the mobile body in the moving direction of the mobile body before the mobile body reaches the intersection.
  • According to a sixth aspect of the present invention, in the navigation device according to the above described fourth or fifth aspect, the navigation device is characterized in that when the mobile body has not made a turn at the intersection but gone straight therethrough, the control unit deletes the captured image of the area behind the mobile body in the moving direction of the mobile body captured by the second image capturing unit and stored in the second storage unit.
  • According to a seventh aspect of the present invention, in the navigation device according to any one of the above-described fourth to sixth aspects, the navigation device is characterized in that the control unit switches between ON and OFF modes to activate or deactivate the image capturing operation performed by the second image capturing unit, and, in the ON mode, an image of an area behind the mobile body in the moving direction of the mobile body is captured.
  • According to an eighth aspect of the present invention, in the navigation device according to any one of the above-described first to seventh aspects, the navigation device is characterized in that when the mobile body has made a turn at the intersection, the control unit continues to play on the display unit the captured image stored in the second storage unit until the mobile body reaches a predetermined point after making the turn at the intersection.
  • According to a ninth aspect of the present invention, the navigation device according to any one of the above-described first to eighth aspects further includes a play key, the navigation device is characterized in that the control unit displays, on the display unit, the captured image stored in the second storage unit in response to the play key being operated.
  • According to a tenth aspect of the present invention, in the navigation device according to the above-described second or third aspect, the navigation device is characterized in that when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, a direction of the turn made at the intersection in association with the captured image, and the control unit displays, on the display unit, the direction of the turn made at the intersection together with the captured image.
  • According to an eleventh aspect of the present invention, in the navigation device according to any one of the above-described fourth to seventh aspects, the navigation device is characterized in that when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, in association with the captured image, a direction that is directly opposite to a moving direction in which the mobile body travels after making the turn at the intersection, and the control unit displays, on the display unit, together with the captured image, the direction that is directly opposite to the moving direction in which the mobile body travels after making the turn at the intersection.
  • According to a twelfth aspect of the present invention, a navigation device for a mobile body includes: a current position detecting unit detecting a current position; an image capturing unit; an image file storage unit storing data of an image captured by the image capturing unit; a map storage unit storing map information; and a display unit displaying a map image. The navigation device further comprises: a moving state detecting unit detecting whether the mobile body is in a moving state or in a stopping state; a character string extraction unit extracting, from the map information, at least one character string representing a name of a place, a facility, a road, or an intersection at or near the current position; and a file name determination unit determining a file name of data of an image captured by the image capturing unit. The navigation device is characterized in that, when the image capturing unit captures an image, a file name of data of the captured image is determined by combining a moving state detected by the moving state detecting unit, a character string extracted by the character string extraction unit, and a predetermined character string that is provided beforehand in the file name determination unit.
  • According to a thirteenth aspect of the present invention, in the navigation device according to the above described twelfth aspect, the navigation device is characterized in that the predetermined character string contains either a character string indicating that the mobile body is stopping or a character string indicating that the mobile body is moving.
  • According to a fourteenth aspect of the present invention, in the navigation device according to the above described thirteenth aspect, the navigation device is characterized in that the predetermined character string contains a predetermined character string indicating that the character string extracted by the character string extraction unit from the map information is a character string related to a predetermined area near a current position.
  • According to a fifteenth aspect of the present invention, an image management method includes steps of: generating a captured image of an area near a mobile body by an image capturing unit; detecting whether the mobile body is in a moving state or in a stopping state; obtaining current position information by extracting, from map information, at least one character string representing a name of a place, a facility, a road, or an intersection at or near the current position; and determining a name of the captured image by combining a predetermined character string that is prepared beforehand, the moving state, and the current position information.
  • Advantages of the Invention
  • According to the navigation device of the first aspect of the present invention, before turning at an intersection, a user can see a captured image of an area ahead of the user in the moving direction after turning at the intersection, and thus the user can avoid making a wrong turn at the intersection. In addition, the user can see a captured image while waiting at a traffic light.
  • According to the navigation device of the second aspect of the present invention, a database of captured images of areas ahead of a mobile body in its moving direction after making turns at intersections can be built by the navigation device alone.
  • According to the navigation device of the third aspect of the present invention, it is possible to detect a turn made at an intersection with higher accuracy than with a GPS, whose signals are received only intermittently.
  • According to the navigation device of the fourth aspect of the present invention, an image of an area behind a mobile body in its moving direction can be captured before the mobile body reaches an intersection on an outward trip, and the captured image can be displayed on a return trip.
  • In capturing an image of an area behind, rather than ahead of, a mobile body in its moving direction, since it cannot be specified beforehand which intersection will be turned at, image capturing needs to be performed every time the mobile body approaches any intersection. However, with the navigation device of the fifth aspect of the present invention, it is possible to prevent unnecessary image capturing.
  • According to the navigation device of the sixth aspect of the present invention, it is possible to prevent unnecessary storage of captured images related to an intersection through which a mobile body has gone straight.
  • In capturing an image of an area behind, rather than ahead of, a mobile body in its moving direction, since it cannot be specified beforehand which intersection will be turned at, image capturing needs to be performed every time the mobile body approaches any intersection. However, with the navigation device of the seventh aspect of the present invention, the user can specify an intersection at which the user wishes to have an image of the area behind captured, and thus it is possible to prevent unnecessary storage of captured images.
  • According to the navigation device of the eighth aspect of the present invention, it is possible to compare a current view of an area with the captured image of the area stored in the navigation device.
  • According to the navigation device of the ninth aspect of the present invention, the user can confirm that the captured image stored in the second storage unit shows a view the user saw on a previous turn at the intersection. In particular, when the user wishes to turn in the same direction as before and just a single captured image is displayed, the user can do so simply by turning in the direction in which the displayed view is actually seen.
  • According to the navigation device of the tenth aspect of the present invention, since the user, before turning at an intersection, can see not only the captured image of the area ahead in the moving direction after the turn but also the direction of the turn shown together with the captured image, the user can make sure in which direction to turn at the intersection.
  • According to the navigation device of the eleventh aspect of the present invention, since the user, before turning at an intersection, can see captured images for both right and left turns at the intersection, the user can make sure in which direction to turn at the intersection.
  • According to the navigation device of the twelfth aspect of the present invention, a file name of data of a captured image is determined by combining a predetermined character string with a character string representing a name of a place, an intersection, or a road at an image capturing point, or a name of a place or a facility near the image capturing point. The predetermined character strings are set beforehand according to whether the mobile body is in a moving state or in a stopping state, and according to whether the extracted name specifies the image capturing point itself or only its vicinity. Determining a file name in this way makes it easy to arrange a large amount of image data or to find desired image data later.
  • According to the navigation device of the thirteenth aspect of the present invention, a file name indicates the image capturing place and the state of the mobile body, and this makes it easy to arrange a large amount of image data or to find desired image data later.
  • According to the image management method of the fifteenth aspect of the present invention, an image data management method for the navigation device of the above-described twelfth aspect can be provided. Here, since the file name is determined by combining a character string representing the name of the image capturing place, or a name related to it, with a predetermined character string selected according to the state of the mobile body, it is easy to arrange a large amount of image data or to find desired image data later.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram to show a relevant part of a navigation device according to Embodiment 1 of the present invention.
  • FIG. 2 is a flow chart to show operation of a control circuit of the navigation device according to Embodiment 1.
  • FIG. 3 is a diagram to show a memory map of a stored image.
  • FIG. 4 is a diagram to show an example of display of a view.
  • FIG. 5 is a block diagram to show a relevant part of a navigation device according to Embodiment 2 of the present invention.
  • FIG. 6 is a flow chart to show operation of a control circuit of the navigation device according to Embodiment 2.
  • FIG. 7 is a block diagram to show a relevant part of a navigation device according to Embodiment 3 of the present invention that is for use in mobile bodies and provided with an image capturing unit.
  • FIG. 8 is a diagram to show examples of combining, according to the moving state of a mobile body, a character string extracted with respect to an image capturing point with a predetermined character string.
  • FIG. 9 is a flow chart to show a procedure of image data management performed in the navigation device according to Embodiment 3 that is for use in mobile bodies and provided with an image capturing unit.
  • List of Reference Symbols
  • 1, 100 navigation device
  • 10, 110 control circuit
  • 11 GPS positioning unit
  • 12 direction sensor
  • 13 speed sensor
  • 14 HDD recording/playback unit
  • 15 front camera
  • 16 rear camera
  • 17 sound synthesis circuit
  • 18, 117 display unit
  • 19 speaker
  • 20 operating unit
  • 201 image capture key
  • 202 play key
  • 111 current position detecting unit
  • 112 map storage unit
  • 113 route search unit
  • 114 image capturing unit
  • 115 character string extraction unit
  • 116 input unit
  • 118 image file storage unit
  • 119 moving state detecting unit
  • 120 file name determination unit
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • Hereinafter, with reference to the drawings, embodiments of the present invention will be described taking navigation devices as examples. However, it should be noted that the following embodiments merely present navigation devices as examples embodying the technical idea of the present invention, and the scope of the present invention is not intended to be limited to the navigation devices dealt with in the embodiments. Other embodiments falling within the claims may equally be practiced without departing from the technical idea shown in the scope of the claims.
  • EMBODIMENT 1
  • FIG. 1 is a block diagram to show a relevant part of a navigation device according to Embodiment 1 of the present invention.
  • A GPS (global positioning system) positioning unit 11 receives radio waves carrying location information from a plurality of GPS satellites, calculates the current position, and feeds data of the current position obtained as a result of the calculation to a control circuit 10, which will be described later. A direction sensor 12 detects a direction based on terrestrial magnetism and feeds the detection result to the control circuit 10. A speed sensor 13 detects the moving speed based on tire rotation and acceleration, and feeds the detection result to the control circuit 10. An HDD recording/playback unit 14 has an HDD (hard disk drive), and performs recording/playback of map information 141 and an image (captured image) 142. The map information 141 includes data of gas stations and landmarks, as well as map data such as of roads and intersections.
  • A front camera 15 captures an image of an area ahead of the mobile body, and feeds the captured image to the control circuit 10. A rear camera 16 captures an image of an area behind the mobile body, and feeds the captured image to the control circuit 10. Here, the front and rear cameras 15 and 16 both capture actual still images of views, but they may also be provided with a moving image capturing function. A sound synthesis circuit 17 generates a sound from a character string specified by the control circuit 10, and outputs the resulting sound to the control circuit 10. A display unit 18 displays a map or an image on which the current position is superposed.
  • A speaker 19, under control of the control circuit 10, outputs the sound generated by the sound synthesis circuit 17. An operating unit 20 is provided with not only an image capture key 201 for turning the image capture mode ON/OFF but also various keys (not shown) via which to operate the navigation device. The control circuit 10 controls the units according to a program stored in a ROM 21. A RAM 22 stores information necessary for the control circuit 10 to perform its operation.
  • Next, a description will be given of current position detection performed by the control circuit 10. When the navigation device is activated in response to turning-on of an ignition switch (not shown) of a mobile body, the control circuit 10 retrieves a map stored in the HDD recording/playback unit 14 and displays it on the display unit 18. The control circuit 10 then receives data of a current position from the GPS positioning unit 11. It takes several seconds for the control circuit 10 to start the first display of the current position after it receives the data of the current position, but thereafter, display of the current position is updated every second.
  • The control circuit 10 superposes the received current position on the map displayed on the display unit 18. The control circuit 10 displays the current position received from the GPS positioning unit 11 on the map retrieved from the HDD recording/playback unit 14 such that the current position is laid on a road in the map at a point closest to the current position. This is called map matching.
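  • By way of an illustrative sketch of the map matching just described, the following Python fragment snaps a raw GPS fix to the closest point on the nearest road segment. The planar coordinates and the road-segment data structure are assumptions made for illustration, not part of the disclosure.

```python
import math

def closest_point_on_segment(p, a, b):
    """Project point p onto the segment a-b; all points are (x, y) tuples."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return a                      # degenerate segment: a single point
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def map_match(gps_fix, road_segments):
    """Return the point on the road network closest to the raw GPS fix.

    road_segments: iterable of (a, b) endpoint pairs.
    """
    def dist(q):
        return math.hypot(q[0] - gps_fix[0], q[1] - gps_fix[1])
    candidates = (closest_point_on_segment(gps_fix, a, b) for a, b in road_segments)
    return min(candidates, key=dist)
```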
  • Current positions are received intermittently, once every second, and the direction sensor 12 and the speed sensor 13 are used to interpolate the current position while no fix is being received from the GPS positioning unit 11. In particular, a change in position made by the mobile body in making a turn (e.g., a right or left turn at an intersection) is displayed according to the output of the direction sensor 12.
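  • The interpolation between GPS fixes amounts to dead reckoning from the sensed heading and speed. A minimal sketch, assuming planar coordinates in meters and a compass heading in degrees (both assumptions for illustration):

```python
import math

def dead_reckon(last_position, heading_deg, speed_mps, dt_s):
    """Advance the last known position by speed * dt along the sensed heading.

    heading_deg: compass heading from the direction sensor (0 = north).
    speed_mps:   speed from the speed sensor, in meters per second.
    """
    x, y = last_position
    theta = math.radians(heading_deg)
    # Compass convention: north is +y, east is +x.
    return (x + speed_mps * dt_s * math.sin(theta),
            y + speed_mps * dt_s * math.cos(theta))
```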
  • Next, route guidance will be described. For route guidance to be performed, a destination needs to be set first. In the most typical method of setting a destination, a map is enlarged, reduced, and scrolled via the operating unit 20, a cursor is set on a point indicating the destination, and then the point is registered as the destination. In another method, when the name (e.g., Mt. Fuji) or the type of a destination (e.g., a restaurant) is known, the destination can be set through a search by keyword or destination type. When the address or the phone number of a destination is known, the destination can be set from such information.
  • When the destination is set, the control circuit 10 finds a route from the current position to the destination that is optimal based on predetermined conditions. The conditions, which are set by the user, include various requirements such as the shortest route, the shortest time, and no toll road.
  • When the route is set, the control circuit 10 starts route guidance. For example, the control circuit 10 displays the route in a color distinct from the other roads, and also displays the distance to the destination from the current position and an estimated arrival time. When the mobile body comes to a position a predetermined distance from an intersection at which a turn is to be made, the control circuit 10 outputs, through the speaker 19, a sound notification such as “Turn right at the intersection about 50 meters ahead” generated by the sound synthesis circuit 17.
  • FIG. 2 is a flow chart to show image capturing and display processes performed by the control circuit 10 of Embodiment 1. The control circuit 10 stores a variable A in a register disposed inside the control circuit 10. When A is 0, the image capture mode is OFF, and when A is 1, the image capture mode is ON. The initial value of the variable A is 0 (that is, the image capture mode is OFF) (step S1). Every time the image capture key 201 is pressed (Yes in step S2), the control circuit 10 changes the value of the variable A accordingly (steps S3 to S5), and thereby turns the image capture mode ON/OFF.
  • While performing the route guidance (Yes in step S6), when the mobile body comes to a position that is 10 meters from the intersection to make a turn at (Yes in step S7), the control circuit 10 captures a still image (actual image) of a view behind the mobile body with the rear camera 16, and stores the resulting image in the HDD recording/playback unit 14 using an image management method that will be described later (step S8). The captured image of the view behind the mobile body is displayed when the mobile body travels in a direction opposite to the current moving direction.
  • In the image of the view behind the mobile body, the entry and exit directions, which are directions used in image display, are opposite to those that apply when the mobile body travels in the direction opposite to the current moving direction. For example, an image of the view behind the mobile body captured while making a right turn from south to east at an intersection is the same as an image of the view ahead of the mobile body captured while making a left turn from east to south at the same intersection. In storing images, since images showing one and the same view share the same intersection position information and the same entry and exit directions, an image with an older image capturing date and time is updated with an image with a newer image capturing date and time.
  • In performing route guidance (Yes in step S6), when the mobile body has finished making a turn at an intersection following the route (Yes in step S9), the control circuit 10 captures a still image of a view ahead of the mobile body with the front camera 15, and stores the captured image in association with the position information of the intersection in the HDD recording/playback unit 14 (step S10). Here, the navigation device 1 judges, from the detection result of the direction sensor 12, that the mobile body has made a turn at the intersection and has reached an image capturing point. The captured image of the view ahead of the mobile body is displayed the next time the mobile body comes to the intersection. As shown in FIG. 3, in storing an image, the image information of a view, the image capturing date and time, and the entry and exit directions are stored in association with the position information of the intersection.
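  • The record layout of FIG. 3 suggests image data keyed by intersection position together with the entry and exit directions and the capture time, with a newer image replacing an older one for the same key. A possible shape, with all class and field names being illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StoredView:
    intersection_pos: tuple   # (latitude, longitude) of the intersection
    entry_dir: str            # direction of approach, e.g. "S"
    exit_dir: str             # direction after the turn, e.g. "E"
    captured_at: datetime
    image: bytes              # still-image data

class ViewStore:
    """Keyed so that a newer image of the same view replaces the older one."""

    def __init__(self):
        self._records = {}

    def store(self, view: StoredView):
        key = (view.intersection_pos, view.entry_dir, view.exit_dir)
        old = self._records.get(key)
        if old is None or view.captured_at > old.captured_at:
            self._records[key] = view   # update older capture with newer one

    def lookup(self, intersection_pos):
        """All stored views for one intersection, any entry/exit pair."""
        return [v for (pos, _, _), v in self._records.items()
                if pos == intersection_pos]
```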
  • Completion of a turn is judged based on a current position obtained by the direction sensor 12 and the speed sensor 13 interpolating the current position detection of the GPS positioning unit 11; instead, however, completion of a turn may be judged based on a change in moving direction detected by the direction sensor 12. This makes it possible to detect a turn made at an intersection with higher accuracy than with a GPS, whose signals are received only intermittently.
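  • Judged from the direction sensor alone, completion of a turn can be detected by watching for a heading change beyond a threshold. The description fixes no figure, so the threshold below is an assumption for illustration:

```python
def heading_change(entry_heading_deg, current_heading_deg):
    """Smallest signed angle, in degrees, from the entry heading to the current one."""
    return (current_heading_deg - entry_heading_deg + 180) % 360 - 180

def turn_completed(entry_heading_deg, current_heading_deg, threshold_deg=70):
    """True once the heading has swung far enough to count as a completed turn.

    threshold_deg is an assumed value; the patent does not specify one.
    """
    return abs(heading_change(entry_heading_deg, current_heading_deg)) >= threshold_deg
```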
  • Also, when no route guidance is being performed (No in step S6), if the variable A is 1 (Yes in step S11), then, when the mobile body comes to a position that is 10 meters from a next intersection while moving (Yes in step S12), the control circuit 10 captures an image of the area behind the mobile body and temporarily stores the captured image in association with position information of the intersection in the RAM 22 (step S13). If the mobile body does not make a turn at the intersection where the image of the area behind it has been captured but goes straight through the intersection (Yes in step S14), the control circuit 10 deletes the temporarily stored image of the area behind the mobile body without storing it in the HDD recording/playback unit 14 (step S15). This helps prevent unnecessary storage of records.
  • When the mobile body completes a turn at the intersection where the image of the area behind it was captured in step S13 (Yes in step S16), the control circuit 10 stores the image of the scene behind the mobile body in the HDD recording/playback unit 14 by a management method that will be described later (step S17). The control circuit 10 also captures a still image of a view ahead of the mobile body with the front camera 15, and stores the captured image by the same management method (step S18). Then, the control circuit 10 sets the variable A back to 0 (step S19) to turn the image capture mode OFF, so as to allow the user to set anew an intersection where the user wishes to have an image captured.
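  • The flow of steps S11 to S19 can be summarized as a small state machine: capture behind the mobile body just before an intersection while the mode is ON, discard the image on a straight-through pass, and commit it (plus a forward capture) on a completed turn. A sketch with illustrative class and method names; the camera and storage objects are stand-ins:

```python
class CaptureMode:
    """Sketch of steps S11-S19: temporary rear capture, then commit or discard."""

    def __init__(self, rear_camera, front_camera, hdd_store):
        self.rear_camera = rear_camera
        self.front_camera = front_camera
        self.hdd_store = hdd_store      # persistent storage (the HDD)
        self.pending = None             # temporary storage (the RAM)
        self.mode_on = False            # variable A: True = image capture mode ON

    def on_approach(self, intersection_pos):
        # Steps S12/S13: 10 m before an intersection, capture behind, hold in RAM.
        if self.mode_on:
            self.pending = (intersection_pos, self.rear_camera.capture())

    def on_straight_through(self):
        # Steps S14/S15: went straight, so the rear image is discarded unsaved.
        self.pending = None

    def on_turn_completed(self):
        # Steps S16-S19: turn made, so commit the rear image, capture ahead, reset.
        if self.pending is not None:
            pos, rear_image = self.pending
            self.hdd_store(pos, rear_image)                  # step S17
            self.hdd_store(pos, self.front_camera.capture()) # step S18
            self.pending = None
        self.mode_on = False                                 # step S19
```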
  • The control circuit 10 continues to display an image of a view stored in the HDD recording/playback unit 14 from the time the mobile body reaches a position that is 30 meters from the intersection where the image of the view was captured until the mobile body reaches the intersection (steps S20 to S23).
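  • The display window of steps S20 to S23 reduces to a distance check against an intersection that has a stored view; a sketch assuming planar coordinates in meters, with the caller ending display once the intersection is reached:

```python
import math

def should_display_view(current_pos, intersection_pos, start_m=30.0):
    """True while the mobile body is within start_m meters of an intersection
    for which a view is stored; display ends on reaching the intersection."""
    dx = current_pos[0] - intersection_pos[0]
    dy = current_pos[1] - intersection_pos[1]
    return math.hypot(dx, dy) <= start_m
```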
  • FIG. 4 shows an example of how an image is displayed. When the mobile body reaches a position that is 30 meters from an intersection where an image is stored in the HDD recording/playback unit 14, the image starts to be displayed.
  • The display includes two images, left and right: an image for a left turn is displayed at the top left with the characters “left turn” superposed thereon, and an image for a right turn is displayed at the top right with the characters “right turn” superposed thereon. When a left image alone is displayed, it means that the user has previously traveled only in the left-turn direction. The same effect can be obtained without displaying the characters indicating right or left, that is, just by indicating the turning direction through layout, for example by displaying an image for a right turn on the right side and an image for a left turn on the left side. Alternatively, right and left may be announced by sound.
  • Incidentally, even when the right and left directions are not indicated at all, if the user can see, before making a turn, part of the view of the area the user will enter after the turn, the user can judge in which direction to turn to actually see the displayed view.
  • Thus, according to the present invention, before a mobile body makes a turn at an intersection, an image of a view behind the mobile body is captured and stored (steps S8 and S21, steps S17 and S21), and after the mobile body makes a turn at the intersection, an image of a view ahead of the mobile body is captured and stored (steps S10 and S21, steps S18 and S21). The stored views (captured images of views in the vicinity of the intersection) are displayed before the mobile body makes a turn at the intersection the next time.
  • As a result, the user can see a view of the area to be entered after turning at an intersection before actually making the turn, and thus can avoid making a wrong turn. Also, when the user has to wait for a traffic light before turning at the intersection, the user can check the displayed view of the area to be entered while waiting for the light to change. Incidentally, in the present invention, an image provided by a third party on a DVD-RAM or the like may be used as a captured image of a view near an intersection. However, the provision of the front and rear cameras 15 and 16 allows the user of the mobile body to build a database of views as in Embodiment 1, and this makes it possible to store views (captured images) that are more suitable for the user.
  • Also, according to the present invention, a view behind a mobile body is stored before the mobile body makes a turn at an intersection (steps S8 and S21, steps S17 and S21), and the stored view can be displayed, before the mobile body actually turns at the intersection, as a view of the area to be entered after the turn. Thus, by capturing an image of the view behind the mobile body in addition to an image of the view ahead of it, the image of the view behind can be used on a return trip.
  • In capturing an image of an area behind, rather than ahead of, a mobile body, since the image capturing needs to be performed before the mobile body starts making a turn at an intersection and it is impossible to specify beforehand whether or not the mobile body is going to turn at the intersection, image capturing would need to be performed at every intersection. However, according to the present invention, image capturing is switched between ON and OFF modes via the image capture key 201, and the capturing of an image of the area behind the mobile body is performed only in the ON mode (Yes in step S11). This allows the user to specify the intersection at which an image of the view behind the mobile body should be captured, and thus unnecessary capturing can be prevented.
  • In addition, according to the present invention, as shown by Yes in step S6, the above-described capturing of an image of a view behind the mobile body is performed with respect to an intersection specified in the route guidance as an intersection to turn at. This makes it possible to show the user a view of an area into which the user will enter after making a turn at an intersection as additional information in the next route guidance.
  • EMBODIMENT 2
  • FIG. 5 is a block diagram to show Embodiment 2. Components similar to those in FIG. 1 showing Embodiment 1 are identified by the same reference numbers, and descriptions thereof will be omitted. As compared with the structure of Embodiment 1 shown in FIG. 1, the structure of Embodiment 2 shown in FIG. 5 further includes a direction indicator detecting unit 30 and a play key 202 that is provided in the operating unit 20. The direction indicator detecting unit 30 detects right/left turn operation of a direction indicator and feeds its detection result to the control circuit 10. The play key 202 is a key for displaying, whenever the user wishes, a view of an area to be entered after making a turn at a next intersection.
  • FIG. 6 is a flow chart to show image capturing/display processing performed by the control circuit 10 of Embodiment 2. The processing in steps S1 to S4 in FIG. 6 is the same as in steps S1 to S4 in FIG. 2 to show Embodiment 1, and the descriptions thereof will be omitted.
  • When route guidance is performed (Yes in step S31) or when the variable A is 1 (the image capture mode is ON) (Yes in step S32), if the direction indicator is turned ON from OFF (Yes in step S33), the control circuit 10 captures a still image (captured image) of a view behind the mobile body with the rear camera 16 and stores the captured image in the HDD recording/playback unit 14 by a management method that will be described later (step S34). Here, whether a turn is to be made to the right or left may also be judged according to the direction indicator, instead of according to the GPS positioning unit 11 or the direction sensor 12.
  • When route guidance is performed (Yes in step S31) or when the variable A is 1 (that is, the image capture mode is ON) (Yes in step S32), if the direction indicator is turned OFF from ON (Yes in step S35), the control circuit 10 captures a still image of a view ahead of the mobile body by the front camera 15 and stores the captured image in the HDD recording/playback unit 14 by a management method that will be described later (step S36). Here, whether the turn has been made to the right or left can also be judged according to the direction indicator, instead of according to the GPS positioning unit 11 or the direction sensor 12. Then, when the variable A is 1 (Yes in step S37), the variable A is set back to 0 (step S38) to turn the image capture mode OFF, so as to allow the user to set anew an intersection where the user wishes to have an image captured.
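  • The capture timing of Embodiment 2 thus reduces to edge detection on the direction indicator signal: an OFF-to-ON edge triggers the rear capture (step S34) and an ON-to-OFF edge the front capture (step S36). An illustrative sketch, with the camera and storage objects as stand-ins:

```python
class IndicatorTrigger:
    """Capture timing from direction-indicator edges (Embodiment 2 sketch)."""

    def __init__(self, rear_camera, front_camera, store):
        self.rear_camera = rear_camera
        self.front_camera = front_camera
        self.store = store
        self.prev_on = False

    def update(self, indicator_on: bool):
        if indicator_on and not self.prev_on:
            # OFF -> ON: about to turn, so capture the view behind (step S34).
            self.store('rear', self.rear_camera.capture())
        elif self.prev_on and not indicator_on:
            # ON -> OFF: turn finished, so capture the view ahead (step S36).
            self.store('front', self.front_camera.capture())
        self.prev_on = indicator_on
```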
  • From when the mobile body reaches a position that is 30 meters from an intersection for which an image of a view is stored in the HDD recording/playback unit 14 until the mobile body reaches the intersection, the control circuit 10 continues to play the image (steps S39 to S42). At this time, whether the image relates to a right turn or a left turn is also displayed. If the play key 202 is operated (Yes in step S43), the control circuit 10 starts display operation in response to the play key 202 being operated and continues the display operation until the mobile body has moved 20 meters after turning at the intersection (steps S39 to S42). At this time as well, whether the image relates to a right turn or a left turn is displayed.
  • As just described, according to the present invention, the direction indicator detecting unit 30 is provided to detect whether the direction indicator is ON or OFF, and the timing for capturing an image of the area behind the mobile body before the mobile body makes a turn at an intersection is detected based on the detection result of the direction indicator detecting unit 30 (step S33).
  • With the navigation device of Embodiment 1, in capturing an image of a view behind, rather than ahead of, a mobile body, since the image capturing needs to be performed before the mobile body starts making a turn and it is impossible to specify beforehand whether or not the mobile body is going to turn at the intersection, capturing would need to be performed at every intersection. With the navigation device of Embodiment 2, however, unnecessary image capturing can be prevented by detecting the operation of the direction indicator.
  • EMBODIMENT 3
  • An image data management method according to the present invention will be described below. FIG. 7 is a block diagram to show the structure of a navigation device 1 of Embodiment 3 of the present invention that is for use in a mobile body and provided with an image capturing unit. The navigation device 1 is provided with a control unit 110, a current position detecting unit 111, a map storage unit 112, a route search unit 113, an image capturing unit 114 that is composed of a CCD camera or the like, a character string extraction unit 115, an input unit 116, a display unit 117, an image file storage unit 118, a moving state detecting unit 119, and a file name determination unit 120.
  • The control unit 110 is composed of a processor comprising a CPU, a RAM, and a ROM, and controls operation of each unit of the navigation device 1 according to a control program stored in the ROM. The current position detecting unit 111 is composed of, for example, a GPS receiver, and receives radio waves containing time information from a plurality of GPS satellites revolving around the earth, and calculates current position information based on the received radio waves.
  • Furthermore, the current position detecting unit 111 may use a distance sensor and a direction sensor. In that case, the distance and direction that the mobile body has traveled are each detected, and the detected values are accumulated with respect to a standard position, and thereby the current position is calculated. This current position detection method, adopted together with GPS reception, exerts its advantage in current position detection in a tunnel where GPS radio waves cannot be received and in an area of high-rise buildings where errors are apt to happen.
  • The map storage unit 112 stores road data containing road node data and road link data. Here, a node is a connection point between roads, such as an intersection or a branch point, and a link is a route between nodes. The road node data contains, for example, data of reference numerals given to road nodes, position coordinates of road nodes, the number of links connected to each road node, names of intersections, and the like. The road link data contains, for example, data of the reference numerals of the road nodes that are the start point and the end point of each link, road types, lengths of links (link costs), time distances, the number of lanes of each link, road widths, and the like. Further imparted to the road link data is data of link attributes such as a bridge, a tunnel, a crossing, and a tollgate. The road type data is information indicating whether a link is a freeway or a toll road, whether it is a national road or a prefectural road, and the like.
  • The map storage unit 112 further stores background data containing, for example, water system data such as data of coast lines, lakes, and river shapes, administrative border data, and facility data containing a position, a shape, and a name of a facility.
  • The map storage unit 112 may store, in addition to the road data and the background data, map image data stored in vector form for the purpose of achieving easy-to-read map display. When the navigation device 1 is used, a map of a predetermined range including the current position of the navigation device 1 is extracted from the map storage unit 112, and the above-mentioned road data, background data, and map image data are displayed on the display unit 117 with a current position mark indicating the current position and an image of a guidance route superposed thereon.
  • When the user specifies a starting point and a destination, the route search unit 113 refers to the road data stored in the map storage unit 112 and searches for an optimal route from the starting point to the destination. This search for the optimal route is performed as follows. First, links and nodes between a road node corresponding to the current position or the starting point specified by the user and a road node corresponding to the destination specified by the user are searched for by various methods such as the Dijkstra method. Then, lengths of the links (link costs), time distances, and the like are accumulated to obtain a total link length and a total time distance. A route of the shortest total link length, the shortest total time distance, or the like is then selected as a guidance route, and road nodes and links along the route are provided as guidance route data.
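  • As a minimal illustration of the link-cost search named above, a textbook Dijkstra search over a node/link graph; the link costs stand in for the accumulated link lengths or time distances, and node identifiers are assumed to be comparable (e.g., strings):

```python
import heapq

def dijkstra(links, start, goal):
    """links: {node: [(neighbor, link_cost), ...]}; returns (total_cost, path)."""
    frontier = [(0.0, start, [start])]   # (accumulated cost, node, path so far)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, link_cost in links.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + link_cost, neighbor, path + [neighbor]))
    return float("inf"), []              # no route between start and goal
```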
  • The image capturing unit 114 is composed of a CCD camera or the like, and captures desired images such as images of nearby areas in response to an image capture button being operated. The image capture button is included in the input unit 116. The file name determination unit 120 determines a file name of data of a captured image, and the data together with the file name is stored in the image file storage unit 118. The user can retrieve a desired image file from the image file storage unit 118 to transfer the image file to a portable storage medium, which the user can later connect to a personal computer or a printing device to process or output the data.
  • The file name determination unit 120 determines a file name of data of an image captured by the image capturing unit 114, and the image data with the file name is stored in the image file storage unit 118. The file name determination unit 120 uses the capturing date and time of an image captured by the image capturing unit 114 as a default value in determining the file name of the data of the captured image. In addition to the image capturing date and time, a serial number indicating the order in which images are captured may be used in the file name. The serial number is reset at the start of each new image capturing date and incremented with each image captured.
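  • The default naming rule, capture date and time plus a serial number reset each day, might be sketched as follows; the exact file name pattern and extension are assumptions, since the text fixes only the ingredients:

```python
from datetime import datetime

class DefaultNamer:
    """Default file names: capture timestamp plus a serial reset each day."""

    def __init__(self):
        self.day = None
        self.serial = 0

    def name(self, captured_at: datetime) -> str:
        if captured_at.date() != self.day:
            self.day = captured_at.date()   # new capture date: reset the serial
            self.serial = 0
        self.serial += 1
        return f"{captured_at:%Y%m%d_%H%M%S}_{self.serial:03d}.jpg"
```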
  • The moving state detecting unit 119 detects a moving state of the mobile body, that is, whether the mobile body is in a moving state or in a stopping state based on outputs from sensors provided in the mobile body such as an acceleration sensor, a steering angle sensor, an ignition sensor, and the like.
  • The character string extraction unit 115 extracts a character string from map data based on a current position detected by the current position detecting unit 111 and the map data. The character string extracted by the character string extraction unit 115 specifies the position at which the mobile body is located, or an area around that position, for example an area within a 100-meter radius of it, and represents a name of a place, a road, an intersection, a facility, or the like near the current position of the mobile body.
  • In the navigation device 1 of the present invention, data of an image captured by the image capturing unit 114 can be given a file name reflecting the capturing place of the image and the moving state of the mobile body at the time of capture, based on the moving state detected by the moving state detecting unit 119 and a character string representing a name of a place, a road, or the like extracted by the character string extraction unit 115. To achieve this, when an image is captured by the image capturing unit 114, the file name determination unit 120 is fed with data of the state of the mobile body detected by the moving state detecting unit 119 and data of a character string representing a name of a place, a bridge, or the like extracted by the character string extraction unit 115.
  • The file name determination unit 120 determines a file name of image data in the following manner based on the state of the mobile body detected by the moving state detecting unit 119 and data of a character string representing a name of a place, a road, or the like extracted by the character string extraction unit 115, and stores the image data in the image file storage unit 118.
  • In determining the file name, the file name determination unit 120 selects, from predetermined character strings, according to whether the mobile body is in a moving state or in a stopping state, a character string to be interposed in the file name. For example, when the mobile body is not moving, a predetermined character string such as “at” or “near” is used. The character string “at” is used when the mobile body is stopping and the character string extraction unit 115 extracts a facility name that specifies the position. The character string “near” is used when the mobile body is stopping and the character string extraction unit 115 extracts a name of a place or a facility near the position.
  • When the mobile body is moving, a character string such as “moving along” or “moving near” is used. The character string “moving along” is used when the mobile body is moving and the character string extraction unit 115 extracts a name of a road along which the mobile body is moving, and the character string “moving near” is used when the mobile body is moving and the character string extraction unit 115 extracts a name of a place, a facility, or the like near the position.
  • A file name of data of an image captured by the image capturing unit 114, determined as described above by combining a character string representing a name associated with the capturing point of the image with a predetermined character string selected according to the state of the mobile body, makes it easy to later arrange a large amount of image data, or to find a desired piece of image data among a large amount of image data, guided by memories of a trip.
  • FIG. 8 is a diagram to show the relation among the state of the mobile body, the character string extraction result, and the predetermined character strings in file name determination. As mentioned above, a predetermined character string is selected and added to the file name as shown in FIG. 8 according to the state of the mobile body, that is, whether it is in a moving state or in a stopping state, and according to the character string extracted by the character string extraction unit 115, which represents a name of a place, a facility, a road, or an intersection that specifies the image capturing point, or a name of a place or a facility near the image capturing point. The file name determination unit 120 determines a combination of a character string and a predetermined character string as shown in FIG. 8, and gives the determined combination to the data of the captured image as its file name.
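  • The combinations of FIG. 8 reduce to a two-way choice: whether the mobile body is moving or stopping, and whether the extracted name pins down the capture point itself or only its vicinity. In the sketch below, the four connector strings are those given in the description, while the ordering of connector and name within the file name is an assumption:

```python
def connector(moving: bool, name_is_nearby: bool) -> str:
    """Select the predetermined character string to interpose in the file name."""
    if moving:
        return "moving near" if name_is_nearby else "moving along"
    return "near" if name_is_nearby else "at"

def file_name(extracted_name: str, moving: bool, name_is_nearby: bool) -> str:
    # e.g. file_name("Chuo Bridge", moving=True, name_is_nearby=False)
    #      -> "moving along Chuo Bridge"
    return f"{connector(moving, name_is_nearby)} {extracted_name}"
```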
  • When the character string extraction unit 115 has failed to extract from the map data a character string representing a name of a place, a facility, an intersection, a road, or the like, the file name determination unit 120 adds information of the current position (latitude, longitude, and the like) to the image capturing date and time, which is the default value, and gives the result to the captured image data as its file name. Incidentally, even when the character string extraction unit 115 has succeeded in extracting a character string, the image capturing date and time, which is the default value, and the serial number of the image may be added to the file name in addition to the above-mentioned character string and the predetermined character string.
  • FIG. 9 is a flow chart to show the procedure of image data management performed in the navigation device 1 according to the above-described embodiments. The control unit 110 detects activation of the camera that constitutes the image capturing unit 114 (step S101), and in step S102, detects whether image capturing has been performed by monitoring actuation of the shutter, the flash, or the image capture button (of the input unit 116). When image capturing is detected (Yes in step S102), the captured image data is stored in a temporary storage unit (not shown) such as a RAM (step S103).
  • Then, in step S104, it is judged, by the moving state detecting unit 119, whether the mobile body is in a moving state or in a stopping state. When the mobile body is in a stopping state, the process proceeds to step S105 in which information of a current position detected by the current position detecting unit 111 is obtained. When the mobile body is not in a stopping state, that is, the mobile body is in a moving state, the process proceeds to step S109 in which information of the current position detected by the current position detecting unit 111 is obtained.
  • When information of the current position is obtained in step S105, the character string extraction unit 115 refers to map data including the current position to extract a character string representing a name of a place, a facility, an intersection, a road, or the like at the current position of the mobile body, that is, at the position of the image capturing performed by the image capturing unit 114, and sends the extracted character string to the file name determination unit 120. Here, if the character string extraction unit 115 fails to extract a character string representing a name of a place, a facility, an intersection, a road, or the like at the current position of the mobile body, the character string extraction unit 115 extracts a character string representing a name of a place, a facility, an intersection, a road, or the like found in an area within a predetermined distance from the current position, and sends the extracted character string to the file name determination unit 120.
  • Then, in the processing performed in step S106, it is judged whether or not the character string extraction unit 115 has performed the above-described extraction of a character string. When no character string has been extracted, the file name determination unit 120 is informed to that effect, and the file name determination unit 120 then determines the file name of the image data temporarily stored in step S103 from, as the default, information of the image capturing date and time (to which a serial number may be added) and information of the current position, and the image data is stored in the image file storage unit 118 (step S107).
  • When it is judged, as a result of the judgment performed in step S106, that a character string has been extracted, the file name determination unit 120 determines the file name of the image data temporarily stored in step S103 by combining together the character string extracted by the character string extraction unit 115 and one of the predetermined character strings “at” and “near” that is selected according to the extracted character string, and the image data is stored in the image file storage unit 118 (step S108).
  • When a current position is obtained while the mobile body is in a moving state (step S109), the character string extraction unit 115 refers to map data including the current position to extract a character string representing a name of a road or the like that specifies the current position of the mobile body, that is, the position at which the image capturing is performed by the image capturing unit 114, and sends the extracted character string to the file name determination unit 120. When the character string extraction unit 115 fails to extract a character string representing a name of a road, or a name of a place, a facility, an intersection, or the like at the current position of the mobile body, it instead extracts a character string representing a name of a place, a facility, an intersection, a road, or the like in an area within a predetermined distance from the current position, and sends the extracted character string to the file name determination unit 120.
  • Then, in processing performed in step S110, it is judged whether or not the character string extracted by the character string extraction unit 115 represents a name of a place, a facility, or the like near the current position. When it is judged that the extracted character string represents a name of a place, a facility, or the like near the current position, the process proceeds to step S111, in which the file name determination unit 120 determines a file name of the image data by adding the predetermined character string “moving near” shown in FIG. 8 to the character string representing a name of a place or the like extracted by the character string extraction unit 115, and the image data is stored in the image file storage unit 118.
  • When, in the judgment processing performed in step S110, no character string representing a name of a place, a facility, or the like near the current position has been extracted, the process proceeds to step S112, in which judgment processing is performed. In the judgment processing in step S112, it is judged whether or not a character string representing a name of a road has been extracted. When a character string representing a name of a road is extracted, the process proceeds to step S113, in which the file name determination unit 120 determines a file name of the image data by adding the predetermined character string “moving along” shown in FIG. 8 to the character string representing the name of the road, and the image data is stored in the image file storage unit 118.
  • When, in the judgment processing performed in step S112, no character string representing a name of a road is extracted, the process proceeds to step S107, in which the file name determination unit 120 determines the file name of the image data temporarily stored in step S103 from, as the default, the image capturing date and time (to which a serial number may be added) and information of the current position, the image data is stored in the image file storage unit 118, and the process is completed.
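  • Taken together, the branches of FIG. 9 form a small decision tree. The sketch below follows steps S104 to S113, with the extraction and default-naming functions passed in as stand-ins for the character string extraction unit 115 and the default rule of step S107:

```python
def determine_file_name(captured_at, current_pos, moving,
                        extract_at, extract_near, extract_road,
                        default_name):
    """Decision tree of steps S104-S113.

    extract_at(pos)      -> name specifying the position itself, or None
    extract_near(pos)    -> name within a predetermined distance, or None
    extract_road(pos)    -> name of the road being traveled along, or None
    default_name(t, pos) -> fallback from date/time and position (step S107)
    """
    if not moving:                         # stopping state: steps S105-S108
        name = extract_at(current_pos)
        if name:
            return f"at {name}"
        name = extract_near(current_pos)
        if name:
            return f"near {name}"
    else:                                  # moving state: steps S109-S113
        name = extract_near(current_pos)
        if name:
            return f"moving near {name}"   # step S111
        name = extract_road(current_pos)
        if name:
            return f"moving along {name}"  # step S113
    # No character string extracted: fall back to the default of step S107.
    return default_name(captured_at, current_pos)
```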
  • As hitherto described in detail, with the navigation device 1 according to the embodiments of the present invention, data of an image captured by the image capturing unit 114 is stored under a file name that includes a predetermined character string selected according to the state of the mobile body and the character string that the character string extraction unit 115 has extracted from map data based on the current position. As a result, it is easy to later arrange a large amount of image data in order, or to find a desired piece of image data among a large amount of image data, with memories of a trip as a guide.
  • According to the present invention, display of a stored view is continued until a mobile body reaches a predetermined point after it makes a turn at an intersection. This makes it possible to compare a current view with the stored view.
  • Furthermore, the play key 202 is provided in the present invention, and when the play key 202 is operated, a stored image of a view with respect to an intersection is displayed. This makes it possible to display, before the user actually makes a turn at an intersection, an image of a view that is to be seen after turning at the intersection, at any timing that the user desires.
  • Incidentally, although the captured images in the above embodiments are still images, the present invention is applicable to moving images as well. In that case, an image captured as an image of the area behind the mobile body should be played back in reverse.
  • The present invention is also applicable to commercially available mobile-phone terminals equipped with a camera and navigation functions.
  • INDUSTRIAL APPLICABILITY
  • The present invention is applicable to camera-equipped navigation devices for use in mobile bodies, camera-provided portable navigation devices, and mobile phone terminals having navigation and camera functions.

Claims (15)

1. A navigation device comprising:
a first storage unit storing a map;
a display unit; and
a control unit displaying the map on the display unit, wherein
the navigation device further comprises a second storage unit storing a captured image of an area to be entered after a turn is made at an intersection, and
the control unit displays the captured image stored in the second storage unit on the display unit before a mobile body reaches the intersection.
2. The navigation device of claim 1 further comprising a first image capturing unit capturing an image of an area ahead of the mobile body in the traveling direction of the mobile body, wherein
when the mobile body has made a turn at the intersection, the control unit captures, by the first image capturing unit, an image of an area ahead of the mobile body in the moving direction of the mobile body after the turn to obtain a captured image of the area, and
the control unit stores the captured image captured by the first image capturing unit in the second storage unit.
3. The navigation device of claim 2 further comprising a direction sensor detecting a moving direction of the mobile body, wherein
the control unit detects, according to a detection result of the direction sensor, a timing for capturing the image.
4. The navigation device of claim 1 further comprising a second image capturing unit capturing an image of an area behind the mobile body in the moving direction of the mobile body, wherein
the control unit captures, by the second image capturing unit, an image of an area behind the mobile body in the moving direction of the mobile body before the mobile body reaches the intersection to obtain a captured image of the area, and
the control unit stores the captured image captured by the second image capturing unit in the second storage unit.
5. The navigation device of claim 4 further comprising a detecting unit detecting ON/OFF states of a direction indicator provided in the mobile body, wherein
the control unit detects, according to a detection result of the detecting unit, a timing for capturing the image of an area behind the mobile body in the moving direction of the mobile body before the mobile body reaches the intersection.
6. The navigation device of claim 4 wherein,
when the mobile body has not made a turn at the intersection but has gone straight therethrough, the control unit deletes the captured image of the area behind the mobile body in the moving direction of the mobile body captured by the second image capturing unit and stored in the second storage unit.
7. The navigation device of claim 4 wherein the control unit switches between ON and OFF modes to activate or deactivate the image capturing operation performed by the second image capturing unit, and, in the ON mode, an image of an area behind the mobile body in the moving direction of the mobile body is captured.
8. The navigation device of claim 1 wherein,
when the mobile body has made a turn at the intersection, the control unit continues to display, on the display unit, the captured image stored in the second storage unit until the mobile body reaches a predetermined point after making the turn at the intersection.
9. The navigation device of claim 1 further comprising a play key, wherein
the control unit displays, on the display unit, the captured image stored in the second storage unit in response to the play key being operated.
10. The navigation device of claim 2 wherein,
when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, a direction of the turn made at the intersection in association with the captured image, and
the control unit displays, on the display unit, the direction of the turn made at the intersection together with the captured image.
11. The navigation device of claim 4 wherein, when the mobile body has made a turn at the intersection, the control unit stores, in the second storage unit, in association with the captured image, a direction that is directly opposite to a moving direction in which the mobile body travels after making the turn at the intersection, and the control unit displays, on the display unit, together with the captured image, the direction that is directly opposite to the moving direction in which the mobile body travels after making the turn at the intersection.
12. A navigation device for a mobile body, comprising:
a current position detecting unit detecting a current position;
an image capturing unit;
an image file storage unit storing data of an image captured by the image capturing unit;
a map storage unit storing map information; and
a display unit displaying a map image, wherein
the navigation device further comprises:
a moving state detecting unit detecting whether the mobile body is in a moving state or in a stopping state;
a character string extraction unit extracting, from the map information, at least one character string representing a name of a place, a facility, a road, or an intersection at or near a current position; and
a file name determination unit determining a file name of data of an image captured by the image capturing unit, wherein
when the image capturing unit captures an image, a file name of data of the captured image is determined by combining a moving state detected by the moving state detecting unit, a character string extracted by the character string extraction unit, and a predetermined character string that is provided beforehand in the file name determination unit.
13. The navigation device of claim 12 wherein the predetermined character string contains either a character string indicating that the mobile body is stopping or a character string indicating that the mobile body is moving.
14. The navigation device of claim 13 wherein the predetermined character string contains a predetermined character string indicating that the character string extracted by the character string extraction unit from the map information is a character string related to a predetermined area near a current position.
15. An image management method comprising steps of:
generating a captured image of an area near a mobile body by an image capturing unit;
detecting whether the mobile body is in a moving state or in a stopping state;
obtaining current position information by extracting, from map information, at least one character string representing a name of a place, a facility, a road, or an intersection at or near the current position; and
determining a name of the captured image by combining a predetermined character string that is prepared beforehand, the moving state, and the current position information.
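Read as a whole, claims 1 through 11 describe a small per-intersection control flow: capture behind the mobile body before a turn (claims 4 and 5), capture ahead after a turn (claims 2 and 3), discard the rear shot when the mobile body goes straight through (claim 6), and show the stored view on approach (claim 1), while claims 12 through 15 correspond to the naming logic sketched earlier. The Python sketch below is one hypothetical rendering of the intersection flow; every class, method, and parameter name is an assumption, not language from the claims.

```python
class IntersectionViewRecorder:
    """Hypothetical control-unit flow for claims 1-7 and 10."""

    def __init__(self, front_camera, rear_camera, storage, rear_mode_on=True):
        self.front_camera = front_camera  # first image capturing unit (claim 2)
        self.rear_camera = rear_camera    # second image capturing unit (claim 4)
        self.storage = storage            # second storage unit, keyed by intersection
        self.rear_mode_on = rear_mode_on  # ON/OFF capture mode of claim 7

    def on_direction_indicator(self, intersection, indicator_on):
        # Claim 5: the direction indicator switching ON before the
        # intersection gives the timing for the rear-camera shot.
        if indicator_on and self.rear_mode_on:
            self.storage.put(intersection, self.rear_camera.capture(),
                             pending=True)

    def on_intersection_passed(self, intersection, turned, turn_direction=None):
        if turned:
            # Claims 2 and 3: once the direction sensor confirms the turn,
            # the front camera captures the road just entered; the turn
            # direction is stored with the image (claim 10).
            self.storage.put(intersection, self.front_camera.capture(),
                             turn_direction=turn_direction)
        else:
            # Claim 6: the mobile body went straight through, so the
            # pending rear-camera image is deleted.
            self.storage.delete_pending(intersection)

    def on_approach(self, intersection, display):
        # Claim 1: before the mobile body reaches the intersection,
        # display the stored view of the area beyond the turn.
        image = self.storage.get(intersection)
        if image is not None:
            display.show(image)
```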
US12/596,722 2007-07-31 2008-07-28 Navigation device and image management method Active 2030-02-15 US8374779B2 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2007199410A JP4900599B2 (en) 2007-07-31 2007-07-31 Navigation device
JP2007-199410 2007-07-31
JP2007-248510 2007-09-26
JP2007248510A JP4841527B2 (en) 2007-09-26 2007-09-26 Car navigation system and image data management method
PCT/JP2008/063504 WO2009017085A1 (en) 2007-07-31 2008-07-28 Navigation device and image management method

Publications (2)

Publication Number Publication Date
US20100138153A1 true US20100138153A1 (en) 2010-06-03
US8374779B2 US8374779B2 (en) 2013-02-12

Family

ID=40304315

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/596,722 Active 2030-02-15 US8374779B2 (en) 2007-07-31 2008-07-28 Navigation device and image management method

Country Status (4)

Country Link
US (1) US8374779B2 (en)
KR (2) KR101057245B1 (en)
CN (2) CN101688777A (en)
WO (1) WO2009017085A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101900571A (en) * 2010-08-13 2010-12-01 深圳市凯立德计算机系统技术有限公司 Display method of navigation information and navigation apparatus
CN102685850A (en) * 2011-03-15 2012-09-19 深圳富泰宏精密工业有限公司 Signal searching system and signal searching method
JP5708307B2 (en) * 2011-06-30 2015-04-30 アイシン・エィ・ダブリュ株式会社 Navigation system, navigation method, and navigation program
JP2013024709A (en) * 2011-07-20 2013-02-04 Nissan Motor Co Ltd Navigation device and navigation method
CN102831669A (en) * 2012-08-13 2012-12-19 天瀚科技(吴江)有限公司 Driving recorder capable of simultaneous displaying of map and video pictures
GB201219799D0 (en) * 2012-11-02 2012-12-19 Tomtom Int Bv Map Matching methods
CN103162709B (en) * 2013-03-11 2015-12-23 沈阳美行科技有限公司 The method for designing of enlarged drawing junction ahead prompting thumbnail in a kind of guider
JP2015041969A (en) * 2013-08-23 2015-03-02 ソニー株式会社 Image acquisition apparatus, image acquisition method, and information distribution system
JP6593786B2 (en) * 2015-02-23 2019-10-23 株式会社グッドワークコミュニケーションズ Road guide server, road guide program
CN106469513A (en) * 2015-08-18 2017-03-01 中兴通讯股份有限公司 A kind of method of route guidance, device and mobile terminal

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5960214A (en) * 1982-09-29 1984-04-06 Nippon Denso Co Ltd Road guiding device for vehicle
JPH08171697A (en) * 1994-12-19 1996-07-02 Kyocera Corp Road guidance system
JPH0914976A (en) 1995-06-29 1997-01-17 Matsushita Electric Ind Co Ltd On-vehicle navigation system
JPH11132915A (en) 1997-10-31 1999-05-21 Fuji Heavy Ind Ltd Failure-diagnosing apparatus
JP3308247B2 (en) * 1999-09-08 2002-07-29 本田技研工業株式会社 Electronic camera system and map information display device
JP2002296061A (en) * 2001-03-29 2002-10-09 Hitachi Software Eng Co Ltd Guidance information providing method and guidance information providing program
US6668227B2 (en) * 2002-04-10 2003-12-23 Matsushita Electric Industrial Co., Ltd. Navigation apparatus
US7063256B2 (en) 2003-03-04 2006-06-20 United Parcel Service Of America Item tracking and processing systems and methods
JP3950085B2 (en) * 2003-06-10 2007-07-25 株式会社つくばマルチメディア Map-guided omnidirectional video system
JP2005275978A (en) * 2004-03-25 2005-10-06 Fuji Photo Film Co Ltd Photographing apparatus
JP2006221362A (en) 2005-02-09 2006-08-24 Sanyo Electric Co Ltd Navigation apparatus
JP2007115077A (en) * 2005-10-21 2007-05-10 Pioneer Electronic Corp Communication terminal, information display method, information display program and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6199014B1 (en) * 1997-12-23 2001-03-06 Walker Digital, Llc System for providing driving directions with visual cues
US20070067104A1 (en) * 2000-09-28 2007-03-22 Michael Mays Devices, methods, and systems for managing route-related information
US7827507B2 (en) * 2002-05-03 2010-11-02 Pixearth Corporation System to navigate within images spatially referenced to a computed space
US7460953B2 (en) * 2004-06-30 2008-12-02 Navteq North America, Llc Method of operating a navigation system using images
US7630830B2 (en) * 2004-09-14 2009-12-08 Alpine Electronics, Inc. Navigation apparatus, driving direction guidance method, and navigation system
US20060287819A1 (en) * 2005-01-18 2006-12-21 Christian Brulle-Drews Navigation system with intersection and three-dimensional landmark view
US20070192020A1 (en) * 2005-01-18 2007-08-16 Christian Brulle-Drews Navigation System with Animated Intersection View
US20060256212A1 (en) * 2005-05-16 2006-11-16 Samsung Electronics Co., Ltd. Image photographing apparatus, method of storing data for the same, and navigation apparatus using location information included in image data
US20100283867A1 (en) * 2005-05-16 2010-11-11 Choi Hyong-Uk Image photographing apparatus, method of storing data for the same, and navigation apparatus using location information included in image data
US20100070162A1 (en) * 2006-10-13 2010-03-18 Navitime Japan Co., Ltd. Navigation system, mobile terminal device, and route guiding method

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090012708A1 (en) * 2007-01-05 2009-01-08 Jui-Chien Wu Personal navigation devices and related methods
US20100329662A1 (en) * 2008-08-09 2010-12-30 Cheng-Chien Hsu Photographing Device of a Global Positioning System
US20120101720A1 (en) * 2010-10-25 2012-04-26 Mitac International Corp. Storage medium saving program and capable of being read by computer, computer program product, navigator and control method thereof
US9212927B2 (en) 2011-06-30 2015-12-15 Here Global B.V. Map view
WO2013092058A1 (en) * 2011-12-21 2013-06-27 Navteq B.V. Image view in mapping
US9322665B2 (en) 2012-06-05 2016-04-26 Apple Inc. System and method for navigation with inertial characteristics
US10006769B2 (en) 2012-06-11 2018-06-26 Samsung Electronics Co., Ltd. Terminal apparatus, method and system for setting up destination and providing information
US10499207B2 (en) 2012-06-11 2019-12-03 Samsung Electronics Co., Ltd. Service providing system including display device and mobile device, and method for providing service using the same
US9256983B2 (en) 2012-06-28 2016-02-09 Here Global B.V. On demand image overlay
US9256961B2 (en) 2012-06-28 2016-02-09 Here Global B.V. Alternate viewpoint image enhancement
US10030990B2 (en) 2012-06-28 2018-07-24 Here Global B.V. Alternate viewpoint image enhancement
US9194715B2 (en) * 2012-10-14 2015-11-24 Mitac International Corp. System and related method for offering navigation guidance
US20140107915A1 (en) * 2012-10-14 2014-04-17 Shan-Chih Yang System and related method for offering navigation guidance
US20160082597A1 (en) * 2013-05-22 2016-03-24 Neurala, Inc. Methods and apparatus for early sensory integration and robust acquisition of real world knowledge
US10300603B2 (en) * 2013-05-22 2019-05-28 Neurala, Inc. Methods and apparatus for early sensory integration and robust acquisition of real world knowledge
US10974389B2 (en) 2013-05-22 2021-04-13 Neurala, Inc. Methods and apparatus for early sensory integration and robust acquisition of real world knowledge
US11070623B2 (en) 2013-05-22 2021-07-20 Neurala, Inc. Methods and apparatus for iterative nonspecific distributed runtime architecture and its application to cloud intelligence
US10317237B2 (en) * 2014-01-21 2019-06-11 Denso Corporation Navigation apparatus displaying information related to target intersection
US10161868B2 (en) 2014-10-25 2018-12-25 Gregory Bertaux Method of analyzing air quality
EP3477466A1 (en) * 2017-10-31 2019-05-01 Nokia Technologies Oy Provision of virtual reality content

Also Published As

Publication number Publication date
KR101057245B1 (en) 2011-08-16
KR20090130118A (en) 2009-12-17
CN101688777A (en) 2010-03-31
US8374779B2 (en) 2013-02-12
CN102589564A (en) 2012-07-18
KR20110079782A (en) 2011-07-07
WO2009017085A1 (en) 2009-02-05

Similar Documents

Publication Publication Date Title
US8374779B2 (en) Navigation device and image management method
EP2442072B1 (en) Route search device and route search method
JP4622676B2 (en) Car navigation system
US20090198443A1 (en) In-vehicle navigation device and parking space guiding method
JP4120651B2 (en) Route search device
JPH1123299A (en) Car navigation system and storage medium
JP2006194665A (en) Portable terminal with navigation function
JP2006038558A (en) Car navigation system
US8428865B2 (en) Navigation system and roadway search method
JP2008232938A (en) Route guidance device
JP2011038970A (en) Navigation system
JP2002342330A (en) Navigation system
JP2006313167A (en) Navigation device for vehicle and route guidance method
JP4841527B2 (en) Car navigation system and image data management method
JP2002350145A (en) Method and apparatus for present position detection for navigation
JP5115862B2 (en) Route guidance device
JP2010197083A (en) Map data updating device
JP5279232B2 (en) Navigation device
JP2010181265A (en) Navigation system and program for navigation
JP4274913B2 (en) Destination search device
JP2009014692A (en) Navigation device
JP2011179933A (en) Navigation apparatus and program for navigation
JP2007132724A (en) Navigation device and facility search method
JP3770325B2 (en) Navigation device
JP4900599B2 (en) Navigation device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SANYO CONSUMER ELECTRONICS CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ABE, YUICHI;WATANABE, RYO;YAMANE, KAZUHIRO;AND OTHERS;REEL/FRAME:023395/0690

Effective date: 20090903

Owner name: SANYO ELECTRIC CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ABE, YUICHI;WATANABE, RYO;YAMANE, KAZUHIRO;AND OTHERS;REEL/FRAME:023395/0690

Effective date: 20090903

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12