US5877751A - Touch display type information input system - Google Patents


Info

Publication number
US5877751A
US5877751A (application US08/909,765)
Authority
US
United States
Prior art keywords
switch
display
touching
reaction
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08/909,765
Inventor
Hiroyuki Kanemitsu
Kyomi Morimoto
Yukiyoshi Suzuki
Kazuteru Maekawa
Hitoshi Asano
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aisin AW Co Ltd
Original Assignee
Aisin AW Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP22846694A external-priority patent/JP3330239B2/en
Priority claimed from JP24437394A external-priority patent/JP3469329B2/en
Application filed by Aisin AW Co Ltd filed Critical Aisin AW Co Ltd
Priority to US08/909,765 priority Critical patent/US5877751A/en
Application granted granted Critical
Publication of US5877751A publication Critical patent/US5877751A/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G01C 21/3611 Destination input or retrieval using character input or menus, e.g. menus of POIs
    • G01C 21/3614 Destination input or retrieval through interaction with a road map, e.g. selecting a POI icon on a road map
    • G01C 21/3664 Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • G01C 21/367 Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G01C 21/3673 Labelling using text of road map data items, e.g. road names, POI names
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • This invention relates to a touch display type information input system for inputting operating information by detecting the touching of a switch displayed on a screen.
  • Radio sets, audio sets, air conditioners and many other accessory units which have no direct bearing on the running of a vehicle have been mounted in vehicles, and the number of such accessory units is increasing.
  • Among these accessory units are navigation units, and vehicles with navigation units mounted therein are increasing in number.
  • A navigation unit serves to assist driving by displaying the present location on a map, and many navigation units undertake route guidance by displaying the route to a destination when the destination is input.
  • Such a navigation unit requires a display for the map display.
  • In addition, various operations are necessary for the input of the destination and other purposes.
  • However, the space available within a vehicle is limited, and it is impossible to provide exclusive switches for the respective necessary operations.
  • Accordingly, the display surface is utilized as a touch switch panel, that is, it is utilized as a switch.
  • In some systems, the same display is utilized as a switch for operating an air conditioner, an audio set, etc., thus dispensing with some of the dedicated switches.
  • There are many different types of touch switch panel, such as an electrostatic capacitance type, an optical type, etc. In many cases, however, irrespective of the type of touch switch panel used, the touch detection area is slightly above the actual display surface.
  • In the electrostatic capacitance type, use is made of deformation of a film provided on the display surface, and in the optical type, blocking of a light beam provided along the display surface is detected. In either case the actual detection area is above the display surface. This leads to a problem that an error is produced between a switch displayed on the display and the actual detection position.
  • In the optical type, an acrylic resin sheet for screen protection and an LCF (light control film) for suppressing reflection of light by the screen surface are provided between the switch display on the screen and the detection light beam.
  • The thickness thus provided constitutes a corresponding distance between the switch display and the reaction point. In the optical type, therefore, the problem noted above is particularly pronounced.
  • Moreover, a display provided in a vehicle cannot be disposed directly in front of the driver's seat; it has to be disposed at an intermediate position between the driver's seat and the passenger seat. Inevitably, therefore, the driver's eyesight is directed obliquely with respect to the display. This leads to an error between the switch display and the area directly above the touch reaction point (shown shaded), as shown in FIG. 1, and it is impossible to obtain correct detection of the driver's point of touch.
  • Japanese Utility Model Publication No. 121641/1987 shows precluding the deviation between the display switch and the reaction area due to a deviation of the driver's eyesight direction by shifting the switch display position (i.e., the whole display on the screen) according to the operator's position.
  • In that system, seating detection switches are provided on the driver and passenger seats, and the switch display position is shifted by identifying the operator from the status of the seating detection switches. More specifically, when the driver alone is present, the display position is shifted to the left, while when both the driver and a passenger are present it is held at the center.
  • However, the reaction points cannot be arranged very closely. Therefore, where a plurality of switches are displayed, the switch display may be comparatively small, and the number of reaction points per switch display may be considerably small.
  • In such a case, the switch display and the reaction points are deviated as shown in FIG. 3.
  • Although the switches a and b are equal in size, the switch a is constituted by two reaction points, while the switch b is constituted by four reaction points.
  • An object of the invention is to provide a screen touch type information input system which can realize a sufficient improvement in operability.
  • According to the invention, a reaction area corresponding to a display switch is enlarged in consideration of the operator's eyesight direction. It is thus possible to increase the accuracy of detection of a switch operation made by looking at the display obliquely, while maintaining the operability when the display is viewed from directly in front.
  • FIG. 1 is a view showing the relationship between eyesight direction and reaction points
  • FIG. 2 is a view showing an example of the relationship between switch display and reaction points
  • FIG. 3 is a view showing different examples of relationships between switch display and reaction points
  • FIG. 4 is a block diagram showing an embodiment of the invention.
  • FIG. 5 is a view showing a touch panel
  • FIG. 6 is a view showing an array of reaction points in a touch panel
  • FIG. 7 is a flow chart illustrating the operation of the embodiment
  • FIG. 8 is a view showing an example of enlarging a normal reaction area
  • FIG. 9 is a view showing an example of enlarging a reaction area with consideration of inter-switch interval
  • FIG. 10 is a view showing an example of enlarging an area in the case when the use of an adjacent switch is prohibited;
  • FIG. 11 is a flow chart illustrating operation in the case when there is a switch, the use of which is prohibited;
  • FIG. 12 is a view showing an example of display in the case when there is a switch, the use of which is prohibited;
  • FIG. 13 is a schematic showing the entire system structure
  • FIG. 14 is a flow chart illustrating the operation of the embodiment
  • FIG. 15 is a view showing a storage state of a memory
  • FIG. 16 is a view showing a destination setting method choice display
  • FIG. 17 is a view showing a facility name choice display
  • FIG. 18 is a view showing a prefectural district list display
  • FIG. 19 is a view showing a facility list display
  • FIG. 20 is a view showing a facility neighborhood map display
  • FIG. 21 is a view showing a position change display
  • FIG. 22 is a view showing a search condition confirmation display
  • FIG. 23 is a view showing a passing point setting display
  • FIG. 24 is a view showing another search condition confirmation display
  • FIG. 25 is a view showing a route search state display
  • FIG. 26 is a view showing a whole route guide display
  • FIG. 27 is a view showing a route guide state display
  • FIG. 28 is a view showing a guide display in the case of departure from the route
  • FIG. 29 is a view showing an enlarged-scale intersection neighborhood display
  • FIG. 30 is a view showing a guide road list display
  • FIG. 31 is a view showing a route change display
  • FIG. 32 is a view showing a route correction display
  • FIG. 33 is a view showing a destination arrival guide display.
  • the embodiment of the system comprises a touch panel 10, which includes an LCD display 10a and an optical touch panel 10b provided on the surface of the LCD display 10a.
  • the LCD display 10a is connected via a driver 20 to a display ECU (electronic control unit) 22.
  • the display ECU impresses a predetermined voltage via the driver 20 on liquid crystal of the LCD display 10a at a desired position thereof for display.
  • the touch panel 10b is also connected to the display ECU 22, and it detects touch to some of the reaction points arranged in a matrix array.
  • A main ECU 30 is connected to the display ECU 22. According to a signal from the main ECU 30, the display ECU 22 determines the contents of display on the LCD display 10a. In this embodiment, the main ECU 30 feeds an RGB signal (i.e., a color television signal) for the display on the LCD display 10a, and according to this signal the display ECU 22 controls the voltage output of the driver 20 to obtain a desired color display. Meanwhile, on-off information (i.e., touch information) about the individual reaction points on the touch panel 10b is transmitted directly to the main ECU 30. The main ECU 30 detects the on-off state of a display switch from the display on the LCD display 10a and the reaction point on-off information.
  • In this embodiment, the display and touch detection are handled by the display ECU 22.
  • Alternatively, such processes may be executed by the main ECU 30.
  • As a further alternative, a separate ECU may be provided which executes only the processes of touch detection and judgment.
  • A logic circuit for touch judgment may also be constructed in hardware.
  • To the main ECU 30 are further connected a GPS (global positioning system) 32 for detecting the absolute position (i.e., latitude, longitude and height) of the vehicle by receiving radio waves from an artificial satellite;
  • a CD-ROM 34 for reproducing an optical disk with map information stored therein;
  • a vehicle speed sensor 36 for detecting the running speed of the vehicle;
  • a geo-magnetic sensor 38 for detecting the bearing of the vehicle; and
  • a parking switch 40 for detecting the parking state of the vehicle.
  • The main ECU 30 thus feeds signals concerning the display by determining the present position from the absolute position detected by the GPS 32 and the covered distance obtained from the vehicle speed sensor 36 and the geo-magnetic sensor 38, etc., and determining the content of display on the LCD display 10a from a map obtained from the CD-ROM 34, etc. Further, when displaying switches to be operated, the main ECU 30 determines the relationship between the display switch and the reaction points on the touch panel 10b. Further, a TV tuner is connected to the main ECU 30, so that it is possible to produce a TV display on the LCD display 10a. It is also possible to operate an air conditioner, etc. with the touch panel 10.
  • The optical touch panel 10b in this embodiment is shown in FIG. 5. It has light-emitting and light-receiving element rows disposed around the LCD display 10a. More specifically, along the right and lower sides of the LCD display 10a a plurality of infrared LEDs (light-emitting diodes) 12 are provided in a row, and along the left and upper sides a plurality of photo-transistors 14 are provided in a row. Infrared radiation from the light-emitting diodes 12 is thus received by the respectively opposing photo-transistors 14.
  • The light-emitting diodes 12 and the photo-transistors 14 thus form a matrix with a corresponding number of reaction points, as shown in FIG. 6.
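As a rough sketch of this matrix, a 13×13 grid can be assumed (the flow-chart description numbers the reaction points 1 to 169); the function names and the numbering convention below are illustrative, not from the patent.

```python
# Hypothetical sketch: numbering the reaction points formed by crossing
# horizontal and vertical infrared beams.  A 13x13 grid is assumed, with
# points numbered 1..169 left to right, top to bottom.
GRID = 13

def reaction_point_no(col, row):
    """Map a blocked (col, row) beam pair to a 1-based reaction point No."""
    return row * GRID + col + 1

def touched_points(blocked_cols, blocked_rows):
    """A finger blocks one or more beams on each axis; every crossing of a
    blocked vertical and a blocked horizontal beam is a touched point."""
    return sorted(reaction_point_no(c, r)
                  for c in blocked_cols for r in blocked_rows)
```

A single finger typically blocks adjacent beams on both axes, so a touch yields a small cluster of point Nos. rather than exactly one.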
  • the main ECU 30 sets the reaction area of a switch displayed on the LCD display 10a in correspondence to the shape of the switch. It detects the touch to the reaction points in the reaction area to determine the on-off state of the switch.
  • the detection of the on-off state of switch will now be described with reference to the flow chart of FIG. 7.
  • In this example, the area directly above the display of the switch is initially set as the reaction area, and the reaction area is effectively enlarged in the judging process.
  • When the touch panel 10b is touched in the presence of a switch display (step S11), the Nos. (1 to 169) of the touched reaction points are detected by the display ECU 22 and taken into the main ECU 30 (step S12).
  • The main ECU 30 then checks whether there is any meaningless reaction point (i.e., a reaction point not corresponding to any reaction area) (step S13). If no meaningless reaction point is found, it is determined that the switch corresponding to the meaningful reaction points (i.e., reaction points corresponding to a reaction area) has been operated (step S14).
  • If a meaningless reaction point is found in the step S13, a check is made as to whether any touched reaction point lies in a reaction area corresponding to a switch (step S15).
  • If not, it is determined that the operated switch is the one whose reaction area contains the reaction point with the No. smaller by one than that of the touched meaningless reaction point (i.e., the reaction point on the left side of the touched meaningless reaction point).
  • The left side reaction point is selected on the assumption that the operator is the driver seated in the right side seat. If it is known that preference should be given to the left side seat, the switch corresponding to the No. larger by one (i.e., the right side reaction point) may be determined instead. For a left end reaction point this process is null, because the reaction point with the No. smaller by one is at the right end. In most cases of usual operation there is only a single switch located on the left side of a meaningless reaction point. When there are two or more such switches, the numbers of reaction points corresponding to these switches may be referred to for verification.
  • If it is found in the step S15 that there are reaction points in a reaction area corresponding to a switch, the numbers of such reaction points and of the meaningless reaction points are compared (step S17). If the number of reaction points in the reaction area corresponding to the switch is greater, it is determined that the switch corresponding to those reaction points has been operated (step S18).
  • If it is found in the step S17 that the number of meaningless reaction points is greater, a check is made of the positional relationship between the meaningless reaction points and the reaction area corresponding to the switch (step S19). According to the result of this check, the switch operation is determined as follows. If the meaningless reaction points are located under a reaction area corresponding to a switch, it is determined that the switch immediately over the meaningless reaction points has been operated (step S20). If the meaningless reaction points are located over the reaction area corresponding to a switch, it is determined that the switch immediately under the meaningless reaction points has been operated (step S21).
  • If the meaningless reaction points are located on the right side of the reaction area corresponding to a switch, it is determined that the switch on the immediate left side of the meaningless reaction points has been operated (step S22). If the meaningless reaction points are located on the left side of the reaction area corresponding to a switch, it is determined that the switch on the immediate right side of the meaningless reaction points has been operated (step S23).
  • The result of the check of the positional relationship in the step S19 may be obtained from a map or from arithmetic processing of formulas.
  • In this way, the touch to a meaningless reaction point can be regarded as the touch to a nearby reaction point.
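The judging flow above can be sketched in a few lines. This is a hedged approximation, not the patent's implementation: the names are invented, and the positional check of steps S19 to S23 is collapsed into simply crediting the adjacent switch whose reaction area the stray points border.

```python
# Hedged sketch of the judging flow of FIG. 7 (steps S13-S23).
def resolve_touch(touched, areas):
    """touched: set of touched reaction point Nos.
    areas: dict mapping switch name -> set of point Nos. directly over it.
    Returns the switch judged to have been operated, or None."""
    all_area_points = set().union(*areas.values()) if areas else set()
    hits = {name: touched & pts for name, pts in areas.items()
            if touched & pts}
    meaningless = touched - all_area_points
    if not meaningless:                       # S13 -> S14: all points meaningful
        return max(hits, key=lambda n: len(hits[n])) if hits else None
    if not hits:                              # S15 "no": only stray points were
        for p in meaningless:                 # touched; assume a driver on the
            for name, pts in areas.items():   # right and credit the switch of
                if p - 1 in pts:              # the point with the No. smaller
                    return name               # by one (the left-side point)
        return None
    best = max(hits, key=lambda n: len(hits[n]))
    if len(hits[best]) >= len(meaningless):   # S17 -> S18: majority in-area
        return best
    return best                               # S19-S23 simplified: credit the
                                              # bordering switch
```

For example, with two switch areas side by side, touching only a stray point just to the right of switch a would still register as an operation of switch a.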
  • In the above example, the area directly over a switch display was set directly as the reaction area, and the reaction area was changed only when the touch to a predetermined meaningless reaction point was detected.
  • In this way, various reaction areas can be comparatively readily set.
  • Alternatively, since the main ECU 30 has internally memorized the display switch positions, it is possible to set a reaction area on an enlarged scale in advance. Further, it is possible to enlarge a reaction area still further, in the manner of FIG. 7 as described, upon detection of the touch to a meaningless reaction point in a check executed after the setting of the enlarged reaction area.
  • For example, reaction areas may be allotted to displayed switches as shown in FIGS. 8 to 10.
  • In FIG. 8, a reaction area is set which includes the entire area directly over the display switch and is enlarged to the right and downward.
  • The touch of a reaction point in this reaction area is judged to be an operation of the switch.
  • Since the reaction area is merely enlarged, correct detection can also be obtained when the switch operation is made with the operator's face brought to the front of the screen.
  • Where the interval between adjacent display switches is large, the enlarged reaction area is made greater according to the interval. This is done because the possibility of erroneous detection due to reaction area enlargement is then low. By determining the extent of reaction area enlargement according to the display switch interval, the switch operation can be detected more effectively.
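The interval-dependent enlargement can be sketched as follows; the rectangle representation, the rightward/downward direction (a driver on the right is assumed) and the 0.5 factor are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of FIG. 9's idea: the reaction area of a switch is
# enlarged to the right and downward, and the enlargement grows with the
# gap (in reaction-point columns/rows) to the neighbouring switch.
def enlarge_area(rect, gap_right, gap_below, factor=0.5):
    """rect is (x0, y0, x1, y1) in reaction-point units, inclusive.
    Returns the rectangle enlarged by a fraction of each gap."""
    x0, y0, x1, y1 = rect
    return (x0, y0,
            x1 + int(gap_right * factor),   # absorb part of the gap rightward
            y1 + int(gap_below * factor))   # and part of the gap downward
```

With a wide gap the enlarged area absorbs more of it, which matches the text's point that a large interval makes erroneous detection unlikely.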
  • In a vehicle-mounted system, switches which cannot be operated during driving are also displayed.
  • For example, the setting of a destination or the like is made according to the input of the kind of destination (such as a department store, a golfing place, a station, etc.), an address, a telephone No., etc.
  • Such an operation is comparatively complicated and is executed by watching the display.
  • During driving, therefore, such a switch is displayed in thin (toned-down) form (as shown shaded in the Figures) so that it cannot be operated. If such a switch is touched, the operation is made null, while a message is displayed that "the switch cannot be operated during driving of the vehicle, so please use the switch after parking."
  • In this state, the possibility of operating a switch which cannot be operated is low.
  • Accordingly, the reaction area is further enlarged as shown in FIG. 10, so that the enlarged reaction area of the operable switch covers up to the reaction points adjacent to a switch which cannot be operated during driving of the vehicle.
  • The switch operation recognition rate can thus be increased by allotting all the reaction points between the two switches to the reaction area of the operable switch.
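A minimal sketch of this allotment, under the assumption that the operable switch lies to the left of the prohibited one and that reaction areas are rectangles of point columns/rows (both assumptions are illustrative):

```python
# Sketch of FIG. 10's allotment: while driving, all reaction points in the
# gap between an operable switch and an adjacent prohibited switch are
# allotted to the operable one.
def allot_gap(operable, prohibited):
    """Both arguments are (x0, y0, x1, y1) reaction-point rectangles.
    Returns the operable switch's enlarged reaction area, extended to the
    column just before the prohibited switch."""
    ox0, oy0, _, oy1 = operable
    px0 = prohibited[0]
    return (ox0, oy0, px0 - 1, oy1)
```

Since touches on the prohibited switch are nulled anyway, claiming the whole gap for the operable switch costs nothing and raises its recognition rate.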
  • Referring to FIG. 11, when the running state of the vehicle is detected, any inoperable switch is toned down (step S31). Specifically, of the display switches shown in FIG. 12, the "TELEPHONE NO." switch for setting the destination from a telephone No., the "ADDRESS" switch for setting a destination from an address, the facility name switches for setting a destination from a facility name, the "MEMORIZED POINT" switch for setting a destination from among memorized points, and the "PREVIOUS DEPARTURE POINT" switch are toned down.
  • The "HOME" switch is held operable without toning it down (in the Figure, only the "HOME" switch is shown shaded). This is because the destination can be set by merely touching the "HOME" switch, the operation thus being simple and capable of execution without any trouble during driving.
  • Then, the area covering the reaction points corresponding to each operable switch is enlarged (step S32).
  • The running state of the vehicle may be detected with reference to whether the vehicle speed is zero.
  • Alternatively, the parking position of the shift lever may be detected using the parking position switch 40.
  • The parking state of the vehicle may also be detected through detection of an "on" state of the side brake.
  • The routine shown in FIG. 11 is called by an interrupt when the running state of the vehicle is detected.
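The parked-state checks above can be sketched as a small predicate. The signal names stand in for the vehicle speed sensor, the shift lever's parking position (switch 40) and the side brake, and the way the conditions are combined is an assumption for illustration.

```python
# Hedged sketch of the running/parking checks described in the text.
def vehicle_parked(speed, shift_in_park, side_brake_on):
    """Treat the vehicle as parked when it is stationary and either the
    shift lever is in the parking position or the side brake is on."""
    return speed == 0 and (shift_in_park or side_brake_on)

def restricted_switches_enabled(speed, shift_in_park, side_brake_on):
    # Complicated destination-setting switches stay toned down while running.
    return vehicle_parked(speed, shift_in_park, side_brake_on)
```

An interrupt-driven version would call the toning-down routine whenever this predicate flips from parked to running, matching the FIG. 11 description.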
  • In the above embodiment, the display itself is fixed.
  • Among the various available displays, however, are those capable of angle adjustment; that is, there are displays which can be directed toward the driver's seat in some cases and toward the passenger seat in others. With such a display, whether the operator is the driver or the passenger can be determined from the display installation angle. It is thus possible to provide a display position detector (which may detect rotation of the display, as the display is usually rotated about its center line) and determine the reaction area enlargement direction by identifying the operator according to the result of detection.
  • The method of estimating the operator's eyesight direction is not limited to the above display angle adjustment, and the same effects are obtainable with other methods.
  • A conceivable different method, for instance, is one in which a "PASSENGER SEAT OPERATION" key is displayed; if this key is touched, it is determined that the operator's eyesight direction is from the passenger seat.
  • This structural example concerns a map call/display system for a navigation system, which can call and display a map corresponding to a specified item.
  • For example, a map may be called by specifying a facility name such as a golfing place name, a station name, etc., by specifying a telephone No. or an address, or by specifying a memorized place as noted before.
  • Prior art navigation systems can call maps according to such input. As an example, when the home of an acquaintance which is located behind a certain golfing place is chosen as a destination, first the name of the golfing place is input as a facility name. Then a map of the neighborhood of the golfing place is called and displayed. On the displayed map, the final destination is set, and a route search is made.
  • Alternatively, a pertinent map may be called and displayed by inputting comparatively rough information, such as a city name or an urban telephone exchange name, and the final destination may be set on the displayed map.
  • However, a map which is called in the above way according to the input information may sometimes be of a scale which is not suited to the setting of the final destination.
  • For example, the same scale of display is not suitable both when calling a private shop having a small installation area and when calling a golfing place having a vast area.
  • In some cases the called map cannot be contained within the screen.
  • The instant structural example seeks to solve the above problems.
  • FIG. 13 is a schematic showing the entire structure of the example of a map call/display system.
  • A display/touch panel 110 as shown serves as the map display means.
  • An electro multi-television ECU 112 is connected to the display/touch panel 110.
  • the electro multi-television ECU 112 has roles of providing displays and detecting the touching of the touch panel.
  • a CD-ROM changer 114 is connected to the electro multi-television ECU 112.
  • A map CD-ROM 114b is set in the CD-ROM changer 114 via a magazine 114a.
  • the map CD-ROM 114b serves as memory means for storing map data.
  • the electro multi-television ECU 112 serves as map call means for obtaining desired map information from the CD-ROM changer 114.
  • a GPS receiver 120 is connected to the electro multi-television ECU 112.
  • The GPS receiver 120 receives signals from an artificial satellite via a GPS antenna 120a to detect the absolute position (longitude and latitude) of the vehicle and supplies the detection data to the electro multi-television ECU 112. The absolute position of the vehicle can thus be detected in the electro multi-television ECU 112.
  • Various sensors 122 for detecting the running state of the vehicle are connected to the electro multi-television ECU 112. The running condition thus can be ascertained at all times.
  • Among the sensors 122 are a geo-magnetism sensor 122a, a wheel sensor 122b, a steering sensor 122c and a distance sensor 122d.
  • the electro multi-television ECU 112 can thus grasp the bearing, speed and steering angle of the vehicle and distance covered thereby from the results of detection by the sensors 122.
  • The electro multi-television ECU 112 always calculates the present position from the detection signals from the sensors 122, and makes a higher accuracy detection of the present position by combining the result of detection from the GPS receiver 120 with the calculated present position.
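The combination of dead reckoning with the GPS fix can be sketched as below. The flat x/y coordinates, function names and the blend weight are assumptions for the example; the patent does not specify the fusion rule.

```python
# Illustrative sketch: dead reckoning from bearing and covered distance,
# periodically corrected toward an absolute GPS fix.
import math

def dead_reckon(pos, bearing_deg, distance):
    """Advance the position along the detected bearing (0 deg = +y, north)."""
    x, y = pos
    rad = math.radians(bearing_deg)
    return (x + distance * math.sin(rad), y + distance * math.cos(rad))

def fuse_with_gps(dr_pos, gps_pos, gps_weight=0.3):
    """Pull the dead-reckoned position part way toward the GPS fix."""
    return tuple(d + gps_weight * (g - d) for d, g in zip(dr_pos, gps_pos))
```

Dead reckoning is smooth but drifts; the GPS fix is absolute but noisy, so blending the two, as the text describes, gives a higher accuracy present position.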
  • the electro multi-television ECU 112 is further supplied with detection signals from various switches 124.
  • Signals from an accessory switch 124a, an ignition switch 124b, a parking brake switch 124c, an alternator 124d and a check terminal 124e are supplied to the electro multi-television ECU 112.
  • the electro multi-television ECU 112 can ascertain running conditions of the vehicle, such as whether the ignition key is "on", whether the vehicle is parked with the parking brake operated, whether the electricity generation state is satisfactory, and whether various accessories mounted in the vehicle are normal.
  • a loudspeaker 126 is connected to the electro multi-television ECU 112.
  • A guidance voice for route guidance is output from the loudspeaker 126.
  • A TV tuner 130 is provided to permit display, on the screen of the display/touch panel 110, of a television signal received by a TV antenna 130a built into the window glass.
  • An air conditioner ECU 132 is further connected to the display/touch panel 110.
  • the air conditioner can thus be operated by a switch displayed on the display/touch panel 110.
  • the system includes an audio CD changer 134a for reproducing a music CD 134b and data from a CD-ROM 134c.
  • in the CD-ROM 134c, data for sight-seeing guides, etc. is stored.
  • video data can be supplied from the CD-ROM 134c to the electro multi-television ECU 112 to display sight-seeing guide displays on the display/touch panel 110.
  • voice data from a music CD 134b and CD-ROM 134c can be supplied through an audio amplifier 136 to loudspeakers 138 to output predetermined voices.
  • an audio head unit 140 is connected to the audio amplifier 136 to permit signals from an FM radio unit, an AM radio unit, etc. to be output to the loudspeakers 138.
  • information for the destination setting is first input by operating the display/touch panel 110.
  • the destination is set according to an address.
  • information "Toyoda City, Akita Prefecture" is input, and it is determied to be the final choice item (step S101).
  • the electro multi-television ECU 112 determines the center position of the input "Toyoda City” and the scale of the display from data stored in the map CD-ROM 114b (step S102).
  • the center position adopted may be the center position of the area (i.e., administrative area) of Toyoda City, the location of the administrative office of the city, etc.
  • the scale of the display has been determined from the area and shape of the pertinent city (i.e., Toyoda City in this case), and this information has been stored. When displaying the area, a scale is required which permits maximum enlargement of the area without a non-display portion. This data is stored together with data of the item (i.e., Toyoda City in this case) in the CD-ROM 114b.
  • the data has already been read out from the CD-ROM 114b into a memory of the electro multi-television ECU 112.
  • An example of the configuration of this data is shown in FIG. 15.
  • the group, the name (i.e., the kanji name in the case of a Japanese name), the longitude and latitude of the destination, and the scale of the display are stored.
  • as the scale of display, one is stored which is best suited, in both the longitude and latitude directions, to the display of the area as described above.
  • the group is the kind of item, such as golfing place, amusement park, etc. (corresponding to the facility name), and it represents the kind of call subject together with the name.
  • the scale of display is stored in the CD-ROM as shown in FIG. 15.
  • Since the center position of display and the display scale are determined in the above way, the pertinent map data are read out (step S103).
  • the read-out map data are displayed on the display/touch panel (step S104).
  • the best suited display scale is determined according to the kind of final choice item as determined in the step S101. If the final choice item determined in the step S101 is not at the level of city, town or village but specifies the address, the display scale is determined under the assumption that the area of the pertinent address is to be displayed. If an urban telephone exchange No. or a golfing place name is specified, a display scale corresponding to the final choice item is read out. Thus, the map call is always made on the basis of the best display scale, and the best map display is carried out.
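As a sketch of this lookup, the FIG. 15 record layout might be modelled as below. The concrete names, coordinates and scale values are invented for illustration and do not come from the patent.

```python
# Hypothetical records in the FIG. 15 layout: group, name, position and the
# display scale best suited to the subject's area (all values invented).
RECORDS = [
    {"group": "city",          "name": "Toyoda City",
     "lat": 35.08, "lon": 137.16, "scale": 100_000},
    {"group": "golfing place", "name": "Example Golf Club",
     "lat": 35.10, "lon": 137.20, "scale": 25_000},
]

def call_map(name):
    # Return the centre position and stored display scale for a call subject,
    # so the map is always called in the scale suited to its kind.
    for rec in RECORDS:
        if rec["name"] == name:
            return (rec["lat"], rec["lon"]), rec["scale"]
    raise KeyError(name)

centre, scale = call_map("Toyoda City")  # a whole city calls a broader scale
```

Because the scale travels with the record, a city-level item and a golfing place automatically call maps at different enlargements.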
  • when a golfing place is displayed on a map with a scale of 1/100,000, it is only partly displayed on the screen.
  • suppose the operator wishes to search for a desired place, for instance a club house parking area, in the golfing place.
  • if the desired place could be found, the destination could be set correctly.
  • if the destination could not be found, it would be set to be a different place, for instance a parking area on the opposite side of the club house.
  • the destination is usually set to be the club house parking area which the general driver decides to be the destination. This means that by setting the destination directly on the display, correct destination setting can be obtained.
  • the golfing place as a destination is set in an optimum size, i.e., a size which is not excessively large or small.
  • the destination setting can thus be completed at the level of intending to find the golfing place.
  • the destination setting can be obtained without a change in the predetermined club house position. In this way, in this example correct destination setting can be obtained.
  • FIG. 16 shows a display for choosing a method of destination setting.
  • the display includes "TELEPHONE NO." and "ADDRESS" keys. It also includes "GOLFING PLACE" and "OTHER FACILITIES" keys for facility names (or groups). It further includes "HOME", "MEMORIZED POINT" and "PREVIOUS DEPARTURE POINT" keys for registered points.
  • a navigation message saying "You can call and set a map of destination neighborhood from telephone No. or facility name.” is produced.
  • the "OTHER FACILITIES” key is touched.
  • a destination facility name choice display as shown in FIG. 17 is provided. At this time, a navigation message saying "Please choose destination facility name.” is also produced.
  • the operator thus chooses and touches a desired key among "AMUSEMENT PARK", “SKIING PLACE”, etc. keys.
  • a prefectural district list display as shown in FIG. 18 is provided.
  • a navigation message saying "Please choose prefecture name corresponding to destination.” is produced.
  • the "AMUSEMENT PARK" key is touched as facility name.
  • the prefectural district containing the pertinent amusement park is determined by touching a corresponding prefectural district key.
  • the prefectural districts would be comparatively broad areas corresponding to the states.
  • the first prefectural district display part in the display has a "COUNTRY" key.
  • a display is provided in which the facilities all over the country are listed, while producing a navigation message saying "Please choose destination name, and you can call a neighborhood map."
  • a desired one of the facilities all over the country can be chosen.
  • a prefectural district is chosen
  • a facility choice display as shown in FIG. 19 is provided, which shows a list of amusement parks in the chosen prefectural district.
  • a desired facility is chosen.
  • a map of the neighborhood of the chosen facility as shown in FIG. 20 is displayed, while a navigation message is produced which says "Please touch 'SET' key, and you can set destination."
  • the destination is set.
  • the best map scale for display is stored as shown in FIG. 15 in correspondence to the area and shape of the specified facility (i.e., amusement park in this case).
  • the called map is in the best scale, and the whole amusement park and the neighborhood thereof are displayed on the screen.
  • the destination is set as the amusement park.
  • the actual destination that is set is the parking area closest to the front gate of the amusement park, which is usually the target.
  • the destination setting is completed by touching the "SET" key.
  • the destination setting is completed, data of roads near the destination or the like are confirmed. If there is no problem, a search condition confirmation display is provided for search up to the destination. Meanwhile, if there is no nearby road or the like suited for navigation after the destination setting, such a display message as "No road suited for navigation is found, so please operate again after moving the destination point to the vicinity of a trunk road” is provided. Then, an arrow mark display like that shown in FIG. 21 is provided for re-setting of the destination.
  • a search condition confirmation display as shown in FIG. 22 is provided.
  • a navigation message saying "Touch "SEARCH START" key, and search of route up to destination will be started on condition that preference is given to toll road.” is produced.
  • whether or not there is a designated passing point is set, and also whether or not preference is given to a toll road.
  • whether a passing point is to be designated on the route up to the destination is chosen by key touch.
  • when a passing point is designated, the best route passing through that point before reaching the destination is chosen in the route search.
  • a passing point setting choice display as shown in FIG. 23 is provided. This display resembles that for the destination setting.
  • the passing point can be set in a manner similar to the destination setting as described above.
  • the "SET" key on the display is touched.
  • the setting of the passing point is completed, and data of the vicinity of the passing point can be confirmed.
  • a search condition choice confirmation display as shown in FIG. 24 is provided.
  • route search is started by touching a "START SEARCH" key.
  • a display as shown in FIG. 25 is provided for route search.
  • when the route search is ended, the whole route from the present point to the destination is displayed on the map. At the same time, the entire distance (in km) to be covered is displayed as shown in FIG. 26. At this time, a navigation message informs that navigation along a route passing through the designated passing point will be made.
  • a "SEARCH” key is displayed. When this "SEARCH” key is touched, a navigation route search is made.
  • a present position display can be provided by depressing a present position switch. It is thus possible to confirm the present position during the navigation.
  • when the search is ended, the whole route and the entire distance to be covered are displayed by touching the "DISPLAY ROUTE" key.
  • if the navigation route search could not be made, this is displayed, and also a "CONFIRM" key is displayed. Then, by touching the "CONFIRM" key, a present position display is provided. By touching the "SEARCH" key after departure and after running past the displayed present position, the navigation route search is started again. In other words, if there was no road suitable for navigation near the present position, this fact is displayed, and it is instructed to carry out a search again in the neighborhood of a trunk road. If the route search could not be made for any reason other than a problem with a road in the vicinity of the present position, merely the fact that the route search could not be made is displayed, and it is instructed to undertake the operation afresh.
  • route navigation up to the destination is started by touching a "START NAVIGATION" key on the display or after 15 seconds of running. That is, a route navigation display as shown in FIG. 27 is provided at this time.
  • the display includes a present position mark shown at the center of the display. Also, a "DURING NAVIGATION" display is provided in an upper portion of the display.
  • the vicinity of the passing point (only when the passing point has been set) and the distance to be covered up to the destination neighborhood are displayed in a right lower portion of the display.
  • the "DURING NAVIGATION" display is removed, and a "RE-SEARCH” key is displayed in a lower portion of the display as shown in FIG. 28.
  • the route search can be made afresh.
  • When an intersection is approached during route navigation, it is detected, and an enlarged-scale display of the vicinity of the intersection is provided for navigation. More specifically, when an intersection is approached by the vehicle, this approach is detected, and an enlarged-scale intersection vicinity display as shown in FIG. 29 is provided. Also, the direction of proceeding along the navigation route is informed by a navigation message saying, for instance, "Please turn to the left at so and so intersection about 300 meters ahead."
  • the usual route navigation display is provided. This display includes a "DISPLAY INTERSECTION" key. By touching this "DISPLAY INTERSECTION" key, the enlarged-scale intersection vicinity display can be restored. When it is detected that the navigated intersection has been passed, the usual route navigation display is restored.
  • a display showing the whole navigation route from the present position to the destination has a "NAVIGATION ROAD LIST" key display.
  • a display showing a navigation route list from the present position to the destination is provided. That is, a display showing the roads extending from the present position to the destination as shown in FIG. 30 is provided.
  • the navigation route is divided at points of change in road type or at interchanges or junctions where roads are entered or left. Also, the distances between adjacent points noted above are shown as actual distances.
  • the display can be scrolled with "FORWARD” and "BACKWARD” keys (only the “FORWARD” key being shown in the Figure). In this way, a display of the whole route up to the destination can be obtained.
  • the "DURING NAVIGATION" display is removed, and a "RE-SEARCH” key is displayed on a lower portion of the display.
  • a new route in the vicinity of the present position is searched.
  • the new navigation route from the vicinity of the present position is displayed. If the re-search could not be made, the previous navigation route is displayed again. At this time, a navigation message saying "New route could not be found, so the previous route will be displayed.” is output. Also, this content is displayed.
  • the re-search, which is instigated by touching the "RE-SEARCH" key, is a search of a route from the present position to a navigation route which has already been searched.
  • a "SEARCH WHOLE ROUTE” key is displayed. By touching this "SEARCH WHOLE ROUTE” key, an entire new route from the present position to the destination is searched. When the search is ended, the whole new navigation route is displayed. The display of this whole new navigation route is like the case when a route search is carried out by setting the destination. Further, when a passing point has been set at the time of the re-search, an "ERASE PASSING POINT” key is displayed simultaneously with the "SEARCH WHOLE ROUTE” key. By touching this "ERASE PASSING POINT” key, a search of a whole new route without any designated passing point is made. When the search of the whole new route after erasing of the passing point is ended, the whole new route without any designated passing point is displayed.
  • the navigation route or search condition may be changed.
  • the whole route display (see FIG. 26) after the end of the navigation route search has a "CHANGE ROUTE” key display. By touching this "CHANGE ROUTE" key, a route change choice display as shown in FIG. 31 is provided.
  • This display has a "RE-SEARCH TO SET DIFFERENT ROUTE" key. By touching this key, a whole new navigation route is searched. When the search is ended, the whole new navigation route is displayed. When the result of the search is the same as before the navigation route change, the navigation route before the navigation route change is restored.
  • the route change choice display also has a "CHANGE SEARCH CONDITION" key. By touching this key, a search condition confirmation display is provided. In this case, it is possible to change the setting of any designated passing point and also to change the setting as to whether or not preference is to be given to toll roads. Further, it is possible to change the setting so as to make a further change of the navigation route.
  • In the presence of the route navigation display, it is possible to correct the navigation route with a change in the present position and also to correct the navigation route from a toll road to a general road or from a general road to a toll road. More specifically, by touching a "CORRECT" key on the route navigation display, a route correction display is provided.
  • This display has eight direction arrow keys for correcting the present position. By touching these keys, the present position is changed to a new position desired on a separate route. Then a "SET" key is touched. As a result, the present position is changed, and a new navigation route from the changed position is displayed.
  • a "CORRECT ROUTE TO GENERAL ROAD” key is displayed on the route correction display. This key is touched if it is desired to use an ordinary road instead of the toll road. As a result, the route is corrected to a new one using ordinary roads only. If ordinary roads only is set as the navigation road along which the vehicle is to run, a "CORRECT ROUTE TO TOLL ROAD” key is displayed on the route correction display. This key is touched if it is desired to use toll roads. As a result, the route is corrected to a new one using toll roads.
  • ending navigation is provided by a navigation message. More specifically, when the destination that has been set is approached by the vehicle, a message saying "You are in the destination neighborhood, and this is an ending navigation message.” is output, thus bringing an end to the navigation.
  • a message that "You can confirm destination on a wide area map.” is displayed on the display.
  • the destination can be confirmed with a wide area map displayed by a map area expanding operation.
  • a display on which arrival at the destination is determined is shown in FIG. 33.
  • a message that it is desired to confirm the destination on an expanded area map is displayed.
  • when a position on the map display is touched, the vicinity of that position is moved to the center of the display.
  • a desired portion of the display can be brought to the center thereof.
  • for this purpose, the edge of the display is touched on the desired side.
  • a displayed map can be inverted by touching a bearing display on it. That is, a map displayed with the north shown upward is inverted to one with the north shown downward.
  • a map display which is too detailed can not be seen during running. Accordingly, during running a substitute display showing main roads only is provided by automatic switching. When the vehicle is parked, the detailed map display is restored.
  • This example is a map call-out system for a navigation system for calling out and displaying maps concerning call subjects specified according to input information. It features memory means for storing map data, map call-out means for calling out map data about a call subject from the memory means, and map display means for displaying the called map, the map call-out means calling out map data of a scale suited to the display according to the kind of map call-out subject, and the display means displaying the call subject map in a scale corresponding to the subject.
  • suited map scales are stored in the memory means in correspondence to the kinds of call-out subjects.
  • the map can be displayed in a scale suited to the subject.
  • a scale which permits display of the whole golfing place is set.
  • a scale which permits sufficient specification of the private house is set. It is thus possible to avoid such a situation that it becomes impossible to obtain correct destination setting due to unnecessary operation made in map display for destination setting or similar purposes.
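The stored scale described above — the maximum enlargement at which the whole area still appears with no non-display portion — could be precomputed along these lines. The candidate scale list and the screen dimensions are assumptions for illustration, not values from the patent.

```python
def best_scale(area_w_m, area_h_m, screen_w_m=0.12, screen_h_m=0.09,
               candidates=(10_000, 25_000, 50_000, 100_000, 200_000)):
    # Walk the candidate scales from most to least detailed and keep the
    # first one at which the whole area fits on the physical screen,
    # i.e. maximum enlargement without a non-display portion.
    for scale in candidates:
        if area_w_m / scale <= screen_w_m and area_h_m / scale <= screen_h_m:
            return scale
    return candidates[-1]

# A 2 km x 1.5 km golfing place on a 12 cm x 9 cm screen:
scale = best_scale(2_000, 1_500)
```

A golfing place and a whole prefecture thus receive different stored scales, each just large enough to show the entire area.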

Abstract

A reaction area in a display can be enlarged from one directly above a switch display in the operator's eyesight direction. Specifically, the operator's position is determined from the driver's position or the like, and when the display is looked at to the right and upward, for instance, the reaction area is enlarged to the right and downward of a switch on the front side in the eyesight direction. Thus, operability can be improved in both cases when the display is looked at obliquely and directly from the front.

Description

This application is a continuation of U.S. application Ser. No. 08/483,176, filed Jun. 7, 1995, now abandoned.
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to a display touch type information input system for inputting operating information by detecting the touching of a switch displayed on a screen.
2. Prior Art
Heretofore, radio sets, audio sets, air conditioners and many other accessory units which have no direct bearing on the running of vehicles have been mounted in vehicles, and there is a trend for an increasing number of these accessory units. Among these accessory units are navigation units, and vehicles with navigation units mounted therein are increasing in number.
The navigation unit serves to assist driving by displaying the present location on a map, and there are many navigation units which undertake route guidance by displaying the route to a destination when the destination is input.
Such a navigation unit requires a display for the map display. In addition, various operations are necessary for the input of the destination and other purposes. However, the space available within a vehicle is limited, and it is impossible to provide exclusive switches for respective necessary operations. Usually, therefore, the display surface is utilized as a touch switch panel, that is, it is utilized as a switch. Also, there are many cases in which the same display is utilized as a switch for operating an air conditioner, an audio set, etc., thus dispensing with some of the dedicated switches.
There are many different types of touch switch panel, such as an electrostatic capacitance type, an optical type, etc. In many cases, however, irrespective of the type of touch switch panel used, the touch detection area is slightly above the actual display surface.
For example, in the electrostatic capacitance type, use is made of deformation of a film provided on the display surface, and in the optical type, blocking of a light beam provided along the display surface is detected. Therefore, the actual detection area is above the display surface. This leads to a problem that an error is produced between a switch displayed on the display and the actual detection position.
Particularly, in the optical type an acrylic acid resin sheet for screen protection and an LCF (light control film) for suppressing reflection of light by the screen surface are provided between the switch display on the screen and a detection light beam. The thickness thus provided constitutes a corresponding distance between switch display and reaction point. In the optical type, therefore, the problem noted above is particularly pronounced.
Meanwhile, the display that is provided in the vehicle can not be disposed directly in front of the driver's seat. That is, it has to be disposed at an intermediate position between the driver seat and the passenger seat. Inevitably, therefore, the driver's eyesight is directed obliquely with respect to the display. This leads to an error between the switch display on the display and the touch reaction point directly above it (shown shaded), as shown in FIG. 1, and it is impossible to obtain correct detection of the driver's point of touch.
Japanese Utility Model Publication No. 121641/1987 shows precluding the deviation between display switch and reaction area due to a deviation of the driver's eyesight direction by shifting a switch display position (i.e., the whole display on the screen) according to the operator's position. Particularly, in a system disclosed in this publication seating detection switches are provided on the driver and passenger seats, and the switch display position is shifted by specifying the operator from the status of the seating detection switches. More specifically, when the driver alone is present, the display position is shifted to the left, while when both the driver and a passenger are present it is held at the center.
In this prior art example, however, sufficient improvement is not made because what is done is merely to shift the display itself. For example, there is such a problem that when the system is operated with the driver's face brought to the front right of the display, sufficient reaction can not be obtained due to a display deviation.
Particularly, while the touch of the touch switch panel is detected with respect to a plurality of spaced-apart reaction points, the reaction points can not be arranged very closely. Therefore, where a plurality of switches are displayed, the switch display may be comparatively small, and the number of reaction points of the switch display may be considerably small. By way of example, where the area defined by reaction points and the shape of the switch display are in accord as shown in FIG. 2, the effect of the eyesight deviation is not so great. However, there is a case when the switch display and reaction points deviate as shown in FIG. 3. In this example, while switches a and b are equal in size, the switch a is constituted by two reaction points, while the switch b is constituted by four reaction points. In such a case, if the eyesight direction is oblique with respect to the switch a, there is a high possibility that reaction points which do not belong to the switch a are operated by the driver's finger. If this is the case, sufficient reaction can not be obtained.
SUMMARY OF THE INVENTION
An object of the invention is to provide a screen touch type information input system, which can realize sufficient improvement in operability.
In the screen touch type information input system according to the invention, a reaction area corresponding to a display switch is enlarged in consideration of the operator's eyesight direction. It is thus possible to increase the accuracy of detection of switch operation made by looking at the display obliquely, while maintaining the operability when the display is viewed directly from the front.
Further, it is possible to attain sufficient reaction area enlargement with respect to the eyesight direction, in a range free from erroneous operation, by determining the extent of enlargement in consideration of the distance between adjacent switch displays.
In a further aspect, in a screen display in a vehicle, there are switches which can not be operated during driving of the vehicle. When such a switch is present as an adjacent switch, the reaction area is enlarged to a somewhat greater extent in consideration of this fact. Thus, it is possible to improve the switch operation recognition factor.
Further, reaction area enlargement in the case of the presence of a switch which can not be operated during driving of the vehicle as an adjacent switch is made even when no consideration is given to the eyesight deviation. Again in this case, the switch operation recognition factor can be improved.
Further, where a switch operable in the parked state of a vehicle is always found on the eyesight direction side of an operated switch, the reaction area is effectively enlarged in the eyesight direction.
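The enlargement rules above might be sketched as follows, treating a reaction area as a rectangle in reaction-point coordinates. The direction convention, the margin value and the clamping against the neighbouring gap are illustrative assumptions, not the patent's exact method.

```python
def enlarge_reaction_area(area, look_right, look_up, margin, gap_to_neighbour):
    # area: (x0, y0, x1, y1) rectangle directly above the switch display,
    # in reaction-point units (y grows downward on the screen).
    # Per the abstract, looking at the display to the right and upward
    # shifts the touch, so the area is enlarged to the right and downward.
    # The enlargement is clamped to the free gap before the adjacent switch
    # so that an enlarged area never overlaps a neighbouring reaction area.
    x0, y0, x1, y1 = area
    grow = min(margin, gap_to_neighbour)
    if look_right:
        x1 += grow
    else:
        x0 -= grow
    if look_up:
        y1 += grow
    else:
        y0 -= grow
    return (x0, y0, x1, y1)

# Driver looks right and upward; a margin of 2 points is wanted, but only
# 1 point of free space exists before the next switch, so growth is 1:
area = enlarge_reaction_area((2, 2, 4, 4), True, True, 2, 1)
```

A larger gap could be assumed when the adjacent switch is one that can not be operated during driving, matching the further aspects above.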
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a view showing the relationship between eyesight direction and reaction points;
FIG. 2 is a view showing an example of the relationship between switch display and reaction points;
FIG. 3 is a view showing different examples of relationships between switch display and reaction points;
FIG. 4 is a block diagram showing an embodiment of the invention;
FIG. 5 is a view showing a touch panel;
FIG. 6 is a view showing an array of reaction points in a touch panel;
FIG. 7 is a flow chart illustrating the operation of the embodiment;
FIG. 8 is a view showing an example of enlarging a normal reaction area;
FIG. 9 is a view showing an example of enlarging a reaction area with consideration of inter-switch interval;
FIG. 10 is a view showing an example of enlarging an area in the case when the use of an adjacent switch is prohibited;
FIG. 11 is a flow chart illustrating operation in the case when there is a switch, the use of which is prohibited;
FIG. 12 is a view showing an example of display in the case when there is a switch, the use of which is prohibited;
FIG. 13 is a schematic showing the entire system structure;
FIG. 14 is a flow chart illustrating the operation of the embodiment;
FIG. 15 is a view showing a storage state of a memory;
FIG. 16 is a view showing a destination setting method choice display;
FIG. 17 is a view showing a facility name choice display;
FIG. 18 is a view showing a prefectural district list display;
FIG. 19 is a view showing a facility list display;
FIG. 20 is a view showing a facility neighborhood map display;
FIG. 21 is a view showing a position change display;
FIG. 22 is a view showing a search condition confirmation display;
FIG. 23 is a view showing a passing point setting display;
FIG. 24 is a view showing another search condition confirmation display;
FIG. 25 is a view showing a route search state display;
FIG. 26 is a view showing a whole route guide display;
FIG. 27 is a view showing a route guide state display;
FIG. 28 is a view showing a guide display in the case of departure from the route;
FIG. 29 is a view showing an enlarged-scale intersection neighborhood display;
FIG. 30 is a view showing a guide road list display;
FIG. 31 is a view showing a route change display;
FIG. 32 is a view showing a route correction display; and
FIG. 33 is a view showing a destination arrival guide display.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Now, an embodiment of the invention will be described with reference to the drawings. The overall structure of the embodiment is shown in the block diagram of FIG. 4.
As shown, the embodiment of the system comprises a touch panel 10, which includes an LCD display 10a and an optical touch panel 10b provided on the surface of the LCD display 10a. The LCD display 10a is connected via a driver 20 to a display ECU (electronic control unit) 22. The display ECU impresses a predetermined voltage via the driver 20 on liquid crystal of the LCD display 10a at a desired position thereof for display. The touch panel 10b is also connected to the display ECU 22, and it detects touch to some of the reaction points arranged in a matrix array.
A main ECU 30 is connected to the display ECU 22. According to a signal from the main ECU 30, the display ECU 22 determines the contents of display on the LCD display 10a. In this embodiment, the main ECU 30 feeds an RGB signal (i.e., color television signal) for the display on the LCD display 10a, and according to this signal the display ECU 22 controls the voltage output of the driver 20 to obtain a desired color display. Meanwhile, on-off information (i.e., touch information) about the individual reaction points on the touch panel 10b is directly transmitted to the main ECU 30. The main ECU 30 detects an on-off state of the display switch from the display on the LCD display 10a and reaction point on-off information.
In this embodiment, the display and touch detection are made on the display ECU. However, it is possible that such processes are executed by the main ECU 30. As a further alternative, a separate ECU may be provided, which executes only the processes of touch detection and judgment. As a yet further alternative, a logic circuit for touch judgment may be constructed by hardware.
To the main ECU 30 are connected a GPS (global positioning system) 32 for detecting the absolute position (i.e., latitude, longitude and height) of the vehicle by receiving radio waves from an artificial satellite, a CD-ROM 34 for reproducing an optical disk with map information stored therein, a vehicle speed sensor 36 for detecting the running speed of the vehicle, a geo-magnetic sensor 38 for detecting the bearing of the vehicle, and a parking switch 40 for detecting the parking state of the vehicle.
The main ECU 30 thus feeds signals concerning the display by determining the present position from the absolute position detected by the GPS 32 and the covered distance obtained from the vehicle speed and geo-magnetism sensors 36 and 38, etc., and determining the content of display on the LCD display 10a from a map obtained from the CD-ROM 34, etc. Further, when displaying switches to be operated, the main ECU 30 determines the relationship between the display switch and reaction points on the touch panel 10b. Further, a TV tuner is connected to the main ECU 30. It is thus possible to produce a TV display on the LCD display 10a. Further, it is possible to operate an air conditioner, etc. with the touch panel 10.
The optical touch panel 10b in this embodiment is shown in FIG. 5. It has light-emitting and light-receiving element rows disposed around the LCD display 10a. More specifically, along the right and lower sides of the LCD display 10a a plurality of infrared LEDs (light-emitting diodes) 12 are provided in a row. Along the left and upper sides a plurality of photo-transistors 14 are provided in a row. Infrared radiation from the light-emitting diodes 12 is thus received by the respectively opposing photo-transistors 14.
By touching the LCD display 10a with a finger, certain light beams are blocked by the finger. Two-dimensional coordinates of the finger are thus specified from the positions of the photo-transistors 14 which can no longer receive light.
Thus, the light-emitting diodes 12 and light-receiving transistors 14 form a matrix of a corresponding number of reaction points as shown in FIG. 6. In the example of FIG. 6, 13 by 13=169 reaction points are formed.
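The mapping from a blocked beam pair to a reaction point No. can be sketched as follows. This is a minimal illustration, assuming the 13 by 13 grid and 1-based numbering of the FIG. 6 example; the function name and the row-major numbering order are illustrative assumptions, not taken from the patent.

```python
GRID = 13  # 13 columns x 13 rows = 169 reaction points (FIG. 6 example)

def reaction_point_no(blocked_col, blocked_row):
    """Return the 1-based reaction point No. (1..169) for a touch that
    blocks the beams at column `blocked_col` and row `blocked_row`
    (both 0-based), assuming row-major numbering."""
    return blocked_row * GRID + blocked_col + 1
```

Blocking the first column and first row beams would thus identify reaction point No. 1, and blocking the last pair would identify No. 169.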
The main ECU 30 sets the reaction area of a switch displayed on the LCD display 10a in correspondence to the shape of the switch. It detects the touch to the reaction points in the reaction area to determine the on-off state of the switch.
The detection of the on-off state of switch will now be described with reference to the flow chart of FIG. 7. In this example, the area directly above the display of the switch is set directly as a reaction area, and the reaction area is enlarged in the judging process.
First, when the touch panel 10b is touched in the presence of the switch display (step S11), the Nos. (1 to 169) of the touched reaction points are detected by the display ECU 22 and taken into the main ECU 30 (step S12).
Then, the main ECU 30 checks whether there is any meaningless reaction point (i.e., a reaction point not corresponding to any reaction area) (step S13). If no meaningless reaction point is found, it is determined that a switch corresponding to the meaningful reaction points (i.e., reaction points corresponding to a reaction area) has been operated (step S14).
There is always at least one meaningless reaction point between two adjacent switches. Usually, therefore, reaction points corresponding to two different switches cannot be touched without a meaningless reaction point also being touched. The converse state may nevertheless occur due to such a cause as touching the touch panel 10b with two fingers. Such a state may be regarded to be an abnormal switch operation or to be an absence of switch operation.
If a meaningless reaction point is found in the step S13, a check is made as to whether there is any reaction point in a reaction area corresponding to a switch (step S15).
If no such reaction point is detected, it is determined that the switch of a reaction area located at the reaction point whose No. is smaller by one than that of the touched meaningless reaction point (i.e., the reaction point on the left side of the touched meaningless reaction point) has been touched (step S16). The left side reaction point is selected on the assumption that the operator is the driver seated in the right side seat. If it is known that preference is given to the left side seat, the switch at the reaction point of the No. larger by one (i.e., the right side reaction point) may be determined instead. With a left end reaction point, this process is null because the reaction point with a No. smaller by one is at the right end of the preceding row. In many cases of usual operation, there is only a single switch located on the left side of a meaningless reaction point. When there are two or more such switches, the numbers of reaction points corresponding to these switches may be referred to for verification.
If it is found in the step S15 that there is a reaction point in a reaction area corresponding to a switch, the numbers of such reaction points and meaningless reaction points are compared (step S17). If the number of reaction points in the reaction area corresponding to the switch is greater, it is determined that the switch corresponding to these reaction points has been operated (step S18).
If it is found in the step S17 that the number of meaningless reaction points is greater, a check is made of the position relationship between the meaningless reaction points and the reaction area corresponding to the switch (step S19). According to the result of this check, switch operation is determined as follows. If the meaningless reaction points are located under a reaction area corresponding to a switch, it is determined that a switch immediately over the meaningless reaction points has been operated (step S20). If the meaningless reaction points are located over the reaction area corresponding to a switch, it is determined that a switch immediately under the meaningless reaction points has been operated (step S21). If the meaningless reaction points are located on the right side of the reaction area corresponding to a switch, it is determined that a switch on the immediate left side of the meaningless reaction points has been operated (step S22). If the meaningless reaction points are located on the left side of the reaction area corresponding to a switch, it is determined that a switch on the immediate right side of the meaningless reaction points has been operated (step S23). The result of the check of the position relationship in the step S19 may be obtained from a map or by operational processing of arithmetic formulas.
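The judging flow of steps S13 through S23 can be sketched as follows. This is a simplified illustration under stated assumptions: `touched` is a list of touched reaction point Nos., `area_of` maps a reaction point No. to its switch name (absent for a meaningless point), and the position-relationship resolution of steps S19 to S23 is reduced to a placeholder. All names are illustrative, not from the patent.

```python
def judge_switch(touched, area_of):
    """Sketch of the FIG. 7 judging flow (steps S13-S23).

    `touched`  -- list of touched reaction point Nos.
    `area_of`  -- dict mapping a reaction point No. to its switch name;
                  points absent from the dict are "meaningless".
    """
    meaningless = [p for p in touched if area_of.get(p) is None]
    meaningful = [p for p in touched if area_of.get(p) is not None]
    if not meaningless:                      # step S13 -> S14
        return area_of[meaningful[0]]
    if not meaningful:                       # step S15 -> S16:
        # assume a right-seat driver; take the point one No. smaller (left)
        return area_of.get(meaningless[0] - 1)
    if len(meaningful) > len(meaningless):   # step S17 -> S18
        return area_of[meaningful[0]]
    # Steps S19-S23 would resolve the position relationship between the
    # meaningless points and the reaction area here; as a placeholder,
    # fall back to the switch of the nearest meaningful point.
    return area_of[meaningful[0]]
```

For instance, with switch "A" at point 1, switch "B" at point 3, and point 2 meaningless, a touch at point 2 alone would be judged as an operation of "A" under the right-seat-driver assumption.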
In the above way, the touch to a meaningless reaction point may be regarded to be the touch to a nearby reaction point. In the above flow, initially an area directly over a switch display was set directly as a reaction area, and the reaction area was changed when and only when the touch to a predetermined meaningless reaction point was detected. In this way, various reaction areas can be comparatively readily set. In addition, it is comparatively readily possible to set a reaction area afresh.
Alternatively, since the main ECU 30 has internally memorized the display switch positions, it is possible to set a reaction area to an enlarged scale in advance. Further, it is possible to further enlarge a reaction area in the manner shown in FIG. 7, as described above, when the touch to a meaningless reaction point is detected in a check executed after the setting of the enlarged reaction area.
In the case of vehicles, the operator is usually the driver. Thus, the driver should be assumed to be the operator unless otherwise specified. In this case, it is effective to enlarge a reaction area not in all directions but in a particular direction only. In this case, reaction areas may be allotted to displayed switches as shown in FIGS. 8 to 10.
When the eyesight is directed from the right and downward with respect to the switch arrangement displayed on the LCD display 10a, as shown in FIG. 8, a reaction area is set which includes the entire top of the display switch and is enlarged to the right and downward. The touch of a reaction point in this reaction area is judged to be a switch operation. With such enlargement of the reaction area it is possible to preclude the error between the reaction points and the switch arrangement display when viewed from the operator, and the switch operation can be effectively detected. In addition, since the reaction area is merely enlarged, correct detection can also be obtained when the switch operation is made with the operator's face brought to the front of the screen.
When the display switch interval is comparatively large as shown in FIG. 9, the enlarged reaction area is made greater according to the interval. This is done because the possibility of erroneous detection due to reaction area enlargement is low. By determining the extent of reaction area enlargement according to the display switch interval, the switch operation can be detected more effectively.
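The directional enlargement of FIGS. 8 and 9 can be sketched as follows. This is a minimal illustration under assumptions of my own: the reaction area is a rectangle in reaction-point units, the enlargement amount scales with the display switch interval, and the growth directions follow the text (right and downward for a right-seat driver). The function name and the scaling rule are illustrative, not from the patent.

```python
def enlarge_area(area, interval, operator="driver"):
    """Enlarge a rectangular reaction area toward the operator's line
    of sight.

    `area`     -- (left, top, right, bottom) in reaction-point units
    `interval` -- display switch interval in reaction-point units;
                  a wider interval permits more enlargement (FIG. 9)
    `operator` -- "driver" (right seat, eyesight from right/below) or
                  "passenger" (left seat, eyesight from left/below)
    """
    left, top, right, bottom = area
    grow = max(1, interval // 2)   # assumed rule, not from the patent
    if operator == "driver":
        return (left, top, right + grow, bottom + grow)
    return (left - grow, top, right, bottom + grow)
```

With a narrow interval the area grows by one reaction point toward the driver; with a wider interval the enlargement increases accordingly, since the risk of overlapping an adjacent switch is lower.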
In a navigation display or the like, switches which can not be operated during driving are also displayed. In a specific example, the setting of a destination or the like is made according to the input of the kind of destination (such as department store, golfing place, station, etc.), address, telephone No., etc. Such an operation is comparatively complicated and is executed by watching the display. During driving of the vehicle, such a switch is displayed in thin form (as shown shaded in the Figures) so that it can not be operated. If such a switch is operated, the operation is made null, while displaying a message that "the switch can not be operated during driving of the vehicle, so please use the switch after parking." Thus, with such a display the possibility of operating a switch which can not be operated is low.
Accordingly, in this embodiment the reaction area is further enlarged as shown in FIG. 10, so that the enlarged reaction area of the switch covers up to reaction points adjacent to a switch which can not be operated during driving of the vehicle.
More specifically, where two adjacent switches are operable, it is necessary to avoid erroneous judgment by setting at least one line of reaction points between the two switches, which reaction points belong to neither of the switches and are hence meaningless. However, if the adjacent switch is not operable, the switch operation recognition factor can be increased by allotting all the reaction points between the two switches to the reaction area of the operable switch.
Now, operation during driving of the vehicle will be described with reference to FIG. 11. When a destination display is produced by choosing a destination setting during driving, any inoperable switch is toned down (step S31). Specifically, of the display switches as shown in FIG. 12, the "TELEPHONE NO." switch for setting the destination from a telephone No., the "ADDRESS" switch for setting a destination from an address, facility name switches for setting a destination from a facility name, the "MEMORIZED POINT" switch for setting a destination among memorized points, and a "PREVIOUS DEPARTURE POINT" switch are toned down. Thus, only the "HOME" switch is held operable without toning it down. (In the Figure, only the "HOME" switch is shown shaded.) This is done because the destination can be set by merely touching the "HOME" switch, the operation being thus simple and capable of execution without any trouble during driving. Afterwards, the area covering the reaction points corresponding to the operable switch is enlarged (step S32).
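Step S31 can be sketched as a simple operability map over the FIG. 12 keys. The dict layout and function name are illustrative assumptions; only the key names and the rule that "HOME" alone remains operable while driving come from the text.

```python
def destination_keys_while_driving(keys):
    """Sketch of step S31 (FIG. 11): while the vehicle is running,
    every destination-setting key except "HOME" is toned down.
    Returns a mapping of key name -> operable (True) or toned down (False).
    """
    return {k: (k == "HOME") for k in keys}

# Key names follow FIG. 12.
KEYS = ["TELEPHONE NO.", "ADDRESS", "MEMORIZED POINT",
        "PREVIOUS DEPARTURE POINT", "HOME"]
```

Step S32 would then enlarge the reaction area of each key left operable, e.g. in the manner of the enlargement described for FIGS. 8 to 10.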
The running state of the vehicle may be detected with reference to zero vehicle speed. In the case of an automatic vehicle, the parking position of a shift lever may be detected using a parking position switch 40. Further, the parking state of the vehicle may be detected through detection of an "on" state of the side brake. Further, the routine shown in FIG. 11 is called by an interrupt when the running state of the vehicle is detected.
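The parking-state cues above can be combined as in the following sketch. Treating the three cues as alternative detection methods (any one sufficing) is my reading of the text, and the function and parameter names are illustrative assumptions.

```python
def vehicle_is_parked(speed_kmh, shift_in_park, side_brake_on):
    """Sketch of the parking-state detection alternatives named in the
    text: zero vehicle speed, the shift lever's parking position
    (automatic vehicles, via the parking position switch 40), or an
    "on" state of the side brake. Any one cue suffices (assumption)."""
    return speed_kmh == 0 or shift_in_park or side_brake_on
```

An implementation would typically gate the switch toning-down routine of FIG. 11 on the negation of such a check, calling it by an interrupt when the running state is detected.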
In this embodiment, it is a presumption that the display itself is fixed. However, among various available displays are those which are capable of angle adjustment. That is, there are displays which can be directed toward the driver's seat in some cases and toward the passenger seat in other cases. With such a display, whether the operator is the driver or the passenger can be determined from the display installation angle. It is thus possible to provide a display position detector (which may detect rotation of the display as the display is usually rotated about its center line) and determine the reaction area enlargement direction by specifying the operator according to the result of detection.
The method of the operator's eyesight direction estimation is not limited to the above display angle adjustment, and the same effects are obtainable with other methods.
A conceivable different method, for instance, is such that a "PASSENGER SEAT OPERATION" key is displayed, and that if this key is touched, it is determined that the operator's eyesight direction is from the passenger seat.
(Description of another structural embodiment)
Now, another structural embodiment of the invention applied to a navigation system will be described. This structural example concerns a map call/display system for a navigation system, which can call and display maps concerning specified items of call.
Usually, driving is carried out to various destinations. When utilizing a navigation system, it is accordingly necessary to input and set information about the destination. It is possible to memorize a place which recurrently constitutes the destination, such as home, place of work, etc. As for general destinations, however, it is necessary to set information whenever a specific destination is chosen. In the usual navigation system, a map of the neighborhood of a destination is called and displayed for determining the final destination when information of a place as an indication for destination setting is input, or for confirming the destination when information of the final destination is input. For other purposes than the destination setting, it is possible to call and display desired maps by inputting place-specifying information.
A map may be called by specifying a facility name such as a golfing place name, a station name, etc., specifying a telephone No. or an address, or specifying a memorized place as noted before. Prior art navigation systems can call maps according to such input. As an example, when the home of an acquaintance which is located behind a certain golfing place is chosen as a destination, first the name of the golfing place is input as a facility name. Then, a map of the neighborhood of the golfing place is called and displayed. On the displayed map, the final destination is set, and a route search is made.
When setting a destination according to a telephone No. or an address, a pertinent map may be called and displayed by inputting comparatively rough information, such as a city name, an urban telephone exchange name, etc., and the final destination may be set on the displayed map.
However, a map which is called in the above way according to the input information may sometimes be of a scale which is not suited to the setting of the final destination. For example, the same scale of display is not suitable when calling a private shop having a small installation area and when calling a golfing place having a vast area. In other words, if the map showing the golfing place is called with a scale that permits adequate display of the private shop, the called map can not be contained within the screen.
According to Japanese Patent Laid-open Publication No. 94132/1993, if the called map can not be contained within the screen, this is displayed, and the direction in which the locality that can not be displayed is present is displayed with an arrow. In this case, it is made possible to scroll the screen in the direction of the arrow. Thus, when the called subject is not satisfactorily displayed, it is possible to permit observation of the whole subject. It is thus possible to permit the setting of destination or the like by utilizing the called map.
In the above prior art example, however, it is necessary to scroll the display in order to observe the whole subject called. Therefore, the operability is inferior. For example, when setting a destination, scrolling of the display is needed to obtain information of nearby roads and so forth. Further, when the destination is an acquaintance's house behind a golfing place as noted above, difficulty may be encountered in the search of the destination due to absence of the intended location in the display or lack of the display of the whole golfing place.
The instant structural example seeks to solve the above problems.
FIG. 13 is a schematic showing the entire structure of the example of a map call/display system. A display/touch system 110 as shown serves as map display means.
It includes an LCD (liquid crystal display) and a touch panel provided on the surface of the LCD. In addition to displaying various maps, it also has a role of detecting the touching of a displayed switch. An electro multi-television ECU 112 is connected to the display/touch panel 110. The electro multi-television ECU 112 has roles of providing displays and detecting the touching of the touch panel. A CD-ROM changer 114 is connected to the electro multi-television ECU 112. A map CD-ROM 114b is connected via magazine 114a to the CD-ROM changer 114. The map CD-ROM 114b serves as memory means for storing map data. The electro multi-television ECU 112 serves as map call means for obtaining desired map information from the CD-ROM changer 114.
A GPS receiver 120 is connected to the electro multi-television ECU 112. The GPS receiver 120 receives signals from an artificial satellite via a GPS antenna 120a to detect the absolute position (longitude and latitude) of the vehicle and supplies detection data to the electro multi-television ECU 112. The absolute position of the vehicle can thus be detected in the electro multi-television ECU 112. Various sensors 122 for detecting the running state of the vehicle are connected to the electro multi-television ECU 112. The running condition thus can be ascertained at all times. In this example, the sensors 122 are a geo-magnetism sensor 122a, a wheel sensor 122b, a steering sensor 122c and a distance sensor 122d. The electro multi-television ECU 112 can thus grasp the bearing, speed and steering angle of the vehicle and the distance covered thereby from the results of detection by the sensors 122. The electro multi-television ECU 112 always calculates the present position from the detection signals from the sensors 122, and makes higher accuracy detection of the present position by combining the result of detection from the GPS receiver 120 with the calculated present position.
The electro multi-television ECU 112 is further supplied with detection signals from various switches 124. In this example, signals from an accessory switch 124a, an ignition switch 124b, a parking brake switch 124c, an alternator 124d and a check terminal 124e are supplied to the electro multi-television ECU 112. Thus, the electro multi-television ECU 112 can ascertain running conditions of the vehicle, such as whether the ignition key is "on", whether the vehicle is parked with the parking brake operated, whether the electricity generation state is satisfactory, and whether various accessories mounted in the vehicle are normal.
Further, a loudspeaker 126 is connected to the electro multi-television ECU 112. A guidance voice for route guidance is output from the loudspeaker 126.
Further, in this system a TV tuner 130 is provided to permit a television signal received by a TV antenna 130a built into the window glass to be displayed on the screen of the display/touch panel 110. An air conditioner ECU 132 is further connected to the display/touch panel 110.
The air conditioner can thus be operated by a switch displayed on the display/touch panel 110.
Further, the system includes an audio CD changer 134 for reproducing a music CD 134b and a CD-ROM 134c. In the CD-ROM 134c, data for sight-seeing guides, etc. is stored.
Thus, video data can be supplied from the CD-ROM 134c to the electro multi-television ECU 112 to display sight-seeing guide displays on the display/touch panel 110. Further, voice data from a music CD 134b and CD-ROM 134c can be supplied through an audio amplifier 136 to loudspeakers 138 to output predetermined voices. Further, an audio head unit 140 is connected to the audio amplifier 136 to permit signals from an FM radio unit, an AM radio unit, etc. to be output to the loudspeakers 138.
Now, the above system will be described in connection with reading out a map for destination setting with reference to FIG. 14. In this case, information for the destination setting is first input by operating the display/touch panel 110. As an example, it is assumed that the destination is set according to an address. For example, information "Toyoda City, Akita Prefecture" is input, and it is determined to be the final choice item (step S101).
According to this input, the electro multi-television ECU 112 determines the center position of the input "Toyoda City" and the scale of the display from data stored in the map CD-ROM 114b (step S102). As the center position, the center position of the area (i.e., administrative area) of Toyoda City, the location of the administrative office of the city, etc. may be adopted. The scale of the display has been determined from the area and shape of the pertinent city (i.e., Toyoda City in this case), and this information has been stored. When displaying the area, a scale is required which permits maximum enlargement of the area without a non-display portion. This data is stored together with data of the item (i.e., Toyoda City in this case) in the CD-ROM 114b.
When the final choice item is determined in the step S101, the data has already been read out from the CD-ROM 114b into a memory of the electro multi-television ECU 112.
An example of the configuration of this data is shown in FIG. 15. As shown, the group name, name (i.e., kanji name in the case of a Japanese name), longitude and latitude of the destination and the scale of the display are stored. As the scale of display, one is stored which is best suited, in both the longitude and latitude directions, to the display of the area as described above. The group is the kind of item, such as golfing place, amusement park, etc. (corresponding to the facility name), and it represents the kind of call subject together with the name. In this way, the scale of display is stored in the CD-ROM as shown in FIG. 15. However, it is also possible to provide a ROM in the ECU 112 and store a table listing call subject kinds and display scales in one-to-one correspondence in the ROM. In this case, the data amount in the CD-ROM 114b may be reduced.
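The FIG. 15 record layout and the scale lookup can be sketched as follows. The field names, the sample entries, and the lookup function are illustrative assumptions; only the stored fields (group, name, longitude/latitude, display scale) come from the text.

```python
# Hypothetical records in the FIG. 15 layout; the entries are made up.
RECORDS = [
    {"group": "GOLFING PLACE", "name": "Example GC",
     "lon": 137.15, "lat": 35.08, "scale": 1 / 25000},
    {"group": "AMUSEMENT PARK", "name": "Example Park",
     "lon": 136.90, "lat": 35.17, "scale": 1 / 50000},
]

def best_scale(name):
    """Return the stored best-suited display scale for a named call
    subject, or None if the subject is not found."""
    for rec in RECORDS:
        if rec["name"] == name:
            return rec["scale"]
    return None
```

With the alternative table of call subject kinds to display scales held in a ROM of the ECU 112, the lookup would key on the group rather than the name, trading per-record storage in the CD-ROM 114b for a fixed table.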
Since the center position of display and the display scale are determined in the above way, the pertinent map data are read out (step S103). The read-out map data are displayed on the display/touch panel (step S104).
As shown, in this example the best suited display scale is determined according to the kind of final choice item as determined in the step S101. If the final choice item determined in the step S101 is not at the level of city, town or village but specifies the address, the display scale is determined under the assumption that the area of the pertinent address is to be displayed. If an urban telephone exchange No. or a golfing place name is specified, a display scale corresponding to the final choice item is read out. Thus, the map call is always made on the basis of the best display scale, and the best map display is carried out.
For example, if a golfing place is displayed on a map with a scale of 1/100,000, it is partly displayed on the screen. When such a display is provided, the operator wishes to search for a desired place, for instance a club house parking area, in the golfing place. Previously, if the club house parking area could be found correctly, the destination could be set correctly.
However, if the destination could not be found, it would be set to be a different place, for instance a parking area on the opposite side of the club house. Meanwhile, in this navigation system, when merely the name of the golfing place is specified, the destination is usually set to be the club house parking area which the general driver decides to be the destination. This means that by setting the destination directly on the display, correct destination setting can be obtained. In this example, the golfing place as a destination is set in an optimum size, i.e., a size which is not excessively large or small.
The destination setting can thus be completed at the level of intending to find the golfing place. In other words, the destination setting can be obtained without a change in the predetermined club house position. In this way, in this example correct destination setting can be obtained.
It will be seen that there is a correlation between the display scale and the operator's sense. When the operator wishes to obtain display of a map in a certain scale, he or she obtains information which is available from that scale. In other words, for a detailed map detailed information is sought, while for a broad area map only a very rough positional relationship is sought. Thus, by obtaining display of a map in a scale adequate for the size and shape of the subject as in this example, the operator can confirm only the presence of the pertinent facility or the like, and he or she can promptly carry out correct destination setting without the need to set greater details.
Now, the actual operation of the navigation system will be described with reference to display examples. It is assumed that the set destination is among "OTHER FACILITIES" and that a point is set which has been set in advance as a memorized point on the route to the destination.
(Setting of Destination)
FIG. 16 shows a display for choosing a method of destination setting. As shown, the display includes "TELEPHONE NO." and "ADDRESS" keys. It also includes "GOLFING PLACE" and "OTHER FACILITIES" keys for facility names (or groups). It further includes "HOME", "MEMORIZED POINT" and "PREVIOUS DEPARTURE POINT" keys for registered points. When this display is provided, a navigation message saying "You can call and set a map of destination neighborhood from telephone No. or facility name." is produced. In this example, the "OTHER FACILITIES" key is touched. As a result, a destination facility name choice display as shown in FIG. 17 is provided. At this time, a navigation message saying "Please choose destination facility name." is also produced. The operator thus chooses and touches a desired key among "AMUSEMENT PARK", "SKIING PLACE", etc. keys. As a result, a prefectural district list display as shown in FIG. 18 is provided. At this time, a navigation message saying "Please choose prefecture name corresponding to destination." is produced. In this example, the "AMUSEMENT PARK" key is touched as facility name.
In this display, the prefectural district containing the pertinent amusement park is determined by touching a corresponding prefectural district key. (In the United States of America, the prefectural districts would be comparatively broad areas corresponding to the states.) The first prefectural district display part in the display has a "COUNTRY" key.
By touching this key, a display is provided in which the facilities all over the country are listed, while producing a navigation message saying "Please choose destination name, and you can call a neighborhood map." On this list display, a desired one of the facilities all over the country can be chosen. When a prefectural district is chosen, a facility choice display as shown in FIG. 19 is provided, which shows a list of amusement parks in the chosen prefectural district. On this list display, a desired facility is chosen. As a result, a map of the neighborhood of the chosen facility as shown in FIG. 20 is displayed, while a navigation message is produced which says "Please touch "SET" key, and you can set destination." By touching the "SET" key, the destination is set. In this example, the best map scale for display is stored as shown in FIG. 15 in correspondence to the area and shape of the specified facility (i.e., amusement park in this case). Thus, the called map is in the best scale, and the whole amusement park and the neighborhood thereof are displayed on the screen. By touching the "SET" key on the display shown in FIG. 20, the destination is set as the amusement park. The actual destination that is set is a parking area which is closest to the front gate of the amusement park, which is usually set as a target.
Meanwhile, by touching the "POSITION CHANGE" key, eight radially spaced-apart arrow keys are displayed. At this time, a navigation message saying "Map is moved by touching an arrow, and destination can be set by touching "SET" key." is produced. When the operator touches, for instance, the upwardly directed arrow key, the destination can be moved to the north from the parking area at the front gate. Then, by touching the "SET" key at a desired point, the destination setting can be obtained. There may be cases when setting the destination from a telephone No. that there is no pertinent facility but there is only a pertinent urban telephone exchange. In such a case, the pertinent map can be displayed, and a "SEARCH NEIGHBORHOOD" key can also be displayed. By touching the "SEARCH NEIGHBORHOOD" key, such an item as the name of a town, intersection, etc. near the destination can be displayed. From this display, a pertinent neighborhood map can be searched for destination setting.
When the map is displayed, the destination setting is completed by touching the "SET" key. When the destination setting is completed, data of roads near the destination or the like are confirmed. If there is no problem, a search condition confirmation display is provided for search up to the destination. Meanwhile, if there is no nearby road or the like suited for navigation after the destination setting, such a display message as "No road suited for navigation is found, so please operate again after moving the destination point to the vicinity of a trunk road" is provided. Then, an arrow mark display like that shown in FIG. 21 is provided for re-setting of the destination.
(Setting of Search Conditions)
When the destination setting is completed in the above way, a search condition confirmation display as shown in FIG. 22 is provided. At this time, a navigation message saying "Touch "SEARCH START" key, and search of route up to destination will be started on condition that preference is given to toll road." is produced. On this display, whether or not there is a designated passing point is set. Also, setting as to whether or not preference is given to a toll road is made.
If preference is to be given to a toll road in the choice of road up to the destination, a "GIVE PREFERENCE" key is touched. Otherwise, a "GIVE NO PREFERENCE" key is touched. In this way, a choice is made as to whether or not preference is to be given to a toll road.
In addition, whether or not a passing point is to be designated on the route up to the destination is chosen by key touch. When a passing point is designated, a best route passing through that point before reaching the destination is chosen in the route search.
Specifically, by touching a "DESIGNATE" key, a passing point setting choice display as shown in FIG. 23 is provided. This display resembles that for the destination setting.
Thus, the passing point can be set in a manner similar to the destination setting as described above. When the passing point setting operation is ended, the "SET" key on the display is touched. Thus, the setting of the passing point is completed, and data of the vicinity of the passing point can be confirmed. Then, a search condition choice confirmation display as shown in FIG. 24 is provided. Here, route search is started by touching a "START SEARCH" key.
(Route Search)
When the "START SEARCH" key as shown in FIG. 24 is touched, a display as shown in FIG. 25 is provided for route search. When the route search is ended, the whole route from the present point to the destination is displayed on map. At the same time, the entire distance (in km) to be covered is displayed as shown in FIG. 26. At this time, it is informed by navigation message that navigation along a route passing through the designated passing point will be made. When it is found as a result of the search that the destination is near the present point, this is displayed, and it is instructed to drive the vehicle with reference to a map. When a point at which a route search can be made is reached, a "SEARCH" key is displayed. When this "SEARCH" key is touched, a navigation route search is made. Further, during the navigation route search a present position display can be provided by depressing a present position switch. It is thus possible to confirm the present position during the navigation. When the search is ended, the whole route and the entire distance to be covered are displayed by touching "DISPLAY ROUTE" key.
If the navigation route search could not be made, this is displayed, and a "CONFIRM" key is also displayed. Then, by touching the "CONFIRM" key, a present position display is provided. By touching the "SEARCH" key after departure and after running past the displayed present position, the navigation route search is started again. In other words, if there was no road suitable for navigation near the present position, this fact is displayed, and the driver is instructed to carry out a search again in the neighborhood of a trunk road. If the route search could not be made for any reason other than a problem with a road in the vicinity of the present position, merely the fact that the route search could not be made is displayed, and the driver is instructed to undertake the operation afresh.
(Route Guidance)
After the whole route and the entire distance (in km) to be covered have been displayed after the end of the route search, route navigation up to the destination is started by touching a "START NAVIGATION" key on the display or after 15 seconds of running. That is, a route navigation display as shown in FIG. 27 is provided at this time.
As shown, the display includes a present position mark shown at the center of the display. Also, a "DURING NAVIGATION" display is provided in an upper portion of the display.
Further, the vicinity of the passing point (only when the passing point has been set) and the distance to be covered up to the destination neighborhood are displayed in a right lower portion of the display.
If it is judged during running that the vehicle position is deviated from the navigation route, the "DURING NAVIGATION" display is removed, and a "RE-SEARCH" key is displayed in a lower portion of the display as shown in FIG. 28. By touching this "RE-SEARCH" key, the route search can be made afresh.
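The patent does not specify how deviation from the navigation route is judged. A minimal sketch, assuming (as one plausible implementation) that the vehicle is considered off-route when its perpendicular distance to every segment of the route polyline exceeds a threshold:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (2D panel/map coordinates)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping the parameter to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def is_off_route(position, route, threshold=50.0):
    """True when the vehicle is farther than `threshold` (e.g. meters)
    from every segment of the route; the threshold is an assumption."""
    return all(point_segment_distance(position, route[i], route[i + 1]) > threshold
               for i in range(len(route) - 1))
```

In such a scheme, `is_off_route` returning true would trigger the removal of the "DURING NAVIGATION" display and the appearance of the "RE-SEARCH" key.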
Further, by touching a "WHOLE ROUTE" key on the route navigation display, the whole navigation route from the present position to the destination is displayed. The display of the whole navigation route is similar to the display shown in FIG. 26. In this case, however, a "RESUME NAVIGATION" key is displayed in lieu of the "START NAVIGATION" key. By touching this "RESUME NAVIGATION" key, the usual navigation display is restored.
(Intersection Guidance)
When an intersection is approached during route navigation, it is detected, and an enlarged-scale display of the vicinity of the intersection is provided for navigation. More specifically, when an intersection is approached by the vehicle, this approach is detected, and an enlarged-scale intersection vicinity display as shown in FIG. 29 is provided. Also, the direction of proceeding along the navigation route is informed by a navigation message. That is, a navigation message saying, for instance, "Please turn to the left at so and so intersection about 300 meters ahead." is output. By touching the "DISPLAY ROUTE" key on the enlarged-scale intersection vicinity display, the usual route navigation display is provided. This display includes a "DISPLAY INTERSECTION" key. By touching this "DISPLAY INTERSECTION" key, the enlarged-scale intersection vicinity display can be restored. When it is detected that the navigated intersection has been passed, the usual route navigation display is restored.
(Route List)
It is possible to display the navigation route from the present position to the destination in the form of a list. Also, it is possible to display a map showing the vicinity of each point on the navigation route.
A display showing the whole navigation route from the present position to the destination has a "NAVIGATION ROAD LIST" key display. By touching this "NAVIGATION ROAD LIST" key, a display showing a navigation route list from the present position to the destination is provided. That is, a display showing the roads extending from the present position to the destination as shown in FIG. 30 is provided. In this display, the navigation route is divided at points of change in road type or at interchanges or junctions where roads are entered or left. Also, the distances between adjacent points noted above are shown as actual distances. If the route from the present position to the destination can not be shown on the same display, the display can be scrolled with "FORWARD" and "BACKWARD" keys (only the "FORWARD" key being shown in the Figure). In this way, a display of the whole route up to the destination can be obtained.
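The route list described above divides the route at points where the road type changes and shows the actual distance of each leg. A minimal sketch of such a segmentation, assuming a hypothetical link structure of `(road_name, road_type, length_km)` tuples (the patent does not specify the data format):

```python
def build_route_list(links):
    """Divide a searched route into list entries at points where the
    road type changes, accumulating the actual distance of each leg.
    `links` is a sequence of (road_name, road_type, length_km) tuples."""
    entries = []
    for name, road_type, length in links:
        if entries and entries[-1]["type"] == road_type:
            # Same road type as the previous link: extend the current leg.
            entries[-1]["distance_km"] += length
        else:
            # Road type changed (or first link): start a new list entry.
            entries.append({"name": name, "type": road_type, "distance_km": length})
    return entries
```

Scrolling with the "FORWARD" and "BACKWARD" keys would then simply page through the resulting `entries` list.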
Further, by touching a "SURROUNDING AREA" key which is provided outside the navigation route list display and on one side of each of the points noted above, a map centered on that point is displayed. This display has eight arrow keys corresponding to respective directions, and map movement can be made by using these keys.
(Route Re-search)
When the navigation route is deviated from the present position on the route navigation display, it is possible to carry out navigation route re-search and display a new navigation route.
More specifically, if it is judged that the position of the running vehicle has been deviated from the navigation route while the route navigation display is provided, as described before, the "DURING NAVIGATION" display is removed, and a "RE-SEARCH" key is displayed on a lower portion of the display. By touching this "RE-SEARCH" key, a new route in the vicinity of the present position is searched. When the re-search is ended, the new navigation route from the vicinity of the present position is displayed. If the re-search could not be made, the previous navigation route is displayed again. At this time, a navigation message saying "New route could not be found, so the previous route will be displayed." is output. Also, this content is displayed.
The re-search, which is instigated by touching the "RE-SEARCH" key, is a search of a route from the present position to a navigation route which has already been searched.
During the route search up to this navigation route, a "SEARCH WHOLE ROUTE" key is displayed. By touching this "SEARCH WHOLE ROUTE" key, an entire new route from the present position to the destination is searched. When the search is ended, the whole new navigation route is displayed. The display of this whole new navigation route is like the case when a route search is carried out by setting the destination. Further, when a passing point has been set at the time of the re-search, an "ERASE PASSING POINT" key is displayed simultaneously with the "SEARCH WHOLE ROUTE" key. By touching this "ERASE PASSING POINT" key, a search of a whole new route without any designated passing point is made. When the search of the whole new route after erasing of the passing point is ended, the whole new route without any designated passing point is displayed.
(Route Change)
After the end of the navigation route search, the navigation route or search condition may be changed. The whole route display (see FIG. 26) after the end of the navigation route search has a "CHANGE ROUTE" key display. By touching this "CHANGE ROUTE" key, a route change choice display as shown in FIG. 31 is provided.
This display has a "RE-SEARCH TO SET DIFFERENT ROUTE" key. By touching this key, a whole new navigation route is searched. When the search is ended, the whole new navigation route is displayed. When the result of the search is the same as before the navigation route change, the navigation route before the navigation route change is restored.
The route change choice display also has a "CHANGE SEARCH CONDITION" key. By touching this key, a search condition confirmation display is provided. In this case, it is possible to change the setting of any designated passing point and also to change the setting as to whether or not preference is to be given to toll roads. Further, it is possible to change the setting so as to make a further change of the navigation route.
In the presence of the route navigation display, it is possible to correct the navigation route with a change in the present position and also to correct the navigation route from toll road to general road or from general road to toll road. More specifically, by touching a "CORRECT" key on the route navigation display, a route correction display is provided.
This display, as shown in FIG. 32, has eight direction arrow keys for correcting the present position. By touching these keys the present position is changed to a new position desired on a separate route. Then a "SET" key is touched. As a result, the present position is changed, and a new navigation route from the changed position is displayed.
When a toll road is set as part of the navigation route along which the vehicle is to run, a "CORRECT ROUTE TO GENERAL ROAD" key is displayed on the route correction display. This key is touched if it is desired to use an ordinary road instead of the toll road. As a result, the route is corrected to a new one using ordinary roads only. If ordinary roads only are set as the navigation route along which the vehicle is to run, a "CORRECT ROUTE TO TOLL ROAD" key is displayed on the route correction display. This key is touched if it is desired to use toll roads. As a result, the route is corrected to a new one using toll roads.
(Navigation concerning Arrival at Designated Passing Point)
When the vicinity of a designated passing point is approached in the presence of a route navigation display with the designated passing point, a navigation message informs that the present position is near the designated passing point. That is, a navigation message saying "Now, you are near the designated passing point." is output. When the passing point has been passed, a message saying "Navigation is switched to one up to destination neighborhood." is output. Also, a display of this navigation content is provided. The navigation is then switched to the one up to the destination neighborhood.
(Navigation concerning Arrival at Destination)
When the destination neighborhood is approached, an ending navigation message is provided. More specifically, when the destination that has been set is approached by the vehicle, a message saying "You are in the destination neighborhood, and this is an ending navigation message." is output, thus bringing an end to the navigation.
At this time, if the destination can not be displayed on the displayed map, a message that "You can confirm destination on a wide area map." is displayed on the display. When this message is displayed, the destination can be confirmed with a wide area map displayed by a map area expanding operation. A display on which arrival at the destination is determined is shown in FIG. 33. When the destination is not shown in the display, a message that it is desired to confirm the destination on an expanded area map is displayed.
When a ferry stop is detected, arrival at the ferry stop is informed by a navigation message, and then the navigation message is interrupted.
(Other Functions)
(Map Movement)
By touching a position on a displayed map, the vicinity of that position is moved to the center of the display. Thus, a desired portion of the map can be brought to the center of the display. When it is desired to display a map area adjacent to the current display, the edge of the display is touched on the desired side, and the adjacent map area is then displayed.
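Recentering the map on a touched position amounts to shifting the map center by the touch point's offset from the display center, converted into map units. A minimal sketch; the coordinate conventions and the `scale` parameter (map units per pixel) are assumptions for illustration:

```python
def recenter_map(center, touch, display_size, scale):
    """Return a new map center so that the touched position moves to the
    middle of the display. `center` is the map coordinate shown at the
    display center, `touch` is the touched pixel, `display_size` is
    (width, height) in pixels, and `scale` is map units per pixel."""
    cx, cy = center
    tx, ty = touch
    w, h = display_size
    # Offset of the touch from the display center, converted to map units.
    return (cx + (tx - w / 2) * scale, cy + (ty - h / 2) * scale)
```

Touching an edge of the display is the limiting case: the offset is at its maximum, so roughly half a screen of adjacent map is brought into view.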
(Map inversion)
A displayed map can be inverted by touching a bearing display on it. That is, a map displayed with the north shown upward is inverted to one with the north shown downward.
(Display Switching during Running)
A map display which is too detailed cannot easily be read during running. Accordingly, during running a substitute display showing main roads only is provided by automatic switching. When the vehicle is parked, the detailed map display is restored.
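The automatic switching above can be sketched as a simple selection on the detected running condition. The speed threshold and the level names are illustrative assumptions, not values from the patent:

```python
def select_map_detail(speed_kmh, parked, moving_threshold=5.0):
    """Choose the map detail level: the full detailed map when the vehicle
    is parked (or nearly stopped), a simplified main-roads-only map while
    running. `moving_threshold` (km/h) is an assumed cutoff."""
    if parked or speed_kmh < moving_threshold:
        return "detailed"
    return "main_roads_only"
```

The running condition detecting means described in the claims would supply `speed_kmh` and `parked` here.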
(Features of this Structural Example)
This example is a map call-out system for a navigation system for calling out and displaying maps concerning call subjects specified according to input information. It comprises memory means for storing map data, map call-out means for calling out map data about a call subject from the memory means, and map display means for displaying the called map. The map call-out means calls out map data of a scale suited to the display according to the kind of map call-out subject, and the display means displays the call-out subject map in a scale corresponding to the subject.
Thus, in this example of a map call-out system, suited map scales are stored in the memory means in correspondence to the kinds of call-out subjects. When displaying a map corresponding to a destination in destination setting or the like, the map can therefore be displayed in a scale suited to the subject. For example, in the case of a golfing place, a scale which permits display of the whole golfing place is set. In the case of a private house, a scale which permits sufficient specification of the private house is set. It is thus possible to avoid situations in which correct destination setting becomes impossible due to unnecessary operations in the map display for destination setting or similar purposes.
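The per-subject scale lookup described above can be sketched as a table from subject kind to display scale. The table entries and scale values here are illustrative assumptions; the patent only states that a suited scale is stored per kind of call-out subject:

```python
# Hypothetical scale table: wider subjects (a whole golfing place) get a
# smaller scale, pinpoint subjects (a private house) get a larger one.
SCALE_BY_SUBJECT_KIND = {
    "golf_course": 1 / 50_000,   # wide enough to show the whole grounds
    "private_house": 1 / 5_000,  # close enough to specify one house
    "station": 1 / 25_000,
}
DEFAULT_SCALE = 1 / 25_000

def call_out_scale(subject_kind):
    """Return the display scale suited to the kind of call-out subject,
    falling back to a default for unknown kinds."""
    return SCALE_BY_SUBJECT_KIND.get(subject_kind, DEFAULT_SCALE)
```

The map call-out means would consult such a table before fetching map data from the memory means, so the first map displayed is already at a usable scale.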

Claims (12)

What is claimed is:
1. A display touch type input system for inputting information by detecting touching of a switch part of a display, comprising:
a display means for displaying a switch image in a switch display area;
a touch panel provided above the display means and having a number of reaction points, the touching of the reaction points by the operator being detected; and
switch operation detecting means for setting a reaction area covering certain ones of the reaction points, the touching of the reaction area being judged to be the touching of a corresponding switch displayed on the display means;
the switch operation detecting means being configured to enlarge the reaction area in a direction towards the operator based on a distance between adjacent switches.
2. The system according to claim 1, wherein:
the switch operation detecting means includes a computer unit for performing arithmetic operations and detecting the touching of a switch from an instruction of the display of the switch on the display means and the result of detection of the touching of the touch panel.
3. The system according to claim 2, wherein:
the computer unit compares a reaction area corresponding to a switch display area and a touched reaction point and judges the switch that has been touched by enlarging the reaction area when it is found as the result of comparison that a meaningless reaction point outside the reaction area has been touched.
4. The system of claim 1, wherein the display means is mounted in a vehicle including a driver's seat and the reaction area is enlarged toward the driver's seat.
5. A display touch type input system for a vehicle, said system being mounted in the vehicle and detecting information by detecting the touching of a switch part of a display, comprising:
a display means for displaying images of a normal switch operable at all times and a parking time operable switch, the operation of which is prohibited during running of the vehicle, in respective switch display areas;
a touch panel provided above the display means and having a number of reaction points, the touching of the reaction points by the operator being detected;
switch operation detecting means for setting an area covering certain ones of the reaction points, the touching of the area being judged to be the touching of a corresponding switch displayed on the display means, the touching of the switch being detected from the result of detection of the touching of the touch panel; and
running condition detecting means for detecting vehicle running conditions;
the switch operation detecting means enlarging the reaction area corresponding to the normally operable switch toward the parking time operable switch in the running state of the vehicle when the parking time operable switch is displayed adjacent to the normally operable switch.
6. A display touch type input system for a vehicle, said system being mounted in the vehicle and inputting information by detecting the touching of a switch part of a display, comprising:
a display means for displaying images of a normally operable switch and a parking time operable switch, the operation of which is prohibited during running of the vehicle, in respective switch display areas;
a touch panel provided above the display means and having a number of reaction points, the touching of the reaction points by the operator being detected;
switch operation detecting means for setting an area covering certain ones of the reaction points, the touching of said area being judged to be the touching of a corresponding switch displayed on the display means, the touching of the switch being detected from the result of detection of the touching of the touch panel; and
running condition detecting means for detecting vehicle running conditions;
the switch operation detecting means setting reaction points in an area including a switch display area and enlarged in the operator's eyesight direction to be a reaction area of reaction points, the touching of which is judged to be the touching of the displayed switch, and enlarging a reaction area corresponding to a normally operable switch normally operable during running of the vehicle toward a parking time operable switch when the parking time operable switch is displayed adjacent to the normally operable switch.
7. The system according to claim 6, wherein:
the switch operation detecting means includes a computer for performing arithmetic operations and detects the touching of a switch from an instruction of display of a switch on the display means and the result of detection of the touching of the touch panel.
8. A display touch type input system for inputting information by detecting touching of a switch part of a display, comprising:
a display means for displaying a switch image in a switch display area;
a touch panel provided above the display means and having a number of reaction points, the touching of the reaction points by the operator being detected; and
switch operation detecting means for setting a reaction area covering certain ones of the reaction points, the touching of the reaction area being judged to be the touching of a corresponding switch displayed on the display means;
the switch operation detecting means being configured to enlarge the reaction area only in a direction towards the operator,
wherein the switch operation detection means is configured to enlarge the reaction area before the touch panel is touched by the operator.
9. A display touch type input system, comprising:
a display means for displaying a switch image in a switch display area;
a touch panel provided above the display means and having meaningful and meaningless reaction points, the touching of the meaningful and meaningless reaction points by the operator being detected; and
switch operation detecting means for setting a reaction area covering certain ones of the reaction points, the touching of the reaction area being judged to be the touching of a corresponding switch displayed on the display means,
wherein the switch operation detecting means forms an enlarged reaction area by enlarging the reaction area in response to the touching of meaningless reaction points.
10. The system of claim 9, wherein the enlarged reaction area extends beyond the switch display area on only two sides.
11. A display touch type input system, comprising:
a display means for displaying a switch image in a switch display area;
a touch panel provided above the display means and having a plurality of reaction points, the touching of the reaction points by the operator being detected; and
switch operation detecting means for setting a reaction area covering certain ones of the reaction points, the touching of the reaction area being judged to be the touching of a corresponding switch displayed on the display means,
wherein the switch operation detecting means forms an enlarged reaction area by enlarging the reaction area based on the position of the operator relative to the display means.
12. The display touch type input system of claim 11, wherein the display means is mounted in a vehicle.
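Claims 1, 8, and 11 above describe enlarging a switch's touch reaction area beyond its display area in a direction toward the operator, before the panel is touched. A minimal sketch of such a hit test; the coordinate convention (operator assumed below the display, so enlargement extends the bottom edge) and the enlargement amount are assumptions for illustration:

```python
def hit_test(switches, touch, enlarge_toward_operator=30):
    """Judge which displayed switch a touched reaction point belongs to.
    Each switch is (name, x, y, width, height) in panel coordinates.
    The reaction area is the switch display area enlarged only toward
    the operator (here: downward), so slightly low, oblique touches
    still register as the intended switch."""
    tx, ty = touch
    for name, x, y, w, h in switches:
        # Enlarge only in the operator's direction: extend the bottom edge.
        if x <= tx <= x + w and y <= ty <= y + h + enlarge_toward_operator:
            return name
    return None
```

With a switch displayed at (100, 100) sized 80x40, a touch 30 pixels below the image still selects it, while touches outside the enlarged area select nothing.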
US08/909,765 1994-09-22 1997-08-12 Touch display type information input system Expired - Fee Related US5877751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/909,765 US5877751A (en) 1994-09-22 1997-08-12 Touch display type information input system

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP22846694A JP3330239B2 (en) 1994-09-22 1994-09-22 Screen touch input device
JP6-228466 1994-09-22
JP6-244373 1994-10-07
JP24437394A JP3469329B2 (en) 1994-10-07 1994-10-07 Map calling device for navigation system
US48317695A 1995-06-07 1995-06-07
US08/909,765 US5877751A (en) 1994-09-22 1997-08-12 Touch display type information input system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US48317695A Continuation 1994-09-22 1995-06-07

Publications (1)

Publication Number Publication Date
US5877751A true US5877751A (en) 1999-03-02

Family

ID=26528268

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/909,765 Expired - Fee Related US5877751A (en) 1994-09-22 1997-08-12 Touch display type information input system

Country Status (3)

Country Link
US (1) US5877751A (en)
EP (1) EP0703525B1 (en)
DE (1) DE69524340T2 (en)

US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9922642B2 (en) 2013-03-15 2018-03-20 Apple Inc. Training an at least partial voice command system
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9953088B2 (en) 2012-05-14 2018-04-24 Apple Inc. Crowd sourcing information to fulfill user requests
US9959870B2 (en) 2008-12-11 2018-05-01 Apple Inc. Speech recognition involving a mobile device
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9966068B2 (en) 2013-06-08 2018-05-08 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US9971774B2 (en) 2012-09-19 2018-05-15 Apple Inc. Voice-based media searching
US10049663B2 (en) 2016-06-08 2018-08-14 Apple Inc. Intelligent automated assistant for media exploration
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10079014B2 (en) 2012-06-08 2018-09-18 Apple Inc. Name recognition system
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10089072B2 (en) 2016-06-11 2018-10-02 Apple Inc. Intelligent device arbitration and control
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US20180348758A1 (en) * 2017-06-02 2018-12-06 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US10185542B2 (en) 2013-06-09 2019-01-22 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10199051B2 (en) 2013-02-07 2019-02-05 Apple Inc. Voice trigger for a digital assistant
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US20190107412A1 (en) * 2016-05-19 2019-04-11 Aisin Aw Co., Ltd. Map display system and map display program
US10269345B2 (en) 2016-06-11 2019-04-23 Apple Inc. Intelligent task discovery
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10283110B2 (en) 2009-07-02 2019-05-07 Apple Inc. Methods and apparatuses for automatic speech recognition
US10289433B2 (en) 2014-05-30 2019-05-14 Apple Inc. Domain specific language for encoding assistant dialog
US10297253B2 (en) 2016-06-11 2019-05-21 Apple Inc. Application integration with a digital assistant
US10318871B2 (en) 2005-09-08 2019-06-11 Apple Inc. Method and apparatus for building an intelligent automated assistant
US10354011B2 (en) 2016-06-09 2019-07-16 Apple Inc. Intelligent automated assistant in a home environment
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10521466B2 (en) 2016-06-11 2019-12-31 Apple Inc. Data driven natural language event detection and classification
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US10568032B2 (en) 2007-04-03 2020-02-18 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US10592095B2 (en) 2014-05-23 2020-03-17 Apple Inc. Instantaneous speaking of content on touch devices
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US10607140B2 (en) 2010-01-25 2020-03-31 Newvaluexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10733993B2 (en) 2016-06-10 2020-08-04 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10762293B2 (en) 2010-12-22 2020-09-01 Apple Inc. Using parts-of-speech tagging and named entity recognition for spelling correction
US10791176B2 (en) 2017-05-12 2020-09-29 Apple Inc. Synchronization and task delegation of a digital assistant
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US10791216B2 (en) 2013-08-06 2020-09-29 Apple Inc. Auto-activating smart responses based on activities from remote devices
US10810274B2 (en) 2017-05-15 2020-10-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
USD904427S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
USD904428S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
USD930676S1 (en) * 2018-09-07 2021-09-14 Samsung Display Co., Ltd. Display device with generated image for display
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100245267B1 (en) * 1996-06-03 2000-02-15 모리 하루오 Car navigation system
KR100260760B1 (en) * 1996-07-31 2000-07-01 모리 하루오 Information display system with touch panel
EP1078223A1 (en) * 1998-05-14 2001-02-28 Intel Corporation Position sensitive display controller
ES2194623T3 (en) * 2001-09-11 2006-02-16 TRW AUTOMOTIVE ELECTRONICS & COMPONENTS GMBH & CO. KG REGULATION SYSTEM FOR A VEHICLE AIR CONDITIONING DEVICE.
US9164654B2 (en) 2002-12-10 2015-10-20 Neonode Inc. User interface for mobile computer unit
US9052777B2 (en) 2001-11-02 2015-06-09 Neonode Inc. Optical elements with alternating reflective lens facets
US8095879B2 (en) 2002-12-10 2012-01-10 Neonode Inc. User interface for mobile handheld computer unit
US8674966B2 (en) 2001-11-02 2014-03-18 Neonode Inc. ASIC controller for light-based touch screen
SE0103835L (en) 2001-11-02 2003-05-03 Neonode Ab Touch screen realized by display unit with light transmitting and light receiving units
US9778794B2 (en) 2001-11-02 2017-10-03 Neonode Inc. Light-based touch screen
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
DE102004044355A1 (en) * 2004-09-09 2006-03-30 E.G.O. Elektro-Gerätebau GmbH Method for optically marking a touch switch and such a touch switch
DE102005004202B4 (en) * 2005-01-29 2013-07-25 Volkswagen Ag Display device for a data processing device in a vehicle
DE102007051015A1 (en) * 2007-10-25 2009-04-30 Bayerische Motoren Werke Aktiengesellschaft Dialog system for a motor vehicle
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
US9840618B2 (en) 2014-10-02 2017-12-12 Lg Chem, Ltd. Thermoplastic resin composition having superior chemical resistance and transparency, method of preparing the same and molded article including the same
WO2020112585A1 (en) 2018-11-28 2020-06-04 Neonode Inc. Motorist user interface sensor
CN116420125A (en) 2020-09-30 2023-07-11 内奥诺德公司 Optical touch sensor

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4523188A (en) * 1982-10-25 1985-06-11 The United States Of America As Represented By The Secretary Of The Army Automated map and display alignment
EP0171365A2 (en) * 1984-07-06 1986-02-12 BE.BO.CAR-TRONIC S.r.l. An automatic electronic elapsed time indicator
US4672558A (en) * 1984-09-25 1987-06-09 Aquila Technologies Group, Inc. Touch-sensitive data input device
DE4033832A1 (en) * 1989-10-24 1991-06-27 Mitsubishi Electric Corp Touch-contact operating field for automobile navigation aid - has temporary display of switch pattern on screen used to display navigation map
EP0476972A2 (en) * 1990-09-17 1992-03-25 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US5146049A (en) * 1990-01-22 1992-09-08 Fujitsu Limited Method and system for inputting coordinates using digitizer
US5266931A (en) * 1991-05-09 1993-11-30 Sony Corporation Apparatus and method for inputting data
US5502461A (en) * 1993-05-11 1996-03-26 Sanyo Electric Co., Ltd. Hand written character input system/allowing change of size of character writing frames
US5565894A (en) * 1993-04-01 1996-10-15 International Business Machines Corporation Dynamic touchscreen button adjustment mechanism
US5638060A (en) * 1992-10-15 1997-06-10 Yazaki Corporation System switch device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62121641U (en) 1986-01-22 1987-08-01
JPH0594132A (en) 1991-10-01 1993-04-16 Toshiba Corp Navigation device

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4523188A (en) * 1982-10-25 1985-06-11 The United States Of America As Represented By The Secretary Of The Army Automated map and display alignment
EP0171365A2 (en) * 1984-07-06 1986-02-12 BE.BO.CAR-TRONIC S.r.l. An automatic electronic elapsed time indicator
US4672558A (en) * 1984-09-25 1987-06-09 Aquila Technologies Group, Inc. Touch-sensitive data input device
DE4033832A1 (en) * 1989-10-24 1991-06-27 Mitsubishi Electric Corp Touch-contact operating field for automobile navigation aid - has temporary display of switch pattern on screen used to display navigation map
US5539429A (en) * 1989-10-24 1996-07-23 Mitsubishi Denki Kabushiki Kaisha Touch device panel
US5146049A (en) * 1990-01-22 1992-09-08 Fujitsu Limited Method and system for inputting coordinates using digitizer
EP0476972A2 (en) * 1990-09-17 1992-03-25 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US5266931A (en) * 1991-05-09 1993-11-30 Sony Corporation Apparatus and method for inputting data
US5638060A (en) * 1992-10-15 1997-06-10 Yazaki Corporation System switch device
US5565894A (en) * 1993-04-01 1996-10-15 International Business Machines Corporation Dynamic touchscreen button adjustment mechanism
US5502461A (en) * 1993-05-11 1996-03-26 Sanyo Electric Co., Ltd. Hand written character input system/allowing change of size of character writing frames

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
European Search Report. *
IBM Technical Disclosure Bulletin, vol. 33, No. 10a, Mar. 1991, New York, USA, pp. 223-227, XP 000110024, "Algorithm for Decreasing the Error Rate of Data Entered on a Touch-Sensitive Terminal", entire document.

Cited By (322)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6088652A (en) * 1996-03-29 2000-07-11 Sanyo Electric Co., Ltd. Navigation device
US20040012506A1 (en) * 1996-09-13 2004-01-22 Toshio Fujiwara Information display system for displaying specified location with map therearound on display equipment
US20030201914A1 (en) * 1996-09-13 2003-10-30 Toshio Fujiwara Information display system for displaying specified location with map therearound on display equipment
US6570559B1 (en) * 1997-05-15 2003-05-27 Sony Corporation Information display apparatus, and display state detection method, display state adjustment method and maintenance management method therefor
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US20080036743A1 (en) * 1998-01-26 2008-02-14 Apple Computer, Inc. Gesturing with a multipoint sensing device
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US6563492B1 (en) * 1999-03-03 2003-05-13 Yazaki Corporation Multi-function switch unit and function indicating method of the same
EP1710539A1 (en) * 1999-05-21 2006-10-11 Clarion Co., Ltd. Navigation system and method and recording medium for recording navigation software
EP1054236A2 (en) * 1999-05-21 2000-11-22 CLARION Co., Ltd. Navigation system and method and recording medium for recording navigation software
EP1054236A3 (en) * 1999-05-21 2004-09-29 CLARION Co., Ltd. Navigation system and method and recording medium for recording navigation software
US6414674B1 (en) * 1999-12-17 2002-07-02 International Business Machines Corporation Data processing system and method including an I/O touch pad having dynamically alterable location indicators
US8032843B2 (en) 1999-12-20 2011-10-04 Apple Inc. User interface for providing consolidation and access
US8640044B2 (en) 1999-12-20 2014-01-28 Apple Inc. User interface for providing consolidation and access
US8640045B2 (en) 1999-12-20 2014-01-28 Apple Inc. User interface for providing consolidation and access
US9684436B2 (en) 1999-12-20 2017-06-20 Apple Inc. User interface for providing consolidation and access
US20090183120A1 (en) * 1999-12-20 2009-07-16 Apple Inc. User interface for providing consolidation and access
US20040174396A1 (en) * 2000-01-05 2004-09-09 Apple Computer, Inc. Method and system for providing an embedded application tool bar
US8799813B2 (en) 2000-01-05 2014-08-05 Apple Inc. Method and system for providing an embedded application tool bar
US9501178B1 (en) * 2000-02-10 2016-11-22 Intel Corporation Generating audible tooltips
US9646614B2 (en) 2000-03-16 2017-05-09 Apple Inc. Fast, language-independent method for user authentication by voice
US6795059B2 (en) * 2000-08-17 2004-09-21 Alpine Electronics, Inc. Operating device for controlling electronic devices utilizing a touch panel
EP1228917A1 (en) 2001-02-01 2002-08-07 Pi Technology A control arrangement
US20050131641A1 (en) * 2001-12-11 2005-06-16 Garmin Ltd., A Cayman Islands Corporation System and method for estimating impedance time through a road network
US7283905B1 (en) 2001-12-11 2007-10-16 Garmin Ltd. System and method for estimating impedance time through a road network
US7206692B2 (en) 2001-12-11 2007-04-17 Garmin Ltd. System and method for estimating impedance time through a road network
US7409288B1 (en) 2001-12-20 2008-08-05 Garmin Ltd. Portable navigation system and device with audible turn instructions
US7269508B1 (en) 2001-12-21 2007-09-11 Garmin Ltd. Guidance with feature accounting for insignificant roads
US7277794B1 (en) 2001-12-21 2007-10-02 Garmin Ltd. Guidance with feature accounting for insignificant roads
US20050114021A1 (en) * 2001-12-21 2005-05-26 Garmin Ltd., A Cayman Islands Corporation PDA with integrated address book and electronic map waypoints
US7184886B1 (en) 2001-12-21 2007-02-27 Garmin Ltd. Navigation system, method and device with detour algorithm
US20070067101A1 (en) * 2001-12-21 2007-03-22 Garmin Ltd. Navigation system, method and device with detour algorithm
US7043362B2 (en) 2001-12-21 2006-05-09 Garmin Ltd. PDA with integrated address book and electronic map waypoints
US7308359B1 (en) 2001-12-21 2007-12-11 Garmin Ltd. Navigation system, method and device with automatic next turn page
US7120539B2 (en) 2001-12-21 2006-10-10 Garmin Ltd. Navigation system, method and device with detour algorithm
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US6826473B1 (en) 2002-02-08 2004-11-30 Garmin Ltd. PDA with integrated navigation functions and expense reporting
US20030222858A1 (en) * 2002-05-28 2003-12-04 Pioneer Corporation Touch panel device
US7154483B2 (en) * 2002-05-28 2006-12-26 Pioneer Corporation Touch panel device
US20050253818A1 (en) * 2002-06-25 2005-11-17 Esa Nettamo Method of interpreting control command, and portable electronic device
US20040131212A1 (en) * 2003-01-07 2004-07-08 Steven Chang LCD display with an infrared transmission interface for transmitting audio signals in stereo
US8019531B2 (en) 2003-02-26 2011-09-13 Tomtom International B.V. Navigation device and method for displaying alternative routes
EP1811269A3 (en) * 2003-02-26 2008-03-05 TomTom International B.V. Navigation device and method for displaying alternative routes
CN1754082B (en) * 2003-02-26 2011-07-06 通腾科技股份有限公司 Navigation device and method for displaying alternative routes
EP2264405A2 (en) * 2003-02-26 2010-12-22 TomTom International B.V. Navigation device and method for displaying alternative routes
US20070005233A1 (en) * 2003-02-26 2007-01-04 Ayal Pinkus Navigation device and method for displaying alternative routes
EP2264405A3 (en) * 2003-02-26 2011-02-16 TomTom International B.V. Navigation device and method for displaying alternative routes
US9367239B2 (en) 2003-02-26 2016-06-14 Tomtom International B.V. Navigation device and method for displaying alternative routes
WO2004076976A1 (en) * 2003-02-26 2004-09-10 Tomtom B.V. Navigation device and method for displaying alternative routes
US20110144904A1 (en) * 2003-02-26 2011-06-16 Tomtom International B.V. Navigation device and method for displaying alternative routes
US20060212185A1 (en) * 2003-02-27 2006-09-21 Philp Joseph W Method and apparatus for automatic selection of train activity locations
US20040178994A1 (en) * 2003-03-10 2004-09-16 International Business Machines Corporation Dynamic resizing of clickable areas of touch screen applications
US7103852B2 (en) 2003-03-10 2006-09-05 International Business Machines Corporation Dynamic resizing of clickable areas of touch screen applications
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US10338789B2 (en) 2004-05-06 2019-07-02 Apple Inc. Operation of a computer with touch screen interface
US20070247442A1 (en) * 2004-07-30 2007-10-25 Andre Bartley K Activating virtual keys of a touch-screen virtual keyboard
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20080211784A1 (en) * 2004-07-30 2008-09-04 Apple Inc. Gestures for touch sensitive input devices
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US20060033724A1 (en) * 2004-07-30 2006-02-16 Apple Computer, Inc. Virtual input device placement on a touch screen user interface
US7900156B2 (en) 2004-07-30 2011-03-01 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US8612856B2 (en) 2004-07-30 2013-12-17 Apple Inc. Proximity detector in handheld device
US20060085757A1 (en) * 2004-07-30 2006-04-20 Apple Computer, Inc. Activating virtual keys of a touch-screen virtual keyboard
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US8427445B2 (en) 2004-07-30 2013-04-23 Apple Inc. Visual expander
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US20060053387A1 (en) * 2004-07-30 2006-03-09 Apple Computer, Inc. Operation of a computer with touch screen interface
US8744852B1 (en) 2004-10-01 2014-06-03 Apple Inc. Spoken interfaces
US20060122769A1 (en) * 2004-12-02 2006-06-08 Denso Corporation Navigation system
US10318871B2 (en) 2005-09-08 2019-06-11 Apple Inc. Method and apparatus for building an intelligent automated assistant
WO2007032843A3 (en) * 2005-09-16 2007-05-24 Apple Computer Activating virtual keys of a touch-screen virtual keyboard
WO2007032843A2 (en) * 2005-09-16 2007-03-22 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US7925320B2 (en) 2006-03-06 2011-04-12 Garmin Switzerland Gmbh Electronic device mount
US20070207842A1 (en) * 2006-03-06 2007-09-06 Garmin Ltd. A Cayman Islands Corporation Electronic device mount
US20080055259A1 (en) * 2006-08-31 2008-03-06 Honeywell International, Inc. Method for dynamically adapting button size on touch screens to compensate for hand tremor
US8013839B2 (en) 2006-09-06 2011-09-06 Apple Inc. Methods for determining a cursor position from a finger contact with a touch screen display
US7843427B2 (en) 2006-09-06 2010-11-30 Apple Inc. Methods for determining a cursor position from a finger contact with a touch screen display
US20080094356A1 (en) * 2006-09-06 2008-04-24 Bas Ording Methods for Determining a Cursor Position from a Finger Contact with a Touch Screen Display
US20110074677A1 (en) * 2006-09-06 2011-03-31 Bas Ording Methods for Determining a Cursor Position from a Finger Contact with a Touch Screen Display
US8942986B2 (en) 2006-09-08 2015-01-27 Apple Inc. Determining user intent based on ontologies of domains
US9117447B2 (en) 2006-09-08 2015-08-25 Apple Inc. Using event alert text as input to an automated assistant
US8930191B2 (en) 2006-09-08 2015-01-06 Apple Inc. Paraphrasing of user requests and results by automated digital assistant
US20080165142A1 (en) * 2006-10-26 2008-07-10 Kenneth Kocienda Portable Multifunction Device, Method, and Graphical User Interface for Adjusting an Insertion Point Marker
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US20110080364A1 (en) * 2006-10-26 2011-04-07 Bas Ording Method, System, and Graphical User Interface for Positioning an Insertion Marker in a Touch Screen Display
US9348511B2 (en) 2006-10-26 2016-05-24 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US9632695B2 (en) 2006-10-26 2017-04-25 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US9207855B2 (en) 2006-10-26 2015-12-08 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
WO2008086220A2 (en) * 2007-01-07 2008-07-17 Apple Inc. Graphical user interface for providing lists of locations for a map application in a portable multifunction device
US20080168396A1 (en) * 2007-01-07 2008-07-10 Michael Matas Portable Multifunction Device, Method, and Graphical User Interface for Providing Maps and Directions
WO2008086220A3 (en) * 2007-01-07 2009-03-05 Apple Inc Graphical user interface for providing lists of locations for a map application in a portable multifunction device
US8519963B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
US20080165160A1 (en) * 2007-01-07 2008-07-10 Kenneth Kocienda Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Gesture on a Touch Screen Display
US8607167B2 (en) 2007-01-07 2013-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for providing maps and directions
US10568032B2 (en) 2007-04-03 2020-02-18 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US20080320419A1 (en) * 2007-06-22 2008-12-25 Michael Matas Touch Screen Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information
US8302033B2 (en) 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US10686930B2 (en) 2007-06-22 2020-06-16 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location based information
US11849063B2 (en) 2007-06-22 2023-12-19 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US20090119007A1 (en) * 2007-11-06 2009-05-07 Honda Motor Co., Ltd Navigation System
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US10381016B2 (en) 2008-01-03 2019-08-13 Apple Inc. Methods and apparatus for altering audio output signals
US20090174680A1 (en) * 2008-01-06 2009-07-09 Freddy Allen Anzures Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars
US20090178007A1 (en) * 2008-01-06 2009-07-09 Michael Matas Touch Screen Device, Method, and Graphical User Interface for Displaying and Selecting Application Options
US11126326B2 (en) 2008-01-06 2021-09-21 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US9330381B2 (en) 2008-01-06 2016-05-03 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8171432B2 (en) 2008-01-06 2012-05-01 Apple Inc. Touch screen device, method, and graphical user interface for displaying and selecting application options
US10521084B2 (en) 2008-01-06 2019-12-31 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US10503366B2 (en) 2008-01-06 2019-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US9792001B2 (en) 2008-01-06 2017-10-17 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US20090228842A1 (en) * 2008-03-04 2009-09-10 Apple Inc. Selecting of text using gestures
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US9529524B2 (en) 2008-03-04 2016-12-27 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US8650507B2 (en) 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
US9865248B2 (en) 2008-04-05 2018-01-09 Apple Inc. Intelligent text-to-speech conversion
US9626955B2 (en) 2008-04-05 2017-04-18 Apple Inc. Intelligent text-to-speech conversion
US8788112B2 (en) * 2008-06-20 2014-07-22 Bayerische Motoren Werke Aktiengesellschaft Process for controlling functions in a motor vehicle having neighboring operating elements
US20110082603A1 (en) * 2008-06-20 2011-04-07 Bayerische Motoren Werke Aktiengesellschaft Process for Controlling Functions in a Motor Vehicle Having Neighboring Operating Elements
US9535906B2 (en) 2008-07-31 2017-01-03 Apple Inc. Mobile device having human language translation capability with positional feedback
US10108612B2 (en) 2008-07-31 2018-10-23 Apple Inc. Mobile device having human language translation capability with positional feedback
US20100201455A1 (en) * 2008-09-23 2010-08-12 Aerovironment, Inc. Predictive pulse width modulation for an open delta h-bridge driven high efficiency ironless permanent magnet machine
US9959870B2 (en) 2008-12-11 2018-05-01 Apple Inc. Speech recognition involving a mobile device
US20100235785A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20100235778A1 (en) * 2009-03-16 2010-09-16 Kocienda Kenneth L Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US9846533B2 (en) 2009-03-16 2017-12-19 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8255830B2 (en) 2009-03-16 2012-08-28 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8756534B2 (en) 2009-03-16 2014-06-17 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US20100235726A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US8370736B2 (en) 2009-03-16 2013-02-05 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US10761716B2 (en) 2009-03-16 2020-09-01 Apple, Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9875013B2 (en) 2009-03-16 2018-01-23 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8661362B2 (en) 2009-03-16 2014-02-25 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US20100235729A1 (en) * 2009-03-16 2010-09-16 Kocienda Kenneth L Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20100235735A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20100235783A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20100235734A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US8510665B2 (en) 2009-03-16 2013-08-13 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8584050B2 (en) 2009-03-16 2013-11-12 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8253705B2 (en) 2009-05-08 2012-08-28 Research In Motion Limited Target zones for menu items on a touch-sensitive display
US20100283746A1 (en) * 2009-05-08 2010-11-11 Vuong Thanh V Target zones for menu items on a touch-sensitive display
EP2249239B1 (en) * 2009-05-08 2015-07-15 BlackBerry Limited Target zones for menu items on a touch-sensitive display
US11080012B2 (en) 2009-06-05 2021-08-03 Apple Inc. Interface for a virtual digital assistant
US10475446B2 (en) 2009-06-05 2019-11-12 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US10795541B2 (en) 2009-06-05 2020-10-06 Apple Inc. Intelligent organization of tasks items
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US8464182B2 (en) 2009-06-07 2013-06-11 Apple Inc. Device, method, and graphical user interface for providing maps, directions, and location-based information
US20100309149A1 (en) * 2009-06-07 2010-12-09 Chris Blumenberg Device, Method, and Graphical User Interface for Providing Maps, Directions, and Location-Based Information
US10283110B2 (en) 2009-07-02 2019-05-07 Apple Inc. Methods and apparatuses for automatic speech recognition
US8699131B2 (en) * 2009-12-04 2014-04-15 Olympus Corporation Microscope controller and microscope system comprising microscope controller
US20110134517A1 (en) * 2009-12-04 2011-06-09 Olympus Corporation Microscope controller and microscope system comprising microscope controller
US8862576B2 (en) 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US10169431B2 (en) 2010-01-06 2019-01-01 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US8456297B2 (en) 2010-01-06 2013-06-04 Apple Inc. Device, method, and graphical user interface for tracking movement on a map
US20110163874A1 (en) * 2010-01-06 2011-07-07 Van Os Marcel Device, Method, and Graphical User Interface for Tracking Movement on a Map
US20110167058A1 (en) * 2010-01-06 2011-07-07 Van Os Marcel Device, Method, and Graphical User Interface for Mapping Directions Between Search Results
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US8892446B2 (en) 2010-01-18 2014-11-18 Apple Inc. Service orchestration for intelligent automated assistant
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10706841B2 (en) 2010-01-18 2020-07-07 Apple Inc. Task flow identification based on user intent
US11423886B2 (en) 2010-01-18 2022-08-23 Apple Inc. Task flow identification based on user intent
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US8903716B2 (en) 2010-01-18 2014-12-02 Apple Inc. Personalized vocabulary for digital assistant
US9548050B2 (en) 2010-01-18 2017-01-17 Apple Inc. Intelligent automated assistant
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10607141B2 (en) 2010-01-25 2020-03-31 Newvaluexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US10607140B2 (en) 2010-01-25 2020-03-31 Newvaluexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US10984326B2 (en) 2010-01-25 2021-04-20 Newvaluexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US11410053B2 (en) 2010-01-25 2022-08-09 Newvaluexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US10984327B2 (en) 2010-01-25 2021-04-20 New Valuexchange Ltd. Apparatuses, methods and systems for a digital conversation management platform
US10049675B2 (en) 2010-02-25 2018-08-14 Apple Inc. User profiling for voice input processing
US9633660B2 (en) 2010-02-25 2017-04-25 Apple Inc. User profiling for voice input processing
CN102792258A (en) * 2010-03-18 2012-11-21 大众汽车有限公司 Method for controlling an electronic system for a vehicle and corresponding controller
CN102792258B (en) * 2010-03-18 2017-04-12 大众汽车有限公司 Method for controlling an electronic system for a vehicle and corresponding controller
WO2011113513A1 (en) * 2010-03-18 2011-09-22 Volkswagen Aktiengesellschaft Method for controlling an electronic system for a vehicle and corresponding controller
USD790583S1 (en) 2010-04-09 2017-06-27 Citigroup Technology, Inc. Display screen of an electronic device with graphical user interface
US8456445B2 (en) 2010-04-30 2013-06-04 Honeywell International Inc. Touch screen and method for adjusting screen objects
EP2407865A1 (en) * 2010-07-16 2012-01-18 Gigaset Communications GmbH Adaptive calibration of sensor monitors for optimising interface quality
US10762293B2 (en) 2010-12-22 2020-09-01 Apple Inc. Using parts-of-speech tagging and named entity recognition for spelling correction
US10102359B2 (en) 2011-03-21 2018-10-16 Apple Inc. Device access using voice authentication
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US10664144B2 (en) 2011-05-31 2020-05-26 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8677232B2 (en) 2011-05-31 2014-03-18 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9244605B2 (en) 2011-05-31 2016-01-26 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9092130B2 (en) 2011-05-31 2015-07-28 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US11256401B2 (en) 2011-05-31 2022-02-22 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8719695B2 (en) 2011-05-31 2014-05-06 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US11120372B2 (en) 2011-06-03 2021-09-14 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US9798393B2 (en) 2011-08-29 2017-10-24 Apple Inc. Text correction processing
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US9953088B2 (en) 2012-05-14 2018-04-24 Apple Inc. Crowd sourcing information to fulfill user requests
US10079014B2 (en) 2012-06-08 2018-09-18 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9576574B2 (en) 2012-09-10 2017-02-21 Apple Inc. Context-sensitive handling of interruptions by intelligent digital assistant
US9971774B2 (en) 2012-09-19 2018-05-15 Apple Inc. Voice-based media searching
US9739632B2 (en) 2012-10-17 2017-08-22 Tomtom Navigation B.V. Methods and systems of providing information using a navigation apparatus
KR20210082554A (en) * 2012-10-17 2021-07-05 톰톰 네비게이션 비.브이. Methods and systems of providing information using a navigation apparatus
US10612935B2 (en) * 2012-10-17 2020-04-07 Tomtom Navigation B.V. Methods and systems of providing information using a navigation apparatus
CN104823023A (en) * 2012-10-17 2015-08-05 通腾科技股份有限公司 Methods and systems of providing information using navigation apparatus
US20150241239A1 (en) * 2012-10-17 2015-08-27 Tomtom International B.V. Methods and systems of providing information using a navigation apparatus
CN110836676A (en) * 2012-10-17 2020-02-25 通腾导航技术股份有限公司 Method and system for providing information using navigation device
US10199051B2 (en) 2013-02-07 2019-02-05 Apple Inc. Voice trigger for a digital assistant
US10978090B2 (en) 2013-02-07 2021-04-13 Apple Inc. Voice trigger for a digital assistant
US9368114B2 (en) 2013-03-14 2016-06-14 Apple Inc. Context-sensitive handling of interruptions
US9697822B1 (en) 2013-03-15 2017-07-04 Apple Inc. System and method for updating an adaptive speech recognition model
US9922642B2 (en) 2013-03-15 2018-03-20 Apple Inc. Training an at least partial voice command system
US9633674B2 (en) 2013-06-07 2017-04-25 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9966060B2 (en) 2013-06-07 2018-05-08 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9620104B2 (en) 2013-06-07 2017-04-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
US9966068B2 (en) 2013-06-08 2018-05-08 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10657961B2 (en) 2013-06-08 2020-05-19 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US10185542B2 (en) 2013-06-09 2019-01-22 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US9300784B2 (en) 2013-06-13 2016-03-29 Apple Inc. System and method for emergency calls initiated by voice command
US10791216B2 (en) 2013-08-06 2020-09-29 Apple Inc. Auto-activating smart responses based on activities from remote devices
CN103809830B (en) * 2014-03-03 2016-08-31 欧浦登(福建)光学有限公司 Implementation method of access-control touch sensing based on a single-layer double-sided conductive-film capacitive panel
US9620105B2 (en) 2014-05-15 2017-04-11 Apple Inc. Analyzing audio input for efficient speech and music recognition
US10592095B2 (en) 2014-05-23 2020-03-17 Apple Inc. Instantaneous speaking of content on touch devices
US9502031B2 (en) 2014-05-27 2016-11-22 Apple Inc. Method for supporting dynamic grammars in WFST-based ASR
US10083690B2 (en) 2014-05-30 2018-09-25 Apple Inc. Better resolution when referencing to concepts
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US11257504B2 (en) 2014-05-30 2022-02-22 Apple Inc. Intelligent assistant for home automation
US11133008B2 (en) 2014-05-30 2021-09-28 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US10289433B2 (en) 2014-05-30 2019-05-14 Apple Inc. Domain specific language for encoding assistant dialog
US10169329B2 (en) 2014-05-30 2019-01-01 Apple Inc. Exemplar-based natural language processing
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9734193B2 (en) 2014-05-30 2017-08-15 Apple Inc. Determining domain salience ranking from ambiguous words in natural speech
US10497365B2 (en) 2014-05-30 2019-12-03 Apple Inc. Multi-command single utterance input method
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10904611B2 (en) 2014-06-30 2021-01-26 Apple Inc. Intelligent automated assistant for TV user interactions
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US9668024B2 (en) 2014-06-30 2017-05-30 Apple Inc. Intelligent automated assistant for TV user interactions
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10431204B2 (en) 2014-09-11 2019-10-01 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US9986419B2 (en) 2014-09-30 2018-05-29 Apple Inc. Social reminders
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US11556230B2 (en) 2014-12-02 2023-01-17 Apple Inc. Data detection
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US9711141B2 (en) 2014-12-09 2017-07-18 Apple Inc. Disambiguating heteronyms in speech synthesis
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US11087759B2 (en) 2015-03-08 2021-08-10 Apple Inc. Virtual assistant activation
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US10311871B2 (en) 2015-03-08 2019-06-04 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US11500672B2 (en) 2015-09-08 2022-11-15 Apple Inc. Distributed personal assistant
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US11526368B2 (en) 2015-11-06 2022-12-13 Apple Inc. Intelligent automated assistant in a messaging environment
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US10663317B2 (en) * 2016-05-19 2020-05-26 Aisin Aw Co., Ltd. Map display system and map display program
US20190107412A1 (en) * 2016-05-19 2019-04-11 Aisin Aw Co., Ltd. Map display system and map display program
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US11069347B2 (en) 2016-06-08 2021-07-20 Apple Inc. Intelligent automated assistant for media exploration
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US10354011B2 (en) 2016-06-09 2019-07-16 Apple Inc. Intelligent automated assistant in a home environment
US11037565B2 (en) 2016-06-10 2021-06-15 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10733993B2 (en) 2016-06-10 2020-08-04 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10521466B2 (en) 2016-06-11 2019-12-31 Apple Inc. Data driven natural language event detection and classification
US11152002B2 (en) 2016-06-11 2021-10-19 Apple Inc. Application integration with a digital assistant
US10269345B2 (en) 2016-06-11 2019-04-23 Apple Inc. Intelligent task discovery
US10297253B2 (en) 2016-06-11 2019-05-21 Apple Inc. Application integration with a digital assistant
US10089072B2 (en) 2016-06-11 2018-10-02 Apple Inc. Intelligent device arbitration and control
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11405466B2 (en) 2017-05-12 2022-08-02 Apple Inc. Synchronization and task delegation of a digital assistant
US10791176B2 (en) 2017-05-12 2020-09-29 Apple Inc. Synchronization and task delegation of a digital assistant
US10810274B2 (en) 2017-05-15 2020-10-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
US11194326B2 (en) * 2017-06-02 2021-12-07 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium storing vehicle control program
US20180348758A1 (en) * 2017-06-02 2018-12-06 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium
USD930676S1 (en) * 2018-09-07 2021-09-14 Samsung Display Co., Ltd. Display device with generated image for display
USD985604S1 (en) 2018-09-07 2023-05-09 Samsung Display Co., Ltd. Display device with generated image for display
USD904427S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
USD904428S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface

Also Published As

Publication number Publication date
EP0703525B1 (en) 2001-12-05
DE69524340T2 (en) 2002-08-14
DE69524340D1 (en) 2002-01-17
EP0703525A1 (en) 1996-03-27

Similar Documents

Publication Publication Date Title
US5877751A (en) Touch display type information input system
US5274387A (en) Navigation apparatus for vehicles
US5729217A (en) Vehicle route guidance apparatus for re-searching for a route when vehicle goes out of route
CN101101219B (en) Vehicle-mounted displaying device and displaying method employed for the same
KR940001235B1 (en) Navigation system for movable body
US6006161A (en) Land vehicle navigation system with multi-screen mode selectivity
JP3814992B2 (en) Vehicle navigation device
US20020053984A1 (en) Lane guidance display method, and navigation device and recording medium for realizing the method
US7818122B2 (en) Navigation device, method and program
JPH10153449A (en) Navigation device for vehicle and storing medium
JP2593999B2 (en) Navigation device
JP3469329B2 (en) Map calling device for navigation system
JP2978088B2 (en) Target object display device and target object display method
JP2000043652A (en) Display control system for on-vehicle device
JPH0895708A (en) Screen touch type input device
JP3008839B2 (en) Navigation device
EP0953826A2 (en) Apparatus for displaying characters and symbols on map for use in navigation system
JPH07105493A (en) Screen display device for automobile
JP3109955B2 (en) Car navigation system
JP3967188B2 (en) Image output control device, method thereof, program thereof, recording medium recording the program, and navigation device
JPH07190789A (en) Navigation device
KR100777899B1 (en) Method of displaying search result for a car navigation system
JPH05313577A (en) Navigation device
JP2840011B2 (en) Navigation device
JPH0861964A (en) Navigation system with road map switching function

Legal Events

Date Code Title Description
FPAY Fee payment (Year of fee payment: 4)
FPAY Fee payment (Year of fee payment: 8)
REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation (Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362)
FP Lapsed due to failure to pay maintenance fee (Effective date: 20110302)