US20100315369A1 - Method and User Interface for Entering Characters - Google Patents

Method and User Interface for Entering Characters

Info

Publication number
US20100315369A1
Authority
US
United States
Prior art keywords
character
touch
movement
characters
entering function
Prior art date
Legal status
Abandoned
Application number
US12/862,223
Inventor
Timo Tokkonen
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US12/862,223
Publication of US20100315369A1
Status: Abandoned

Classifications

    • All classes fall under G — Physics; G06 — Computing; Calculating or Counting; G06F — Electric Digital Data Processing; G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer, and output arrangements for transferring data from processing unit to output unit; G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer. The specific classifications are:
    • G06F 3/023 — Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0236 — Character input methods using selection techniques to select from displayed items
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • In FIG. 2A the pen is moved only a short distance on the display 200, along the route 220 of the moving pen.
  • When the desired character is shown, the movement of the pen can be stopped.
  • The last detected direction of movement before the detection of the termination of the character entering function is shown in FIG. 2A with a dashed arrow 222.
  • When the user notices the desired character “s” on the display 200, he terminates the character entering function in order to enter the character “s”.
  • As the termination is detected, the character “s” is shown in the area 218 for entered characters.
  • In order to enter the next character, the user starts the character entering function again, by touching the display 200 with a pen, for example.
  • Control commands for editing are based, for example, on successive detections of sudden direction changes in the movement indicated by the input device, such as the pen.
  • The control command for removing the entered character comprises, for example, moving the pen first to the right and then back to the left.
  • Other functions can also be based on detections of sudden direction changes in the movement indicated by the input device.
  • The entering of special characters or spaces can be accomplished by moving the input device in different directions in a predetermined fashion. The user may predetermine given successive movements of the input device to be associated with certain functions.
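  • As a rough, non-authoritative illustration, the following Python sketch shows one way such a direction-reversal command could be recognized from the route of the pen. The function names, the 150-degree turn threshold and the minimum segment length are assumptions made for this example only; the patent does not prescribe a concrete algorithm.

```python
import math

# Hypothetical sketch: recognize an edit command (e.g. "remove the last
# entered character") as a sudden right-then-left reversal of pen movement.

def segment_headings(points, min_len=10.0):
    """Yield the heading (radians) of successive movement segments."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if math.hypot(x1 - x0, y1 - y0) >= min_len:   # ignore tiny jitter
            yield math.atan2(y1 - y0, x1 - x0)

def detect_reversal(points, threshold_deg=150.0):
    """Return True if two successive segments turn back on themselves."""
    headings = list(segment_headings(points))
    for h0, h1 in zip(headings, headings[1:]):
        turn = abs(math.degrees(h1 - h0)) % 360.0
        turn = min(turn, 360.0 - turn)        # smallest angle between headings
        if turn >= threshold_deg:
            return True
    return False

# Pen moves right, then sharply back to the left -> interpreted as "delete".
trace = [(0, 0), (40, 2), (80, 3), (40, 4), (0, 5)]
if detect_reversal(trace):
    print("edit command: remove last entered character")
```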
  • In FIG. 2B the areas 202-212 for the characters are linear in such a way that the character areas virtually form two lines 213, 215 on the display 200, on different sides of the starting point 216.
  • Alternatively, all the character areas 202-212 can virtually form a single line on the display 200. It is also possible that only a few character areas for certain characters, such as those most commonly used, are located on the opposite side of the starting point 216 from all the other character areas.
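  • Purely as an illustration of such a layout, the sketch below places the character areas on two horizontal lines on opposite sides of a starting point, with a handful of assumed "most common" letters on a line of their own. The spacing, the offsets and the chosen letter set are hypothetical.

```python
# Hypothetical FIG. 2B style layout: two rows of character areas on opposite
# sides of the starting point; the most common letters get their own row.

def layout_two_lines(start, chars, common="etaoins", gap=25.0, offset=60.0):
    """Return a dict mapping each character to its area centre (x, y)."""
    areas = {}
    upper = [c for c in chars if c in common]       # frequent letters
    lower = [c for c in chars if c not in common]   # everything else
    for row, y in ((upper, start[1] - offset), (lower, start[1] + offset)):
        x0 = start[0] - gap * (len(row) - 1) / 2.0  # centre the row on start
        for i, c in enumerate(row):
            areas[c] = (x0 + gap * i, y)
    return areas

areas = layout_two_lines((160, 120), "abcdefghijklmnopqrstuvwxyz")
print(areas["e"], areas["z"])    # "e" lies above the start point, "z" below
```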
  • In FIG. 2B there is a character showing area 224 for showing the character of the character area 202-212 towards which the direction of movement of the input device was last detected to proceed. Additionally, the characters of the character areas 202-212 towards which the direction of movement of the input device is detected to proceed can be indicated with lights, outlining or with sounds of different pitch, for example.
  • In the situation of FIG. 2B the user has already entered some characters, shown in the area 218 for entered characters.
  • Next the user wishes to enter the character “s”.
  • At first the user touches the display with the pen at the starting point 216.
  • The starting point 216 is, for example, the point on the display that the touch of the pen hits first.
  • The arrow lines 219, 220, 221 and 225 illustrate the different directions in which the user moves the pen at given times.
  • Then the user starts moving the pen towards the character areas 202-217.
  • At first the direction of the movement of the pen, indicated by the arrow line 219, is towards the character area 208 for the letter “m”.
  • The letter “m” of the character area 208 is shown on the display 200 while the user is moving the pen towards it.
  • The letter “m” is shown in the character showing area 224, for example.
  • The user notices that the movement of the pen is going in the wrong direction, and next moves the pen slightly in another direction, indicated by the arrow line 220.
  • Now the direction of the pen proceeds towards the area 210 for the letter “n”.
  • Again, the letter towards which the movement of the pen is detected to proceed is shown on the display 200 in the character showing area 224, for example.
  • The user then adjusts the direction of the movement of the pen a bit more, until the direction, indicated by the arrow line 221, is towards the character area 212 for the letter “s”.
  • The last direction of movement indicated by the pen is detected on the basis of a direction vector generated with the help of two points from the route of the movement on the display 200, the two points being at a given distance from each other, for example.
  • In FIG. 2B the two points defining the direction vector indicated by the arrow line 221 are located at the beginning and at the end of the arrow line 221. If the last direction of movement were also detected based on the starting point 216, the situation in FIGS. 2A and 2B would differ in that the direction vector indicated by the arrow line 221 would then be considered to start virtually from the starting point 216.
  • The arrow line 225 shows the new location of the direction vector, corresponding to the arrow line 221, when the direction of movement is detected based on the starting point 216 as well.
  • In that case the direction of movement indicated by the arrow line 225 would be interpreted to proceed towards the character area 217 for the letter “u”, for example.
  • The last direction in which the movement of the pen is detected to proceed before the character entering function is terminated is indicated with the dashed arrow 226.
  • The user only has to move the pen until the desired letter “s” is shown on the display 200.
  • The letter “s” is shown on the display 200 when the pen is at the end of the arrow 221 that is proceeding towards the character area 212 for the letter “s”, for example.
  • Different limits can be preset to determine how long and/or how far the pen, for example, has to be moved in a certain direction before the character of the character area 202-217 towards which the movement of the pen is detected to proceed is shown on the display 200.
  • It is also possible that the character of a second character area 202-217, next to the first character area 202-217 towards which the direction of movement was first detected to proceed, is shown on the display even before the direction of movement of the input device is detected to proceed exactly towards that second character area 202-217.
  • When satisfied with the character shown on the display 200, the user terminates the character entering function by stopping the movement of the pen, for example. As the termination of the character entering function is detected, the character “s” is shown in the area 218 for entered characters. In order to enter the next character the user starts the character entering function again, by touching the display 200 with a pen, for example, or by continuing the movement of the pen after the previous character has been selected. Thus, it is possible to enter the desired characters without lifting the pen from the display 200 between character selections.
  • FIG. 3 shows a block diagram of an embodiment of the character entering method.
  • First, a separate character area is predetermined for each character on the display, which character areas are in relation with each other.
  • Initially the device is in an idle state and monitors the state of the user interface. In the idle state the character entering function can be started by touching the display of the user interface, for example. Giving a start signal with another input device can start the character entering function as well.
  • Such an input device may be for instance a separate keypad, provided the device comprises a keypad, and the start signal is for instance the depression of a given key or keys of the keypad.
  • The input device may also be the display itself or a start signal area specified in the display area, the touching of which starts the character entering function.
  • When the control unit detects the start of the character entering function, based for instance on a start signal given with an input device, the starting point on the display is detected and block 304 is entered, where the control unit starts detecting the direction of movement indicated by the input device.
  • Next, block 306 is entered, where the character of the character area towards which the direction of movement is directed is shown on the display. Alternatively, the character is shown only after the movement indicated by the input device has proceeded in the same direction for a given period of time.
  • In an embodiment, the most probable character area on the display towards which the direction of movement indicated by the input device is proceeding is detected in block 304, and in block 306 the character of that most probable character area is shown on the display.
  • In block 308 possible changes in the direction of movement are observed. If a change in the direction of movement is detected in block 308, block 304 is re-entered, where the direction of movement is detected again. If no change in the direction of movement is detected in block 308, block 310 is entered, where the termination of the character entering function is monitored. If no termination is detected in block 310, the process remains in block 306, where the character of the character area towards which the direction of movement is directed is shown on the display. When the termination of the character entering function is detected in block 310, block 312 is entered, where the character towards whose character area the direction of movement was last detected to proceed is interpreted as the character to be entered next. Finally, in block 314 the interpreted character is shown on the display, in the area for entered characters, for example.
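  • The FIG. 3 loop can be summarized in code. The following Python sketch is a schematic reading of blocks 304-314 under assumed helpers: the sample format, the sector-based angle-to-character mapping and the stop timeout are illustrative choices, not details taken from the patent.

```python
import math

CHARS = "abcdefghijklmnopqrstuvwxyz"

def char_for_direction(start, point):
    """Map the direction start->point to one of 26 sectors, one per letter."""
    angle = math.atan2(point[1] - start[1], point[0] - start[0]) % (2 * math.pi)
    return CHARS[int(angle / (2 * math.pi) * len(CHARS)) % len(CHARS)]

def enter_one_character(samples, stop_time=0.3):
    """samples: iterable of (timestamp, (x, y)) readings from the input device."""
    samples = iter(samples)
    t0, start = next(samples)          # start detected; starting point recorded
    shown, last_t, last_p = None, t0, start
    for t, p in samples:
        if p != last_p:                # blocks 304/308: (re)detect direction
            shown = char_for_direction(start, p)   # block 306: show candidate
            last_t, last_p = t, p
        elif t - last_t >= stop_time:  # block 310: movement stopped
            break
    return shown                       # block 312: last candidate is entered

trace = [(0.0, (100, 100)), (0.1, (110, 100)), (0.2, (120, 101)),
         (0.3, (120, 101)), (0.7, (120, 101))]
print(enter_one_character(trace))      # pen moved right, then stopped -> 'a'
```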
  • FIG. 4 shows a block diagram of another embodiment of the character entering method.
  • Again, a separate character area is predetermined for each character on the display, which character areas are in relation with each other in order to achieve a given character area pattern.
  • Initially the device is in an idle state and monitors the state of the user interface.
  • When the control unit detects the start of the character entering function, based for instance on a start signal given with an input device, the starting point on the display is detected and block 406 is entered, where the control unit determines the locations of the character areas on the display on the basis of the location of the starting point.
  • The character areas may form a circle around the starting point, for example.
  • In block 408 the control unit starts detecting the direction of movement indicated by the input device.
  • Next, block 410 is entered, where the character of the character area towards which the direction of movement is directed is shown on the display. Then, if a change in the direction of movement is detected in block 412, block 408 is re-entered. If no change is detected in block 412, block 414 is entered, where the termination of the character entering function is monitored. If no termination is detected in block 414, the process remains in block 410.
  • When the termination is detected, block 416 is entered, where the character towards whose character area the direction of movement was last detected to proceed is interpreted as the character to be entered next.
  • Finally, the interpreted character is shown on the display, in the area for entered characters, for example.
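  • For illustration, the sketch below strings the FIG. 4 steps together end to end: on detecting the starting point, the character areas are laid out in a circle around it (block 406), and the entered character is the one whose area best matches the last direction of movement. The radius, the alphabet and the synthetic pen route are assumptions for this example.

```python
import math

def layout_circle(start, chars, radius=80.0):
    """Block 406: one character area centre per character, on a circle."""
    n = len(chars)
    return {c: (start[0] + radius * math.cos(2 * math.pi * i / n),
                start[1] + radius * math.sin(2 * math.pi * i / n))
            for i, c in enumerate(chars)}

def nearest_area(direction, start, areas):
    """Pick the area whose bearing from the start best matches the direction."""
    heading = math.atan2(direction[1], direction[0])
    def gap(c):
        bearing = math.atan2(areas[c][1] - start[1], areas[c][0] - start[0])
        d = abs(bearing - heading) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    return min(areas, key=gap)

start = (160, 120)                               # detected starting point
areas = layout_circle(start, "abcdefghijklmnopqrstuvwxyz")
route = [(160, 120), (170, 120), (181, 120)]     # synthetic pen route
last = (route[-1][0] - route[-2][0], route[-1][1] - route[-2][1])
print(nearest_area(last, start, areas))          # moving right -> 'a' here
```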

Abstract

A direction of movement indicated by an input device is detected when a start of a character entering function has been detected. The character of the character area on a display towards which the direction of movement indicated by the input device is proceeding is shown. The termination of the character entering function is detected, and the character towards whose character area the direction of movement was last detected to proceed is interpreted as the character to be entered next.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation application of application Ser. No. 11/132,479, filed May 19, 2005, which is a continuation of International Application No. PCT/FI03/000889 filed on Nov. 19, 2003, which claims priority to European Patent Application No. 02102606.7 filed on Nov. 20, 2002, the contents of which are incorporated herein by reference in their entirety.
  • FIELD
  • The invention relates to a method for entering characters in a user interface of an electronic device, and a user interface of an electronic device.
  • BACKGROUND
  • Electronic devices, such as mobile telephones, are continuously reduced in size. A special problem in the usage of an electronic device is the entering of text. Eventually the keypad used in electronic devices, such as a character keypad, becomes impracticable, because it is difficult for the user of the device to press extremely small keys. Since separate keypads in the devices increase the size of the devices, small electronic devices with only a touch screen as the user interface have become common. In electronic devices, touch screens are often used to replace the mouse and the keypad, for example. The user gives control commands to the device by touching contact areas visible on the touch screen.
  • Several portable devices are provided with a feature that identifies handwriting, by means of which the device converts handwritten text, for example, into composed text. There are different automatic identification methods of handwritten symbols in which the characters of the entered text are written directly on the touch screen. The user writes characters in an area for writing characters on a touch screen by means of a pen or a finger, for example. The device then identifies the written character based on the detected contact points in said area. Also some text entry systems have been proposed, wherein the characters visible on the touch screen of an electronic device are selected by sliding a pen or a finger over the character to be entered.
  • The current text entry systems, such as handwriting or speech recognition systems, are often slow and error prone. Also in the handwriting systems the area reserved on the touch screen for writing characters is small, which makes it hard to write in said area in a moving vehicle, for example. If the selection of the characters is conducted by sliding a pen over the desired character visible on the touch screen, it is difficult to hit the correct character when, for example, the writer's hand shakes in a rush hour bus.
  • BRIEF DESCRIPTION OF THE INVENTION
  • It is an object of the invention to provide a method and a user interface so as to alleviate prior art problems. This is achieved by a method for entering characters in a user interface of an electronic device, the user interface comprising: a display and an input device, the method comprising: predetermining a given character area on the display for each character, which character areas are in relation with each other in order to achieve a given character area pattern; detecting a start of a character entering function; detecting a starting point on the display on the basis of the detected start of the character entering function. The method of the invention comprises: detecting the direction of movement indicated by the input device when the start of the character entering function has been detected; showing the character of the character area on the display towards which character area the direction of movement indicated by the input device is proceeding; detecting the termination of the character entering function and interpreting the character towards whose character area the direction of movement was last detected to proceed as the character to be entered next, when the termination of the character entering function is detected.
  • The invention also relates to a user interface for entering characters in an electronic device, the user interface comprising: a display for showing the entered characters; an input device for giving control commands for entering the characters; a control unit for controlling the functions of the user interface, the control unit being connected to the display and configured to: show characters on the display; receive control commands from the input device; predetermine a given character area on the display for each character, which character areas are in relation with each other in order to achieve a given character area pattern; detect a start of a character entering function; detect a starting point on the display on the basis of the detected start of the character entering function. The control unit is further configured to: detect the direction of movement indicated by the input device, when the start of the character entering function has been detected; show the character of the character area on the display, towards which character area the direction of movement indicated by the input device is proceeding; detect the termination of the character entering function and interpret the character towards whose character area the direction of movement was last detected to proceed as the character to be entered next, when the termination of the character entering function is detected.
  • The invention further relates to a computer program product encoding a computer program of instructions for executing a computer process for entering characters in a user interface of an electronic device, the user interface comprising: a display and an input device, the process comprising: predetermining a given character area on the display for each character, which character areas are in relation with each other in order to achieve a given character area pattern; detecting a start of a character entering function; detecting a starting point on the display on the basis of the detected start of the character entering function, the process further comprising: detecting the direction of movement indicated by the input device when the start of the character entering function has been detected; showing the character of the character area on the display towards which character area the direction of movement indicated by the input device is proceeding; detecting the termination of the character entering function; interpreting the character towards whose character area the direction of movement was last detected to proceed as the character to be entered next, when the termination of the character entering function is detected.
  • The invention also relates to a user interface for entering characters in an electronic device, the user interface comprising: display means for showing the entered characters; input means for giving control commands for entering the characters; processing means for controlling the functions of the user interface, the processing means being connected to the display means and configured to: show characters on the display; receive control commands from the input means; predetermine a given character area on the display for each character, which character areas are in relation with each other in order to achieve a given character area pattern; detect a start of a character entering function; detect a starting point on the display on the basis of the detected start of the character entering function, wherein processing means further comprise: detection means for detecting the direction of movement indicated by the input device when the start of the character entering function has been detected; means for showing the character of the character area on the display towards which character area the direction of movement indicated by the input device is proceeding; means for detecting the termination of the character entering function; interpreting means for interpreting the character towards whose character area the direction of movement was last detected to proceed as the character to be entered next, when the termination of the character entering function is detected.
  • Preferred embodiments of the invention are described in the dependent claims.
  • The method and the user interface of the invention provide several advantages. In a preferred embodiment of the invention the method of entering characters is especially fast, easy and accurate. Great accuracy is not required of the users of the user interface according to the invention in order to select the right characters to be entered.
  • LIST OF THE DRAWINGS
  • In the following, the invention will be described in greater detail with reference to the preferred embodiments and the accompanying drawings, in which
  • FIG. 1 shows a device of the invention;
  • FIGS. 2A and 2B show details of a display of the device of the invention;
  • FIG. 3 is a block diagram of an embodiment of the invention; and
  • FIG. 4 is a block diagram of another embodiment of the invention.
  • DESCRIPTION OF THE EMBODIMENTS
  • The embodiments of the invention are applicable to portable electronic devices, such as a mobile station used as a terminal in telecommunication systems comprising one or more base stations and terminals communicating with the base stations. The device may be used for short-range communication implemented with a Bluetooth chip, an infrared or WLAN connection, for example. The portable electronic device is for example a mobile telephone or another device including telecommunication means, such as a portable computer, a handheld computer or a smart telephone. The portable electronic device may be a PDA (Personal Digital Assistant) device including the necessary telecommunication means for establishing a network connection, or a PDA device that can be coupled to a mobile telephone, for instance, for a network connection. The portable electronic device may also be a computer or PDA device not including telecommunication means.
  • FIG. 1 shows a block diagram of the structure of a portable electronic device. A control unit 100, typically implemented by means of a microprocessor and software or separate components, controls the basic functions of the device. The user interface of the device comprises a display 104, such as a touch screen implemented in manners known per se. In addition, the user interface of the device may include a loudspeaker and a keypad part. Depending on the type of device, there may be different user interface parts, and a different number of them. The device of FIG. 1, such as a mobile station, also includes conventional means 108 that implement the functions of a mobile station and include speech and channel coders, modulators and RF parts. The device also comprises an antenna 110.
  • The functions of the device are controlled by means of an input device, such as a mouse 106, a hand-held locator operated by moving it on a surface. When using a mouse 106, for example, a sign or symbol shows the location of a mouse cursor on the display 104 and often also the function running in the device, or its state. It is also possible that the display 104 is by itself the input device achieved by means of a touch screen such that the desired functions are selected by touching the desired objects visible on the display 104. The touch on the display 104 is carried out by means of a pen or a finger, for example.
  • The input device 104, 106 is for giving control commands for entering the characters. The control unit 100 controls the functions of the user interface and is connected to the display 104 and configured to show characters on the display 104. The control unit 100 receives control commands from the input device 104, 106. The entered character may be one or more letters, digits, images or a combination thereof including two or more characters. It is possible that there are different functions for entering certain kinds of characters. Thus, the characters of the alphabet and the numbers, for example, have separate specific character entering functions.
  • The characters are entered in a character entering function controlled by the control unit 100 of the electronic device. The character entering function operates such that the desired characters visible on the display 104, for example, are first selected by means of the input device. Then, the control unit interprets the selected character as the character to be entered next and displays the character on the display 104. In an embodiment of the invention the control unit 100 detects a start of a character entering function. The start of the character entering function is detected for instance on the basis of a touch on the display 104. Alternatively, the start of the character entering function is detected by means of a start signal given with an input device 104, 106. A touch on the display 104 results in the software in the memory of the control unit 100 detecting the start of the character entering function, and after that, the control unit 100 detects a starting point on the display 104, based on the detected start of the character entering function. The starting point on the display 104 may be a touch point on the display or a point where a mouse cursor was located at the moment when the start of the character entering function was detected, for example.
  • According to an embodiment of the invention, the control unit is configured to predetermine a given character area on the display 104 for each entered character, such as a character of the alphabet. The character areas are in relation with each other in order to form a given character area pattern. The character areas may virtually form a certain pattern, such as a circle or a triangle, on the display 104. Other possible character area patterns are linear lines, for example. The locations of the character areas on the display 104 are based on the location of the starting point, for example. If the character areas are in relation to the starting point on the display 104, the locations of the character areas change according to the location of the starting point. During the character entering function, the characters are not visible on the display 104. It is possible, however, that the character areas are visible on the display 104. Alternatively, the character areas and/or the characters are visible on the display 104.
  • When the start of the character entering function has been detected, the control unit 100 detects a direction of movement indicated by the input device 104, 106. The direction of movement is detected on the basis of a direction vector between the starting point and another point on the display 104, to which other point the touch of the pen or the cursor of the mouse on the display 104 moves, for example. Next, the character of the character area on the display 104, towards which character area the direction of movement indicated by the input device 104, 106 is proceeding, is shown on the display 104. Thus, only the character of the character area, towards which character area the direction of movement is proceeding, is visible on the display 104. Alternatively, if all the characters are visible on the display, the character of the character area, towards which the direction of movement is proceeding, is shown with the help of outlining or lights, for example. The character area towards which the direction of movement is proceeding is determined in the control unit 100 by appropriate numerical methods, such as interpolation or extrapolation, known per se.
  • The control unit 100 continues to monitor the direction of movement indicated by the input device 104, 106. As the movement indicated by the input device 104, 106 proceeds, the direction of movement is recalculated after given periods of time. The latest direction of movement is based on a direction vector calculated by means of two points selected from the route of the movement on the display 104, the two points being at a given distance apart from each other, for example. If a change in the direction of movement is detected, then another character of the character area, towards which character area the new direction of movement is proceeding, is shown on the display 104. The direction of movement can be determined also such that after the calculation of the direction vector, the starting point on the display 104 is interpreted to be also the starting point for the direction vector. Thus, the detection of the last direction of movement is based on the location of the starting point on the display 104 as well, for example.
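  • Both direction-detection strategies described above can be sketched compactly; the snippet below is an assumed illustration only. `local_direction` derives the heading from two route points a given distance apart, `anchored_direction` takes the starting point as the origin of the vector, and the 15-degree change threshold is a hypothetical choice.

```python
import math

def local_direction(route, span=20.0):
    """Heading from the last route point at least `span` behind the newest one."""
    xe, ye = route[-1]
    for x, y in reversed(route[:-1]):
        if math.hypot(xe - x, ye - y) >= span:
            return math.atan2(ye - y, xe - x)
    return None                                  # not enough movement yet

def anchored_direction(route, start):
    """Variant: the direction vector is taken to start from the starting point."""
    return math.atan2(route[-1][1] - start[1], route[-1][0] - start[0])

def direction_changed(prev, new, threshold=math.radians(15)):
    """True if the heading moved by more than the threshold."""
    d = abs(new - prev) % (2 * math.pi)
    return min(d, 2 * math.pi - d) > threshold

start = (100, 100)
route = [(100, 100), (115, 100), (130, 104), (142, 116)]
print(math.degrees(local_direction(route)))            # recent heading only
print(math.degrees(anchored_direction(route, start)))  # heading from the start
print(direction_changed(local_direction(route[:3]), local_direction(route)))
```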
  • The control unit 100 continues to monitor the direction of movement and to show the characters until termination of the character entering function is detected. The termination of the character entering function is detected when the movement indicated by the input device stops, for example. Alternatively the termination of the character entering function is detected on the basis of a signal given with the input device. In an embodiment of the invention, the speed of the movement indicated by the input device is detected after the start of the character entering function has been detected, and the termination of the character entering function is detected when the speed of the movement indicated by the input device is of a predetermined value.
  • The control unit 100 interprets the character towards whose character area the direction of movement was last detected to proceed as the character to be entered next when the termination of the character entering function is detected. The entered character is shown in an area for entered characters on the display 104, for example.
  • If the termination of the character entering function is detected when the movement indicated by the input device stops, a given period of time can be predetermined to lapse, during which time the movement of the input device is to be on halt, before the character towards whose character area the direction of movement was last detected to proceed is shown on the display 104. Thus, if the user of the electronic device wishes to interrupt the entering of the characters, lifting the input device off the display before the given period of time has lapsed, results in exiting the character entering function without any character selection. The detection of lifting the input device off the display can be predetermined to result in other effects as well.
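  • The termination rules above can be combined into one routine. The sketch below is an assumed illustration: movement speed is estimated from successive samples, termination is declared once the speed stays at or below a predetermined value for a dwell period, and lifting the input device (a `None` sample here) before that period has lapsed cancels the selection without entering a character. The thresholds and the sample format are hypothetical.

```python
import math

def entry_outcome(samples, speed_limit=5.0, dwell=0.25):
    """samples: (t, (x, y)) tuples; a (t, None) sample means the pen was lifted."""
    slow_since = None
    prev_t, prev_p = samples[0]
    for t, p in samples[1:]:
        if p is None:                        # lifted off the display
            if slow_since is not None and t - slow_since >= dwell:
                return "selected"
            return "cancelled"               # lifted before the period lapsed
        speed = math.hypot(p[0] - prev_p[0], p[1] - prev_p[1]) / (t - prev_t)
        if speed <= speed_limit:
            slow_since = slow_since if slow_since is not None else t
            if t - slow_since >= dwell:
                return "selected"            # termination detected
        else:
            slow_since = None                # still moving: keep monitoring
        prev_t, prev_p = t, p
    return "pending"

trace = [(0.0, (0, 0)), (0.1, (9, 0)), (0.2, (9.2, 0)),
         (0.35, (9.3, 0)), (0.5, (9.3, 0))]
print(entry_outcome(trace))                  # pen stops moving -> "selected"
```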
  • Let us next study embodiments of the invention by means of FIGS. 2A and 2B. FIGS. 2A and 2B show a display 200 of an electronic device, such as a PDA device. The characters are selected by means of an input device, such as a pen or a mouse. A character, in turn, is one or more letters, digits or images, or a combination thereof including two or more symbols. In FIGS. 2A and 2B the characters are letters of the alphabet. When wishing to start entering characters, the user of the device first starts the character entering function by using a pen or a mouse, for example. FIGS. 2A and 2B show a starting point 216 on the display 200, which starting point 216 is detected on the basis of the detected start of the character entering function. The user may, for example, touch the display 200 with a pen in order to start the character entering function. Then the starting point 216 is, for example, the point on the display 200 where the pen first touched. In FIGS. 2A and 2B the predetermined character areas 202, 204, 206, 208, 210, 212 on the display 200 are also shown. On the display 200 there is also an area 218 for the entered characters.
  • In FIG. 2A the character areas 202-212 virtually form a circle 214 around the starting point 216. A separate character area 202-212 is predetermined for each character, the areas being positioned in relation to one another and to the starting point 216, for example. Thereby, wherever on the display 200 the starting point 216 is detected to be located, the character areas 202-212 are always at the same locations in relation to the starting point 216. For example, in FIG. 2A the character areas 202-212 are in a circular form. The character areas 202-212 are predetermined in the settings of the electronic device by the manufacturer, for example. Alternatively, the user of the device chooses the desired character areas 202-212 by using different setting options of the device.
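A minimal sketch of such a layout, assuming the circular arrangement of FIG. 2A; the radius is an assumed value and layout_circle is an illustrative name.

```python
import math

RADIUS = 120  # px from the starting point to each character area (assumed)

def layout_circle(start, characters):
    """One character area centre per character, on a circle around `start`."""
    n = len(characters)
    areas = {}
    for i, ch in enumerate(characters):
        angle = 2 * math.pi * i / n              # equal spacing around circle
        areas[ch] = (start[0] + RADIUS * math.cos(angle),
                     start[1] + RADIUS * math.sin(angle))
    return areas
```

Because the centres are computed from `start`, the same relative pattern is obtained wherever the starting point is detected, matching the behaviour described above.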
  • In order to choose a desired character to be entered, the user of the device next starts to move the pen, for example, on the display 200 towards the given character area 202-212 of the desired character. The characters to be entered are, for example, invisible on the display 200. As the user moves the pen towards a given character area 202-212, the character of the character area 202-212 towards which the movement of the pen is detected to proceed is shown on the display 200 by means of lights or outlining, for example. When the desired character is shown on the display, the user then selects the character by terminating the character entering function. The termination of the character entering function is detected when the user stops moving the pen or lifts the pen off the display 200, for example. Alternatively, the termination of the character entering function is detected when the user presses a key of the mouse, for example.
  • In the situation of FIG. 2A the user has already entered some characters shown in the area for entered characters 218. Next the user wishes to enter the character “s”. At first the user touches the display with the pen at the starting point 216. The starting point 216 is, for example, the point on the display that the touch of the pen hits first. Then the user starts moving the pen towards the character areas 202-212. In FIG. 2A the route 220 of the pen moving on the display 200 is also shown. At first, the pen has moved towards the character area 210. The character of the character area 210 is shown on the display 200 with lights, for example. When the user notices, on the basis of the characters shown on the display, that the movement of the pen is going in the wrong direction, he then adjusts the direction of the movement of the pen. Moving the pen towards the character area 212, which character area 212 is predetermined for the character “s”, shows the character “s” on the display 200. The user only has to move the pen until the desired character is shown on the display 200.
  • In FIG. 2A the pen is moved only a short distance on the display 200, along the route 220 of the moving pen. As soon as the character area 212 towards which the direction of movement indicated by the pen is proceeding is detected, the movement of the pen can be stopped. The last detected direction of movement, before the detection of the termination of the character entering function, is shown in FIG. 2A with a dashed arrow 222. As the user notices the desired character “s” on the display 200, he terminates the character entering function in order to enter the character “s”. As the termination of the character entering function is detected, the character “s” is shown in the area 218 for entered characters. In order to enter the next character the user starts the character entering function by touching the display 200 with a pen, for example.
  • If he has accidentally entered a wrong character or wishes to remove an already entered character for some reason, the user may give control commands for editing the entered characters. The control commands for editing are, for example, based on successive detections of sudden direction changes of movements indicated by the input device, such as the pen. The control command for removing the entered character comprises, for example, moving the pen first to the right and then moving the pen back to the left. Other functions can also be based on detections of sudden direction changes of movements indicated by the input device. Thus, for example, the entering of special characters or spaces can be accomplished by moving the input device in different directions in a predetermined fashion. The user may predetermine given successive movements of the input device to be associated with certain functions.
  • For a situation where the character entering function is in progress and the user wishes to stop entering characters entirely, without selecting any characters, it is possible to predetermine a specific ending signal; the character entering function is interrupted once the ending signal has been detected by the control unit of the user interface. The detection of the input device moving randomly back and forth on the display, for example, can be interpreted as such an ending signal.
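The two preceding paragraphs suggest a simple classifier over recent stroke directions. The sketch below is one assumed realization: the reversal count, the command table and the horizontal-only classification are illustrative choices, not taken from the patent.

```python
REVERSAL_LIMIT = 4   # alternations treated as "random back and forth" (assumed)
COMMANDS = {("right", "left"): "remove_last_character"}   # assumed mapping

def classify(dx):
    """Horizontal direction of one stroke segment."""
    return "right" if dx > 0 else "left"

def detect_command(segments):
    """Map a sequence of recent stroke directions to a command, if any."""
    if len(segments) >= REVERSAL_LIMIT and all(
            a != b for a, b in zip(segments, segments[1:])):
        return "end_character_entering"  # ending signal: interrupt, no selection
    return COMMANDS.get(tuple(segments[-2:]))
```

A caller would feed it the per-segment directions, e.g. detect_command([classify(dx) for dx in stroke_deltas]).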
  • In FIG. 2B the areas 202-212 for the characters are arranged linearly in such a way that the character areas 202-212 virtually form two lines 213, 215 on the display 200, on different sides of the starting point 216. Alternatively, all the character areas 202-212 can virtually form a single line 213, 215 on the display 200. It is also possible that only a few character areas 202-212 for certain characters, such as those most commonly used, are located on the opposite side of the starting point 216 from all the other character areas 202-212.
  • In an embodiment of the invention illustrated in FIG. 2B there is a character showing area 224 for showing the character of the character area 202-212 towards which the direction of movement of the input device is last detected to proceed. Additionally, the characters of the character areas 202-212 towards which the direction of movement of the input device is detected to proceed can be indicated with lights, outlining or sounds of different pitch, for example.
  • Also in the situation of FIG. 2B the user has already entered some characters shown in the area for entered characters 218. Next the user wishes to enter the character “s”. At first the user touches the display with the pen at the starting point 216. The starting point 216 is, for example, the point on the display that the touch of the pen hits first. In FIG. 2B the arrow lines 219, 220, 221, 225 illustrate the different directions in which the user moves the pen at given times. The user starts moving the pen towards the character areas 202-217. First, the direction of the movement of the pen, indicated by the arrow line 219, is towards the character area 208 for the letter “m”. The letter “m” of the character area 208 is shown on the display 200 when the user is moving the pen towards it. The letter “m” is shown in the character showing area 224, for example. The user notices that the movement of the pen is going in the wrong direction and next moves the pen slightly in another direction, indicated by the arrow line 220. Now the direction of the pen proceeds towards the area 210 for the letter “n”. Once again, the letter towards which the movement of the pen is detected to proceed is shown on the display 200, in the character showing area 224, for example. The user then adjusts the direction of the movement of the pen a bit more until the direction, indicated by the arrow line 221, is towards the character area 212 for the letter “s”. When the user moves the pen towards the character area 212 for the letter “s”, the letter “s” is shown on the display 200. The last direction in which the movement of the pen is detected to proceed, before the character entering function is terminated, is indicated with the dashed arrow 222.
  • In FIGS. 2A and 2B the last direction of movement indicated by the pen is detected on the basis of a direction vector generated with the help of two points from the route of the movement on the display 200, the two points being at a given distance from each other, for example. In FIG. 2B, for example, the two points defining the direction vector indicated by the arrow line 221 are located at the beginning and at the end of the arrow line 221. If the last direction of movement were detected based on the starting point 216 as well, the situation in FIGS. 2A and 2B would differ in such a way that the direction vector, indicated by the arrow line 221, would then be considered to virtually start from the starting point 216. The arrow line 225 shows the new location of the direction vector corresponding to the arrow line 221 when the direction of movement is detected based on the starting point 216 as well. Here, the direction of movement indicated by the arrow line 225 would be interpreted to proceed towards the character area 217 for the letter “u”, for example. Thus, the last direction in which the movement of the pen would be detected to proceed before the character entering function is terminated is indicated with the dashed arrow 226.
  • Once again, the user only has to move the pen until the desired letter “s” is shown on the display 200. In the situation of FIG. 2B, the letter “s” is shown on the display 200 when the pen is at the end of the arrow 221 that is proceeding towards the character area 212 for the letter “s”, for example. Different limits can be preset to predetermine for how long and/or how far the pen, for example, has to be moved in a certain direction before the character of the character area 202-217 towards which the movement of the pen is detected to proceed is shown on the display 200. In an embodiment of the invention it is also possible that, when the direction of movement of the input device is detected to change, a second character area 202-217 next to the first character area 202-217, towards which the direction of movement was first detected to proceed, is shown on the display even before the direction of movement of the input device is actually detected to proceed exactly towards the second character area 202-217.
  • When satisfied with the character shown on the display 200, the user terminates the character entering function by stopping the movement of the pen, for example. As the termination of the character entering function is detected, the character “s” is shown in the area 218 for entered characters. In order to enter the next character the user starts the character entering function, by touching the display 200 with a pen, for example, or by continuing the movement of the pen after the previous character has been selected. Thus, it is possible to enter the desired characters even without lifting the pen from the display 200 between the character selections.
  • Let us next study an embodiment of the invention by means of FIG. 3. FIG. 3 shows a block diagram of the character entering method. A separate character area is predetermined for each character on the display, the character areas being positioned in relation to one another. In block 300 the device is in an idle state and monitors the state of the user interface. In the idle state, the character entering function can be started by touching the display of the user interface, for example. Giving a start signal with another input device can start the character entering function as well. Such an input device may be, for instance, a separate keypad, provided the device comprises a keypad, and the start signal is, for instance, the depression of a given key or keys of the keypad. The input device may also be the display itself or a start signal area specified in the display area, the touching of which starts the character entering function.
  • If in block 302 the control unit detects the start of the character entering function, based for instance on a start signal given with an input device, the starting point on the display is detected and block 304 is entered, where the control unit starts detecting the direction of movement indicated by the input device. When the control unit has detected the direction of movement indicated by the input device, block 306 is entered, where the character of the character area towards which the direction of movement is directed is shown on the display. Alternatively, the character is shown only after the direction of the movement indicated by the input device has remained the same for a given period of time. In an embodiment of the invention it is also possible that the most probable character area towards which the direction of movement indicated by the input device is proceeding is detected in block 304, and in block 306 the character of that most probable character area is shown on the display.
  • In block 308 possible changes in the direction of movement are observed. If in block 308 a change in the direction of movement is detected, block 304 is re-entered, where the direction of movement is detected. If in block 308 no changes in the direction of movement are detected, block 310 is entered, where the termination of the character entering function is monitored. If in block 310 no termination of the character entering function is detected, the method remains in block 306, where the character of the character area towards which the direction of movement is directed is shown on the display. When the termination of the character entering function is detected in block 310, block 312 is entered, where the character towards whose character area the direction of movement was last detected to proceed is interpreted as the character to be entered next. Finally, in block 314 the interpreted character is shown on the display, in the area for entered characters, for example.
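Read as a whole, blocks 300-314 describe a monitoring loop. The sketch below is one assumed rendering of FIG. 3 as code; the sample format and the two callables are illustrative. The `latest_direction` sketch given earlier matches `detect_direction` directly, while `area_towards` could wrap the earlier `resolve_target` sketch.

```python
def run_fig3(samples, detect_direction, area_towards):
    """Run one character entry over a stream of input samples.

    `samples` yields (point, terminated) pairs after the start of the
    character entering function has been detected (blocks 300-302);
    returns the entered character, or None if nothing was selected.
    """
    route, candidate = [], None
    for point, terminated in samples:
        route.append(point)                      # track the route of movement
        direction = detect_direction(route)      # blocks 304/308: (re)detect
        if direction is not None:
            candidate = area_towards(direction)  # block 306: character shown
        if terminated:                           # block 310: termination
            return candidate                     # blocks 312-314: entered
    return None
```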
  • Let us next study another embodiment of the invention by means of FIG. 4. FIG. 4 shows a block diagram of the character entering method. In block 400 a separate character area is predetermined for each character on the display, the character areas being positioned in relation to one another in order to achieve a given character area pattern. In block 402 the device is in an idle state and monitors the state of the user interface. If in block 404 the control unit detects the start of the character entering function, based for instance on a start signal given with an input device, the starting point on the display is detected and block 406 is entered, where the control unit determines, on the basis of the location of the starting point, the locations of the character areas on the display. Thus, wherever on the display the starting point is detected to be located, the character areas are always positioned in relation to the starting point. The character areas may be in the form of a circle around the starting point, for example.
  • In block 408 the control unit starts detecting the direction of movement indicated by the input device. When the control unit has detected the direction of movement indicated by the input device, block 410 is entered, where the character of the character area towards which the direction of movement is directed is shown on the display. Then, if in block 412 a change in the direction of movement is detected, block 408 is re-entered. If in block 412 no changes in the direction of movement are detected, block 414 is entered, where the termination of the character entering function is monitored. If in block 414 no termination of the character entering function is detected, the method remains in block 410. When the termination of the character entering function is detected in block 414, block 416 is entered, where the character towards whose character area the direction of movement was last detected to proceed is interpreted as the character to be entered next. Finally, in block 418 the interpreted character is shown on the display, in the area for entered characters, for example.
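The FIG. 4 flow differs from FIG. 3 only in block 406, where the character area locations are derived from the detected starting point. A short sketch, reusing the hypothetical layout_circle, latest_direction and run_fig3 helpers from the earlier sketches:

```python
import math

def run_fig4(start, samples, characters):
    """Block 406: fix the areas relative to `start`, then run the FIG. 3 loop."""
    areas = layout_circle(start, characters)

    def towards(direction):
        # character area whose centre lies closest to the detected direction
        angle = math.atan2(direction[1], direction[0])
        def gap(item):
            (x, y) = item[1]
            a = math.atan2(y - start[1], x - start[0])
            return abs((a - angle + math.pi) % (2 * math.pi) - math.pi)
        return min(areas.items(), key=gap)[0]

    return run_fig3(samples, latest_direction, towards)
```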
  • Even though the invention is described above with reference to an example according to the accompanying drawings, it is clear that the invention is not restricted thereto but it can be modified in several ways within the scope of the appended claims.

Claims (18)

1. A device, comprising:
a touch screen configured to display a keypad and receive touch inputs to the keypad, wherein the touch inputs comprise at least a first touch input and one or more touch movements that are input in sequence without lifting of the touch inputs from the keypad; and
a processor configured via software to cause the device at least to:
detect the start of a character entering function that comprises at least the first touch input to the keypad that enters a first character;
detect the one or more touch movements on the touch screen subsequent to the character entering function, wherein endings of each of the touch movements correspond to locations of the keypad that cause entering one or more respective other characters;
detect a termination of the character entering function; and
display at least the first and other characters on an area of the touch screen that displays entered characters.
2. The device of claim 1, wherein the termination of the character entering function comprises a lifting of the touch inputs following the one or more touch movements.
3. The device of claim 1, wherein the termination of the character entering function comprises a stopping of movement of the touch inputs following the one or more touch movements.
4. The device of claim 1, wherein the ending of at least one of the one or more touch movements comprises a change in direction of the touch inputs that corresponds to the beginning of a subsequent touch movement of the one or more touch movements that follows the at least one touch movement.
5. The device of claim 1, wherein the ending of at least one of the touch movements comprises a stopping of movement of the touch inputs.
6. The device of claim 1, wherein the processor further causes the device to detect a random back and forth input to the touch screen during the character entering function and, in response thereto, end the character entering function without entering the first and other characters.
7. The device of claim 1, wherein the processor further causes the device to detect sudden direction changes of movement input to the touch screen during the character entering function and, in response thereto, execute a control command for editing.
8. The device of claim 1, wherein the processor further causes the device to detect sudden direction changes of movement input to the touch screen during the character entering function and, in response thereto, enter a special character.
9. A method, comprising:
receiving touch inputs to a keypad of a touch screen, wherein the touch inputs comprise at least a first touch input and one or more touch movements that are input in sequence without lifting of the touch inputs from the keypad;
detecting the start of a character entering function that comprises at least the first touch input to the keypad that enters a first character;
detecting the one or more touch movements on the touch screen subsequent to the character entering function, wherein endings of each of the touch movements correspond to locations of the keypad that cause entering one or more respective other characters;
detecting a termination of the character entering function; and
displaying at least the first and other characters on an area of the touch screen that displays entered characters.
10. The method of claim 9, wherein the termination of the character entering function comprises a lifting of the touch inputs following the one or more touch movements.
11. The method of claim 9, wherein the termination of the character entering function comprises a stopping of movement of the touch inputs following the one or more touch movements.
12. The method of claim 9, wherein the ending of at least one of the one or more touch movements comprises a change in direction of the touch inputs that corresponds to the beginning of a subsequent touch movement of the one or more touch movements that follows the at least one touch movement.
13. The method of claim 9, wherein the ending of at least one of the touch movements comprises a stopping of movement of the touch inputs.
14. The method of claim 9, further comprising detecting a random back and forth input to the touch screen and, in response thereto, ending the character entering function without entering the first and other characters.
15. The method of claim 9, further comprising detecting sudden direction changes of movement input to the touch screen during the character entering function and, in response thereto, executing a control command for editing.
16. The method of claim 9, further comprising detecting sudden direction changes of movement input to the touch screen during the character entering function and, in response thereto, entering a special character.
17. A computer program product encoding a computer program of instructions for executing the method of claim 9 as a computer process.
18. A user interface device, comprising:
means for receiving touch inputs to a keypad of a touch screen, wherein the touch inputs comprise at least a first touch input and one or more touch movements that are input in sequence without lifting of the touch inputs from the keypad;
means for detecting the start of a character entering function that comprises at least the first touch input to the keypad that enters a first character;
means for detecting the one or more touch movements on the touch screen subsequent to the character entering function, wherein endings of each of the touch movements correspond to locations of the keypad that cause entering one or more respective other characters;
means for detecting a termination of the character entering function; and
means for displaying at least the first and other characters on an area of the touch screen that displays entered characters.
US12/862,223 2002-11-20 2010-08-24 Method and User Interface for Entering Characters Abandoned US20100315369A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/862,223 US20100315369A1 (en) 2002-11-20 2010-08-24 Method and User Interface for Entering Characters

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
EP02102606.7 2002-11-20
EP02102606A EP1422599B1 (en) 2002-11-20 2002-11-20 Method and user interface for entering characters
PCT/FI2003/000889 WO2004046904A1 (en) 2002-11-20 2003-11-19 Method and user interface for entering characters
US11/132,479 US7973770B2 (en) 2002-11-20 2005-05-19 Method and user interface for entering characters
US12/862,223 US20100315369A1 (en) 2002-11-20 2010-08-24 Method and User Interface for Entering Characters

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/132,479 Continuation US7973770B2 (en) 2002-11-20 2005-05-19 Method and user interface for entering characters

Publications (1)

Publication Number Publication Date
US20100315369A1 true US20100315369A1 (en) 2010-12-16

Family

ID=32187251

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/132,479 Expired - Fee Related US7973770B2 (en) 2002-11-20 2005-05-19 Method and user interface for entering characters
US12/862,223 Abandoned US20100315369A1 (en) 2002-11-20 2010-08-24 Method and User Interface for Entering Characters

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/132,479 Expired - Fee Related US7973770B2 (en) 2002-11-20 2005-05-19 Method and user interface for entering characters

Country Status (8)

Country Link
US (2) US7973770B2 (en)
EP (1) EP1422599B1 (en)
KR (1) KR100712065B1 (en)
CN (1) CN1711517B (en)
AT (1) ATE332528T1 (en)
AU (1) AU2003283447A1 (en)
DE (1) DE60212976T2 (en)
WO (1) WO2004046904A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110032200A1 (en) * 2009-08-06 2011-02-10 Samsung Electronics Co., Ltd. Method and apparatus for inputting a character in a portable terminal having a touch screen
US8316319B1 (en) * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8205157B2 (en) 2008-03-04 2012-06-19 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US20100175022A1 (en) * 2009-01-07 2010-07-08 Cisco Technology, Inc. User interface
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8589374B2 (en) 2009-03-16 2013-11-19 Apple Inc. Multifunction device with integrated search and application selection
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8413065B2 (en) * 2009-09-07 2013-04-02 Qualcomm Incorporated User interface methods for ending an application
US9292161B2 (en) * 2010-03-24 2016-03-22 Microsoft Technology Licensing, Llc Pointer tool with touch-enabled precise placement
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
US9001055B2 (en) 2010-11-26 2015-04-07 Htc Corporation Portable device and method for operating portable device
EP2469384A1 (en) * 2010-12-23 2012-06-27 Research In Motion Limited Portable electronic device and method of controlling same
US8730188B2 (en) 2010-12-23 2014-05-20 Blackberry Limited Gesture input on a portable electronic device and method of controlling the same
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
US8754861B2 (en) * 2011-07-06 2014-06-17 Google Inc. Touch-screen keyboard facilitating touch typing with minimal finger movement
US9317196B2 (en) 2011-08-10 2016-04-19 Microsoft Technology Licensing, Llc Automatic zooming for text selection/cursor placement
US8782549B2 (en) 2012-10-05 2014-07-15 Google Inc. Incremental feature-based gesture-keyboard decoding
US9021380B2 (en) 2012-10-05 2015-04-28 Google Inc. Incremental multi-touch gesture recognition
US8843845B2 (en) 2012-10-16 2014-09-23 Google Inc. Multi-gesture text input prediction
US8701032B1 (en) 2012-10-16 2014-04-15 Google Inc. Incremental multi-word recognition
US8850350B2 (en) 2012-10-16 2014-09-30 Google Inc. Partial gesture text entry
US8819574B2 (en) 2012-10-22 2014-08-26 Google Inc. Space prediction for text input
US8832589B2 (en) 2013-01-15 2014-09-09 Google Inc. Touch keyboard using language and spatial models
US9081500B2 (en) 2013-05-03 2015-07-14 Google Inc. Alternative hypothesis error correction for gesture typing
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
KR102200823B1 (en) * 2013-09-10 2021-01-11 삼성전자 주식회사 Inputting method and system for electronic device
JP5934280B2 (en) * 2014-04-24 2016-06-15 京セラ株式会社 Character input device, character input method, and character input program
JP6208808B2 (en) * 2016-05-06 2017-10-04 京セラ株式会社 Character input device, character input method, and character input program
US10365823B2 (en) * 2017-03-02 2019-07-30 International Business Machines Corporation Simplified text entry user interface for touch devices
US10671181B2 (en) * 2017-04-03 2020-06-02 Microsoft Technology Licensing, Llc Text entry interface
JP2023535212A (en) 2020-07-24 2023-08-16 Agilis Eyesfree Touchscreen Keyboards Ltd Adaptable touch screen keypad with dead zone

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5805167A (en) * 1994-09-22 1998-09-08 Van Cruyningen; Izak Popup menus with directional gestures
US6011542A (en) * 1998-02-13 2000-01-04 Sony Corporation Graphical text entry wheel
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US20040095393A1 (en) * 2002-11-19 2004-05-20 Microsoft Corporation System and method for inputting characters using a directional pad
US6741235B1 (en) * 2000-06-13 2004-05-25 Michael Goren Rapid entry of data and information on a reduced size input area
US6801190B1 (en) * 1999-05-27 2004-10-05 America Online Incorporated Keyboard system with automatic correction
US7145554B2 (en) * 2000-07-21 2006-12-05 Speedscript Ltd. Method for a high-speed writing system and high -speed writing device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5689667A (en) * 1995-06-06 1997-11-18 Silicon Graphics, Inc. Methods and system of controlling menus with radial and linear portions
DE19843421B4 (en) * 1997-11-25 2007-07-05 Bayerische Motoren Werke Ag Device for selecting points of a menu structure consisting of menus and / or submenus and / or functions and / or function values

Also Published As

Publication number Publication date
US20050270269A1 (en) 2005-12-08
ATE332528T1 (en) 2006-07-15
WO2004046904A1 (en) 2004-06-03
DE60212976D1 (en) 2006-08-17
AU2003283447A1 (en) 2004-06-15
EP1422599A1 (en) 2004-05-26
CN1711517A (en) 2005-12-21
KR100712065B1 (en) 2007-05-02
KR20050065678A (en) 2005-06-29
DE60212976T2 (en) 2006-11-16
CN1711517B (en) 2010-05-05
US7973770B2 (en) 2011-07-05
EP1422599B1 (en) 2006-07-05

Similar Documents

Publication Publication Date Title
US7973770B2 (en) Method and user interface for entering characters
US9710162B2 (en) Apparatus and method for inputting character using touch screen in portable terminal
US8797192B2 (en) Virtual keypad input device
US8577100B2 (en) Remote input method using fingerprint recognition sensor
KR100539904B1 (en) Pointing device in terminal having touch screen and method for using it
US20070192730A1 (en) Electronic device, computer program product and method of managing application windows
US9104247B2 (en) Virtual keypad input device
EP0738950A1 (en) Data processing method and apparatus using a handwriting instrument
US20080166049A1 (en) Apparatus and Method for Handwriting Recognition
US20050219226A1 (en) Apparatus and method for handwriting recognition
WO2007012698A1 (en) Method of controlling software functions, electronic device, and computer program product
EP1236076A1 (en) A portable communication device and method
KR100343950B1 (en) Portable terminal having software keyboard for note-recognition and note-recognition method by pen input
KR20060135056A (en) Apparatus and method for handwriting recognition
JP3747022B2 (en) Control system using tactile input field
KR100704312B1 (en) Navigation pad of touch type and wireless communication device having the same
KR100565851B1 (en) Character Recognition Interface Device Input Character Recognition Method
KR100691815B1 (en) Method and device for inputting key by using motion sensor
KR101257889B1 (en) Apparatus and method of user input interface for portable device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION