WO2012072853A1 - Receiving scriber data - Google Patents

Receiving scriber data

Info

Publication number
WO2012072853A1
WO2012072853A1
Authority
WO
WIPO (PCT)
Prior art keywords
values
scriber
contact
period
accepted
Prior art date
Application number
PCT/FI2010/050983
Other languages
French (fr)
Inventor
Teemu KANKAANPÄÄ
Timo Nieminen
Original Assignee
Nokia Corporation
Priority date
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/FI2010/050983 priority Critical patent/WO2012072853A1/en
Publication of WO2012072853A1 publication Critical patent/WO2012072853A1/en


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present disclosure relates to the field of touch screens, associated methods, computer programs and apparatus.
  • Certain disclosed aspects/example embodiments relate to portable electronic devices, in particular, so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use).
  • Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs).
  • the portable electronic devices/apparatus may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/ Multimedia Message Service (MMS)/emailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture function (e.g. using a (e.g. in-built) digital camera), and gaming functions.
  • the touch screen displays one or a number of keys, icons or menu items (or other user interface (UI) elements) which each occupy a subsection of the touch screen area and correspond to a selectable function.
  • These functions may depend on the device and the mode of the device. For example, for a mobile phone in a text entry mode, the functions may include selecting letters for textual entry. For a laptop, the functions may include running an application.
  • the user must, for example, contact the scriber (stylus or finger, for example) accurately on an associated key or user interface element.
  • an apparatus comprising:
  • at least one processor; and
  • at least one memory including computer program code,
  • the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
  • receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
  • the scriber may, for example, comprise a finger, a finger nail, a thumb or a stylus.
  • the scriber data may comprise discrete data points or a continuous variable.
  • the touch screen may use stress, strain, conductivity, surface acoustic waves, capacitance and/or heat to determine the position and/or the pressure of the scriber with respect to the touch screen.
  • a touch gesture may comprise, for example, tapping the touch screen with the scriber, dragging the scriber across the touch screen and/or exerting a pressure on the touch screen.
  • Scriber input may comprise single-touch or multi-touch input.
  • a touch gesture may comprise a contact action on a virtual keyboard (or other input interface).
  • the contact action may be considered to comprise an initial contact location, a path along which contact with the touch-sensitive screen continues, and a final contact location at which contact of the scriber with the touch-sensitive screen is removed, forming an input stroke pattern according to the said recorded contact action.
  • the parameter values may relate to, for example, one or more of a time aspect, a velocity aspect, a speed aspect, a pressure aspect and/or a rate of change of pressure of the touch gesture.
  • the accepted values may comprise one or more values.
  • the rejected values may comprise one or more values.
  • the contact position may be considered to give a measure of the location of the scriber within the touch screen area.
  • the contact position may be given in terms of distances with respect to a reference point (e.g. 3.4cm along an x-axis and 2.5 cm along a y-axis), or with respect to a user interface element (e.g. the contact position is within (or corresponds to) the 'a' key).
  • the contact position may be used to select (or otherwise interact with) user interface elements corresponding to the determined contact position.
  • the received position values may be considered to give a measure of the location of the scriber within the touch screen area during the contact period.
  • the received position values may be given in terms of distances with respect to a reference point (e.g. 3.4cm along an x-axis and 2.5 cm along a y-axis), or with respect to a user interface element (e.g. the contact position is within (or corresponds to) the 'a' key).
  • the position values may be categorised based on whether the corresponding parameter values fall within one or more corresponding parameter ranges (or satisfy one or more criteria).
  • the corresponding parameter range may be bounded by a single value (e.g. the corresponding speed being greater than 0.01 m/s) or using multiple values. For example, multiple values could define a range: by giving an upper and lower bound (e.g. the corresponding pressure being between 100Pa and 500Pa); by giving a single value where the second value is implied (e.g. a speed less than 0.01 m/s, as speed would be at least 0 m/s); or by giving a single value with an appropriate tolerance (e.g. a time of 0.1±0.05s).
  • the one or more corresponding parameter ranges may comprise, for example:
  • an end absolute time range of predetermined duration (e.g. between 10 and 100ms) at the end of the contact period;
  • a beginning absolute time range (e.g. between 5 and 500ms) of predetermined duration at the beginning of the contact period;
  • an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion (e.g. between 5 and 30%) of the contact period;
  • a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion (e.g. between 15 and 40%) of the contact period;
  • a predetermined scriber velocity range (e.g. less than 0.1 m/s); and
  • a predetermined scriber pressure range (e.g. between 10Pa and 1500Pa), the scriber pressure being the pressure exerted by the scriber on the touch screen during the contact period.
  • the apparatus may be configured to categorise position values as rejected values which represent positions having (a sketch follows this list):
  • corresponding time parameter values within an end absolute time range of predetermined duration (e.g. between 10 and 100ms) at the end of the contact period;
  • corresponding time parameter values within a beginning absolute time range (e.g. between 5 and 500ms) of predetermined duration at the beginning of the contact period;
  • corresponding time parameter values within an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion (e.g. between 5 and 30%) of the contact period;
  • corresponding time parameter values within a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion (e.g. between 15 and 40%) of the contact period; and/or
  • corresponding velocity parameter values within a predetermined scriber velocity range (e.g. less than 0.1 m/s).
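  • As a rough illustration (not the patent's implementation; the sample layout, function names and default thresholds are assumptions), the following Python sketch applies the time-range criteria above to categorise samples as accepted or rejected:

        from dataclasses import dataclass

        @dataclass
        class Sample:
            t_ms: float   # time since contact initiation, in milliseconds
            x_mm: float   # x-axis position on the touch screen, in millimetres

        def categorise(samples, contact_period_ms,
                       begin_abs_ms=5.0,   # beginning absolute time range (e.g. 5-500 ms)
                       end_abs_ms=10.0,    # end absolute time range (e.g. 10-100 ms)
                       end_rel=0.10):      # end relative time range (e.g. 5-30% of the period)
            """Split time-ordered samples into (accepted, rejected) lists."""
            end_window_ms = max(end_abs_ms, end_rel * contact_period_ms)
            accepted, rejected = [], []
            for s in samples:
                in_begin = s.t_ms < begin_abs_ms
                in_end = s.t_ms > contact_period_ms - end_window_ms
                (rejected if (in_begin or in_end) else accepted).append(s)
            return accepted, rejected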
  • the apparatus may be configured to categorise, as rejected values, scriber data values which represent positions which the scriber traverses in a first direction at the beginning of the contact period and traverses in a second direction at the end of the contact period (see the figure 11a example below).
  • the apparatus may be configured to calculate derived corresponding parameter values from the received scriber data.
  • the calculation may comprise any mathematical operator (e.g. time integral) and/or function (e.g. sine, cosine function) applied to values of the received scriber data.
  • corresponding parameter values representing velocity may be derived from position values and corresponding time values (e.g. by differentiating the position values with respect to time).
  • the calculation of derived corresponding parameter values may occur before or after the categorisation of position values as accepted or rejected values.
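  • As a hedged sketch of such a derivation (names and units are assumptions), velocity values may be obtained from position and time values by finite differencing:

        def derive_velocities(times_ms, positions_mm):
            # Discrete differentiation of position with respect to time:
            # v_i = (x_(i+1) - x_i) / (t_(i+1) - t_i), here in mm/ms.
            return [(positions_mm[i + 1] - positions_mm[i]) /
                    (times_ms[i + 1] - times_ms[i])
                    for i in range(len(times_ms) - 1)]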
  • the determined contact position may be, for example, at least one of: the last accepted position value (ordered in time); the middle accepted position value; the mean, mode or median of the accepted position values; or the accepted position value corresponding to the highest applied pressure.
  • the apparatus may be configured to enable the performance of a function corresponding to the determined contact position.
  • the display screen may comprise user interface elements, each user interface element defined by a position value range, the apparatus configured to determine which user interface element corresponds to the determined contact position.
  • the determined contact position may be a particular user interface element.
  • the user interface elements may comprise, for example, one or more of a letter input key, a punctuation mark input key, a character input key, a numeric input key, a slider, an input region, an icon (e.g. an application icon), a menu item, a function key and a soft key.
  • Menu items may correspond to one or more of single or multiple software applications, contacts, website favourites, previous calls, TV channels, settings, games, names, addresses and any other items.
  • the apparatus may be configured such that the respective user interface elements (e.g. keys) provide for scriber input of characters (e.g. textual input).
  • characters may include, for example, numbers, punctuation marks and/or letters of the Roman, Greek, Arabic and/or Cyrillic alphabets.
  • the apparatus may be configured to enable input of Chinese or Japanese characters, either directly or via transcription methods such as Pinyin and/or Bopomofo (Zhuyin Fuhao).
  • the apparatus may provide for a key input area in one or more of English, Chinese, Japanese, Greek, Arabic, Indo-European, Oriental and Asiatic languages.
  • the apparatus may enable user interface elements to be scrolled or moved (e.g. drag and drop).
  • the apparatus may enable drawing or creating user-scribed delineations (e.g. handwriting, drawing Chinese characters and/or drawing pictures).
  • the apparatus may enable textual input methods wherein a user enters words by sliding a finger or stylus from letter to letter, lifting between words and/or phrases (e.g. Swype ® ).
  • the display may comprise a plurality of user interface elements wherein the apparatus is configured such that the respective user interface elements provide for scriber input of characters, with each indicium indicating the particular character which can be input.
  • the display may comprise a plurality of user interface elements, and wherein the apparatus is configured such that the user interface elements represent an alphabet key input area.
  • the display may comprise a plurality of user interface elements, and wherein the apparatus is configured such that the respective user interface elements represent an alphanumeric key input area.
  • the display may comprise a plurality of user interface elements, and wherein the apparatus is configured such that at least one of the user interface elements represents a single function (e.g. a QWERTY key input area) or a plurality of functions (e.g. a reduced keypad such as an ITU-T E.161 keypad).
  • the apparatus may be used for predictive text messaging (such as iTap), for disambiguating ambiguous key sequences (such as T9 or other predictive text on an ITU-T keypad) and/or multi-touch text messaging.
  • the apparatus may be configured such that the respective user interface elements provide for scriber input indicating the selection of an icon, with each indicium indicating the particular icon (e.g. relating to a corresponding software application) which can be selected.
  • the apparatus may be configured such that a series of user interface elements can be selected by contacting the screen with the scriber in one continuous motion, wherein, during the continuous motion, the scriber touches each of the user interface elements corresponding to the series of user interface elements.
  • other example embodiments may comprise physical buttons in place of or in addition to touch buttons/keys provided on a touch-screen.
  • the apparatus may be an electronic device or a module for an electronic device.
  • the apparatus, processor and/or memory may be incorporated into an electronic device.
  • the apparatus may comprise a touch screen interface for an electronic device or a module for an electronic device/display or a display of an electronic device. Examples of an electronic device include a portable electronic device, a laptop computer, a Satellite Navigation console, a desktop computer, a monitor, and a mobile phone.
  • Transmission of data may be via a network (by signalling).
  • the network may be, for example, the internet, a mobile phone network, a wireless network, LAN or Ethernet.
  • the apparatus may comprise a transmitter and/or receiver to interact with a network.
  • the transmitter/receiver may comprise, for example, an antenna, an Ethernet port, a LAN connection, a USB port, a radio antenna, a Bluetooth connector, an infrared port and/or a fibre optic detector/transmitter.
  • a method comprising:
  • receiving scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
  • a computer program comprising computer code configured to:
  • receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
  • the computer program may be stored on a recordable medium (e.g. DVD, CD, USB stick or other non-transitory computer-readable medium).
  • an apparatus comprising:
  • a means for receiving configured to receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
  • a means for categorising configured to categorise the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
  • a means for determining configured to determine at least one contact position for the touch gesture using the accepted values.
  • the invention may make tapping and typing on a touch screen more accurate and less prone to errors caused by unintentional gestures. As a result, typing may be faster, and there may be less erroneous selection of adjacent UI elements. This may also allow smaller touch screens, screens with higher resolution and more content, and/or smaller touch screen devices, to be made without sacrificing usability.
  • the present disclosure includes one or more corresponding aspects, example embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation.
  • Corresponding means for performing one or more of the discussed functions are also within the present disclosure.
  • Corresponding computer programs for implementing one or more of the methods disclosed are also within the present disclosure and encompassed by one or more of the described example embodiments.
  • the above summary is intended to be merely exemplary and non-limiting.
  • Figure 1 illustrates an example embodiment comprising a number of electronic components, including memory, a processor and a communication unit.
  • Figure 2 depicts an example embodiment with a touch screen and physical buttons.
  • Figure 3 illustrates a user interacting with the touch-screen as the user inputs a message.
  • Figure 4 depicts raw scriber data for a touch gesture.
  • Figure 5a illustrates the raw scriber data in graphical form.
  • Figure 5b illustrates the accepted data values.
  • Figure 6a depicts scriber data and calculated derived corresponding parameter values.
  • Figure 6b illustrates the accepted data values.
  • Figure 7a illustrates scriber data comprising position values, corresponding time values and corresponding pressure values.
  • Figure 7b illustrates the accepted data values.
  • Figure 8a depicts scriber data and derived parameter values representing the user interface elements corresponding to the position values.
  • Figure 8b illustrates the accepted data values.
  • Figure 9a depicts scriber data for an example embodiment configured to determine a plurality of contact positions based on velocity.
  • Figure 9b illustrates the accepted data values.
  • Figure 10a illustrates a user interacting with an example embodiment to enter text.
  • Figure 11a depicts scriber data for an example embodiment configured to detect movement along the same path in different directions.
  • Figure 11b illustrates the accepted data values.
  • Figure 12a depicts a flow diagram describing the method used to determine at least one contact position from scriber data.
  • Figure 13a illustrates schematically a computer readable medium providing a program according to an example embodiment of the present invention.

Description of Example Aspects/Example embodiments
  • For example, feature number 1 can also correspond to numbers 101, 201, 301 etc. These numbered features may appear in the figures but may not have been directly referred to within the description of these particular example embodiments. These have still been provided in the figures to aid understanding of the other example embodiments, particularly in relation to the features of similar described example embodiments.
  • When a user interacts with a touch screen, it may be, for example, to select or otherwise interact with an object displayed on the screen. To do this the user generally has to position the scriber (e.g. his finger) on the displayed object to select the displayed object. However, as the user is interacting with the screen the scriber may move around unintentionally. This movement may result in, for example, an unintentional selection of an adjacent displayed object. This is particularly an issue when the touch screen displayed objects are small (or the unintentional movements are large (e.g. users with limited coordination)).
  • Example embodiments described herein may be considered to provide a way of filtering out unwanted or unintentional gestures (e.g. at the end of a single touch gesture (or 'tap- up')).
  • Example embodiments may be configured to record the position of the scriber throughout a single touch gesture, then ignore certain touched positions based on one or more predetermined criteria.
  • Figure 1 depicts an example embodiment (101) of an apparatus, such as a mobile phone or personal digital assistant, comprising a touch screen (105) such as, for example, a Projected Capacitive Touch Screen.
  • the apparatus (101) may comprise a module for a mobile phone (or PDA, audio/video player or other suitable device), and may just comprise a suitably configured memory (107) and processor (108) (see below).
  • the apparatus (101) of Figure 1 is configured such that it may receive, include, and/or otherwise access data.
  • this example embodiment (101) comprises a communications unit (103), such as a receiver, transmitter, and/or transceiver, in communication with an antenna (102) for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of networks.
  • This example embodiment comprises a memory (107) that stores data, possibly after being received via an antenna (102) or port or after being generated at the touch screen user interface (105b).
  • the touch screen user interface allows the user to interact with the touch screen (105) to generate scriber data, the scriber data representing scriber input (that is, scriber input is received at the touch screen from the scriber).
  • the processor (108) may receive the data representing scriber input from the touch screen user interface (105b), from the memory (107), or from the communication unit (103). Regardless of the origin of the data, these data may be outputted to a user of the apparatus (101) via the touch screen (105) display device (105a), and/or any other output devices provided with the apparatus.
  • the processor (108) may also store the data for later use in the memory (107).
  • the memory (107) may store computer program code and/or applications which may be used to instruct/enable the processor (108) to perform functions (e.g. the receiving, categorising and determining functions described herein).
  • Figure 2 depicts the outward appearance of the example embodiment of figure 1 comprising a portable electronic device (101), such as a mobile phone, with a user interface comprising a touch-screen (105), a physical keypad (106) comprising buttons, a memory (not shown) and a processor (not shown).
  • the portable electronic device is configured to allow the user to interact with the portable electronic device with his/her finger on the touch screen (105). It will be appreciated that in other suitably adapted example embodiments the user may interact with the touch screen using a stylus.
  • Figure 3 depicts a user interacting with the touch screen (105) of the example embodiment of figure 2.
  • This example embodiment is configured to enable text entry.
  • this example embodiment has a conventional text entry mode wherein the touch-screen (105) is divided into two regions.
  • the bottom (keyboard) region (112) of the screen is where the user can select letters by pressing his finger (i.e. the user's finger (120) is the scriber in this example) on the screen.
  • the top (text entry) region of the screen (111) is configured to display the letters entered into the device using the scriber.
  • the bottom (keyboard) region comprises touch sensitive user interface elements (keys) which are fixed with respect to the screen. It will be appreciated that for other example embodiments determining at least one contact position corresponding to the scriber interaction may enable user interface elements to be scrolled or moved (e.g. drag and drop). It will be appreciated that for other example embodiments, determining at least one contact position corresponding to the scriber interaction may facilitate drawing (e.g. handwriting, drawing Chinese characters, drawing pictures).
  • the scriber (123) is the user's finger.
  • the touch screen (105) generates scriber data.
  • the scriber data comprises both the position of the scriber (measured in millimetres) on the screen and the corresponding time (measured in milliseconds), wherein the time is measured from the initial contact with the screen for this touch gesture.
  • the position of the scriber (x, y) (122) is given with respect to a reference point (121) on the surface of the touch screen (along two axes).
  • the x-axis and y-axis lie parallel to the bottom edge (124) of the touch screen and the side edge (125) of the touch screen respectively.
  • the scriber data for a touch gesture is transmitted to the processor of the device.
  • the scriber data is processed by the processor of the example embodiment.
  • Figure 4 depicts the raw scriber data in tabulated form and figure 5a shows the same scriber data (540) in graphical form.
  • the position values (432) relate to the x-axis only.
  • the scriber data (540) gives the position values, x, (432) and the corresponding time values, t, (431).
  • the position values are measured at discrete intervals (in this case the intervals are 2 ms). It will be appreciated that other example embodiments may use one or more discrete and/or continuous variables.
  • the user has moved slightly as he initiated the touch gesture (541), coming to rest at a particular position some time later. The user holds this particular position for a period before unintentionally moving the scriber as he lifts his finger and completes the touch gesture (contact release (542)).
  • the touch gesture has a contact period which is the duration of time from contact initiation and contact release.
  • the contact period (543) is 52ms.
  • the apparatus is configured to categorise the position values, as either accepted values or rejected values, based on the (corresponding parameter values of) time (431) within the contact period. A graphical form of the accepted values and associated time values is shown in figure 5b.
  • If the corresponding parameter (time) value is within an end absolute time range (545) (e.g. 10 ms) of the end of the touch contact period (543), the corresponding positional value is categorised as a rejected value for the touch gesture.
  • If the time is not within the end absolute time range (544) (e.g. 10 ms) of the end of the contact period (543), the corresponding positional data is categorised as an accepted value.
  • the end absolute time range could be any predetermined absolute duration of time (e.g. between 5 and 50ms). It will be appreciated that the scriber data could be categorised according to an end relative time range. An end relative time range could be any fraction of the contact period. For example, any positional data corresponding to a time within 10% (or other value such as between 1 and 30%) of the contact period at the end of the contact period may be categorised as rejected values. It will be appreciated that a corresponding parameter (time) range (used as a criterion for categorisation) may not be at the end of the contact period (e.g. it may be at the beginning or in the middle of the contact period).
  • This example embodiment uses a single criterion, other example embodiments may use a plurality of criteria.
  • This example embodiment is configured to determine a contact position (546) for the touch gesture using the accepted (position) values.
  • the contact position is determined to be the last (temporally ordered) accepted position value. In this case the contact position is determined to be 2.8mm. It will be appreciated that for other example embodiments the determined contact position may be calculated by taking the mean of the accepted scriber data, the mode of the scriber data, the median of the scriber data or some other method.
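  • A minimal sketch of these alternative contact-position rules (assuming the accepted position values are already ordered by time; not from the patent itself):

        import statistics

        def contact_position(accepted_x, method="last"):
            if method == "last":
                return accepted_x[-1]              # e.g. 2.8 mm in the figure 5 example
            if method == "mean":
                return statistics.mean(accepted_x)
            if method == "median":
                return statistics.median(accepted_x)
            if method == "mode":
                return statistics.mode(accepted_x)
            raise ValueError(f"unknown method: {method}")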
  • Figure 6 shows the same scriber data input (640) into a further example embodiment.
  • the scriber data (640) is represented by the top graph.
  • the scriber data (640) in this case, as in the previous example, was generated by the touch screen and comprises values for the (x-axis) position and the corresponding (parameter) time.
  • this example embodiment is configured to calculate derived corresponding (velocity) parameter values (649) and categorise the position values (632) (as accepted or rejected values) based on these derived corresponding (velocity) parameter values.
  • each position value (632), x_i, has a corresponding time value (631), t_i.
  • the labelling variable, i, denotes the order of the (time or position) value in the sequence ordered by time. That is, the time value (631), t_(i+1), is the next time value in the (temporal) sequence directly after the time value (631), t_i.
  • This example embodiment calculates a corresponding (velocity) parameter value (633), v_i, for each pair of time (631) and position (632) values as:

        v_i = (x_(i+1) − x_i) / (t_(i+1) − t_i)
  • the calculated velocity values (633) corresponding to the scriber data (640) are shown in figure 6 on the bottom graph.
  • the top (position versus time) and the bottom (velocity versus time) graphs have equivalent time axis scales.
  • the example embodiment is configured to categorise the positional values (632) according to criteria based on the corresponding derived (velocity) parameter values (633).
  • each position value (632) is categorised as either an accepted value or a rejected value on the basis of whether the corresponding velocity value (633) is greater than, or not greater than, a predetermined velocity threshold, v_thresh (650).
  • the accepted values and corresponding time values are depicted in graphical form in figure 6b.
  • Since velocity is a vector quantity, it will be appreciated that different velocity thresholds may be used for movements in different directions.
  • this example embodiment is configured to determine a contact position for the touch gesture.
  • the contact position (646) is determined by determining the middle accepted (position) value ordered by time (or by the labelling variable, i). That is, if there are N accepted values, the contact position is the position value corresponding to the (N/2)th accepted value.
  • Because the time interval between successive time values is the same, the time period before the middle accepted (position) value (647) is of the same duration as the time period after the middle (accepted) position value (648).
  • the middle accepted (position) value (646) corresponds to a determined contact position of 2.8 mm.
  • the time interval may vary as a function of time, in which case the middle accepted value or the accepted value corresponding to the middle time may be used.
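  • The middle-value rule could be sketched as follows (integer floor indexing for the (N/2)th value is an assumption):

        def middle_accepted(accepted_x):
            # With N accepted values ordered by time, return the (N/2)th one.
            return accepted_x[len(accepted_x) // 2]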
  • For a tap touch gesture (e.g. a touch of short duration), the pressure of the scriber on the touch screen may quickly decrease during tap up, changing the touch position. It may be desirable to categorize as rejected data those position values which correspond to, for example, low pressure contact.
  • Figure 7a shows scriber data, for a touch gesture with a contact period (743), input (740) into such a further example embodiment.
  • the scriber data (740) is represented by the top (position versus time) graph and the bottom (pressure versus time) graph.
  • the touch screen is configured to generate parameter values relating to a plurality of corresponding parameters.
  • the scriber data in this case comprise (corresponding parameter) time values (731), position values of the scriber on the touch screen (732), and (corresponding parameter) pressure values of the scriber on the screen (734). In this case both the position and pressure are measured continuously (i.e. the time values (731) represent a continuous variable).
  • the example embodiment is configured to categorise the position data values (732) based on a plurality of criteria. That is, the example embodiment is configured to categorise the position data values (732) based on both the corresponding (pressure) parameter values (734) and the corresponding (time) parameter values (731). That is, a position data value, x, is categorised as a rejected value if:
  • the corresponding pressure value, P(x), is below a predetermined pressure threshold; or
  • the corresponding time value, t(x), is within an end relative time range of the end of the contact period (745).
  • Otherwise the position data (732) is categorised as an accepted value.
  • the accepted data (749) corresponding to the accepted (position) values is shown in figure 7b.
  • this example embodiment is configured to determine a contact position (746).
  • the example embodiment determines the accepted (position) value which corresponds to the highest applied pressure, P_max (756).
  • the contact position is determined to be 2.8mm.
  • the example embodiment may determine which graphical user interface element corresponds to this determined contact position (744) and enable the performance of the corresponding function.
  • the end relative time range is 10% of the total contact period. It will be appreciated that other relative time range values could be used (e.g. between 2 and 30%).
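  • A sketch of this figure 7 behaviour (the tuple layout, threshold and window defaults are assumptions): reject low-pressure samples and samples in the end relative time range, then take the accepted value with the highest pressure:

        def pressure_contact_position(samples, contact_period_ms,
                                      p_thresh_pa=10.0, end_rel=0.10):
            # samples: time-ordered list of (t_ms, x_mm, p_pa) tuples
            cutoff_ms = contact_period_ms * (1.0 - end_rel)
            accepted = [(t, x, p) for (t, x, p) in samples
                        if p >= p_thresh_pa and t <= cutoff_ms]
            _, x, _ = max(accepted, key=lambda s: s[2])   # highest applied pressure, P_max
            return x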
  • the criterion related to the time ensures that movements, at the end of the contact period, which are of short duration in comparison to the duration of the complete touch from touch initiation to touch release are categorized as rejected values.
  • example embodiments may be configured to detect or calculate (derive) the rate of change of pressure.
  • the rate of change of pressure may also be used as a corresponding parameter, which along with a corresponding parameter range may be used to categorize the position values as accepted values or rejected values.
  • In some uses, high rates of change in pressure may be associated with rejection; in other uses, with acceptance.
  • the determined contact position may denote a range of positions corresponding to a user interface element, rather than a specific position.
  • Figure 8a shows scriber data input into a further example embodiment, the example embodiment configured to determine a contact position corresponding to a user interface element.
  • the scriber data (840) is represented by the top (position versus time) graph.
  • This example embodiment is configured to recognise a range of positions relating to a user interface element. For example an x-position value between 2.4cm and 3.2cm (from a reference point) may correspond to an 'a' key user interface element.
  • the user interface elements make up a QWERTY keyboard for textual entry.
  • the user has touched three distinct user interface elements (or buttons B_1, B_2 and B_3) during the contact period.
  • position values between x_0 and x_1 correspond to user interface element B_1 (in this example B_1 corresponds to the letter 'q'); position values between x_1 and x_2 correspond to user interface element B_2 (in this example B_2 corresponds to the letter 'w'); and
  • position values between x_2 and x_3 correspond to user interface element B_3 (in this example B_3 corresponds to the letter 'e').
  • the apparatus can calculate (or derive) the corresponding button (position) value (835), wherein the corresponding button (position) value (835) denotes which virtual button (or other user interface element) the scriber is interacting with.
  • the example embodiment is configured to categorise the position data values (832) based on the corresponding time values (831). That is, a position data value is categorised as a rejected value if the corresponding (discrete) time value, t_i, (831) is within a relative time range (845) of the end of the contact period. Otherwise the position value (832) is categorised as an accepted value. In this case the relative time range is 20% of the total contact period. It will be appreciated that other relative time range values could be used.
  • the categorising of the position data values could be performed before or after deriving the corresponding button (position) values (835). From the accepted (position) values (shown in figure 8b) this example embodiment is configured to determine a contact position. However, unlike the previous case where the determined contact position was determined to be an absolute position value, in this example embodiment, the contact position is determined to be a button position value. That is, this example embodiment is configured to determine the last button position value corresponding to the accepted position values. In this example the contact position is determined to correspond to user interface element B_2. This example embodiment is then configured to activate the function corresponding to the determined contact position. In this case, the letter 'w' is entered into the text message. By rejecting data within a (relative or absolute) time range at the end of the contact period, scriber user interface element changes during touch release (e.g. immediately before the finger loses touch with the screen) may be ignored.
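  • A sketch of this button-mapping step (the key ranges below are illustrative assumptions, not values from the patent):

        KEY_RANGES_CM = {"q": (1.6, 2.4), "w": (2.4, 3.2), "e": (3.2, 4.0)}

        def last_button(accepted_x_cm):
            # Derive a button (position) value for each accepted x position,
            # then select the last one (temporally ordered).
            def button_for(x):
                for key, (lo, hi) in KEY_RANGES_CM.items():
                    if lo <= x < hi:
                        return key
                return None
            buttons = [b for b in map(button_for, accepted_x_cm) if b is not None]
            return buttons[-1] if buttons else None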
  • example embodiments may enable selection of all of the user interface elements corresponding to accepted position values.
  • all of B_1, B_2 and B_3 would be selected.
  • other example embodiments may be configured to recognize turning points (or stationary points such as local maxima (839), local minima (838) and/or points of inflection) in the parameter values.
  • the contact position could be determined to correspond to the position (or positions) at which there is a turning point.
  • this determination may result in only buttons B_2 and B_3 being selected for the example scriber data.
  • turning points in other corresponding parameter values (e.g. pressure, velocity, speed) may also be used.
  • Figure 9a depicts the scriber data generated by a user interacting with a further example embodiment configured to enable text entry. This example embodiment is configured to enable a series of characters (e.g. letters) to be input using a single touch gesture.
  • the scriber data (940) in this case comprises position values (932) and (discrete) time values (931) (as for the example embodiment of figure 3). From this scriber data (940), this example embodiment is configured to calculate corresponding (scalar) speed values (937), s_i, using the equation:

        s_i = |x_(i+1) − x_i| / (t_(i+1) − t_i)
  • the example embodiment is configured to categorise, as rejected values, those position values which have corresponding speed values exceeding a threshold speed value, s_thresh (952). Those position values which do not have corresponding speed values exceeding the threshold speed value are categorised as accepted values (shown in figure 9b).
  • the accepted values comprise two distinct and consecutive sets of accepted (position) values (971 and 972).
  • the apparatus is configured to determine a contact position for each consecutive set of accepted (position) values (971 and 972).
  • Each consecutive set of accepted position values (971 and 972) consists of position values which correspond to a successive and uninterrupted series of time values (ordered by time) representing a continuous period of time.
  • For each consecutive set, the apparatus determines the most common position value. This example embodiment then determines the user interface element (character) corresponding to each of the determined contact positions. The example embodiment enables the performance of the function (e.g. selecting a letter for text entry) associated with each of the corresponding user interface elements.
  • the user interface elements may correspond to menu items, or icons. It will be appreciated that when a plurality of user interface elements are selected, the corresponding functions may be performed in the order in which they were selected (e.g. selecting characters for text entry) or not in the order in which they were selected (e.g. a user running a plurality of independent programs by selecting them using a single touch gesture).
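  • The figure 9 behaviour might be sketched as follows (the data layout is an assumption; the grouping into consecutive accepted sets and the use of the most common value follow the text above):

        from itertools import groupby
        from statistics import mode

        def contact_positions(samples, s_thresh):
            # samples: time-ordered list of (x_mm, speed) pairs
            positions = []
            for accepted, run in groupby(samples, key=lambda p: p[1] <= s_thresh):
                if accepted:   # a consecutive set of accepted position values
                    positions.append(mode(x for x, _ in run))
            return positions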
  • Figure 10 depicts the user of the example embodiment of figure 9 entering text into a textual message (e.g. email, SMS message, MMS message or text message).
  • this example embodiment has a text entry mode wherein the touchscreen (1005) is divided into two regions.
  • the bottom (keyboard) region (1012) of the screen is where the user can select a series of user interface elements (or letters) on the screen.
  • the user's finger (1023) is the scriber in this example.
  • the top (text entry) region of the screen (1011) is configured to display the letters entered into the device using the scriber.
  • Figure 10 depicts the position of the user's finger (or scriber (1023)) at the end of the contact action (1090) (or touch gesture) corresponding to the word 'map'.
  • the user initiated contact with the screen at the key corresponding to the (letter) character 'm'.
  • After a pause (or sufficiently slow movement) on that key, the user moved his finger across the keys to the (letter) character 'a'.
  • all of the position values (and corresponding user interface elements (1011)) touched between the (letter) characters 'm' and 'a' would be categorised as rejected (e.g. because their corresponding speed values exceed the threshold speed value).
  • this example embodiment may be used for predictive text messaging (such as iTap), for disambiguating ambiguous key sequences (such as T9 or other predictive text on an ITU-T keypad) and/or multi-tap text messaging.
  • error- correction technologies could be activated (e.g. for each word) by completing the touch gesture (contact release), and/or by selecting a particular user interface element (e.g. a space bar).
  • Figure 11a depicts the scriber data (1180) for an example embodiment configured to categorise as rejected those positions corresponding to the scriber moving back and forth along the same path during (and/or after) touch initiation and during (and/or before) touch release.
  • the desired touch position may be where the motion reverses direction. This kind of motion may be detected when typing quickly with the blunt, soft tip of a finger.
  • Figure 11a depicts the scriber data (1140) for a touch gesture input into an example embodiment configured to reject position data values (1181) corresponding to a set of positions which the scriber traverses in a first direction at the beginning of the contact period, and traverses in a second direction at the end of the contact period.
  • Figure 11a depicts both the x and y positions corresponding to a touch gesture.
  • the scriber is first moved through a set of position values (1181) in a first direction. Then the scriber pauses on a particular position for a period of time. Then the scriber traverses the same set of position values (1181) in a second direction immediately prior to touch release.
  • the scriber traverses the set of position values in the second direction at a greater velocity than when traversing the set of position values in the first direction.
  • the example embodiment is configured to compare the series of position values at (or just after) contact initiation and at (or just before) contact release and categorise as rejected those series of position values which have been traversed in different directions at these times. From the remaining accepted position values (figure 11b) at least one contact position is determined. For this example embodiment the contact position is determined to be the mean position of the accepted position values.
  • the desired touch position may be more accurately determined.
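  • A sketch of this back-and-forth filter (exact sample-for-sample retracing and a fixed comparison window are simplifying assumptions):

        from statistics import mean

        def filtered_mean_position(xs, window=5):
            # xs: time-ordered position values for the contact period
            head, tail = xs[:window], xs[-window:]
            if head == tail[::-1]:        # same path traversed in opposite directions
                xs = xs[window:-window]   # reject the retraced positions
            return mean(xs)               # contact position: mean of accepted values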
  • the apparatus may ignore unintentional touches to the key/menu item/icon adjacent to the intended key by considering the parameter values of the received scriber data. For example, when typing on a virtual keyboard, the user's finger may rotate or slide while tapping down and when lifting the finger, causing the tap-up position, which may be used to detect the touch position, to change unintentionally. By categorising the position data as accepted or rejected, these unintentional touches may be ignored for the purposes of calculating a contact position.
  • Figure 12 shows a flow diagram illustrating how a contact position is determined using scriber data for a touch gesture.
  • Figure 13 illustrates schematically a computer/processor readable media 1200 providing a program according to an example embodiment of the present invention.
  • the computer/processor readable media is a disc such as a digital versatile disc (DVD) or a compact disc (CD).
  • the computer readable media may be any media that has been programmed in such a way as to carry out an inventive function.
  • any mentioned apparatus/device and/or other features of particular mentioned apparatus/device may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched off) state and only load the appropriate software in the enabled (e.g. switched on) state.
  • the apparatus may comprise hardware circuitry and/or firmware.
  • the apparatus may comprise software loaded onto memory.
  • Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/functional units.
  • a particular mentioned apparatus/device/server may be pre-programmed with the appropriate software to carry out desired operations, and wherein the appropriate software can be enabled for use by a user downloading a "key", for example, to unlock/enable the software and its associated functionality.
  • Advantages associated with such example embodiments can include a reduced requirement to download data when further functionality is required for a device, and this can be useful in examples where a device is perceived to have sufficient capacity to store such preprogrammed software for functionality that may not be enabled by a user.
  • any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and that these functions may be performed by the same apparatus/circuitry/elements/processor.
  • One or more disclosed aspects may encompass the electronic distribution of associated computer programs and computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal).
  • any "computer” described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same device. In some example embodiments one or more of any mentioned processors may be distributed over a plurality of devices. The same or different processor/processing elements may perform one or more functions described herein.
  • the term “signalling” may refer to one or more signals transmitted as a series of transmitted and/or received signals.
  • the series of signals may comprise one, two, three, four or even more individual signal components or distinct signals to make up said signalling. Some or all of these individual signals may be transmitted/received simultaneously, in sequence, and/or such that they temporally overlap one another.
  • With respect to any mentioned processor and memory (e.g. including ROM, CD-ROM etc.), these may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way as to carry out the inventive function.

Abstract

An apparatus, method and computer program for: receiving scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture; categorising the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and determining at least one contact position for the touch gesture using the accepted values.

Description

RECEIVING SCRIBER DATA
Technical Field
The present disclosure relates to the field of touch screens, associated methods, computer programs and apparatus. Certain disclosed aspects/example embodiments relate to portable electronic devices, in particular, so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use). Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs).
The portable electronic devices/apparatus according to one or more disclosed aspects/example embodiments may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/ Multimedia Message Service (MMS)/emailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture function (e.g. using a (e.g. in-built) digital camera), and gaming functions.

Background
It is now not unusual for electronic devices to utilise touch screen technology to provide a user interface for enabling a user to interact with the device. For example, the touch screen displays one or a number of keys, icons or menu items (or other user interface (Ul) elements) which each occupy a subsection of the touch screen area and correspond to a selectable function. These functions may depend on the device and the mode of the device. For example, for a mobile phone in a text entry mode, the functions may include selecting letters for textual entry. For a laptop, the functions may include running an application. To interact with the functions (e.g. to select a function) the user must, for example, contact the scriber (stylus or finger, for example) accurately on an associated key or user interface element.
The listing or discussion of a prior-published document or any background in this specification should not necessarily be taken as an acknowledgement that the document or background is part of the state of the art or is common general knowledge. One or more aspects/example embodiments of the present disclosure may or may not address one or more of the background issues.

Summary
In a first aspect, there is provided an apparatus comprising:
at least one processor; and
at least one memory including computer program code,
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
categorise the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
determine at least one contact position for the touch gesture using the accepted values.
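As a non-authoritative illustration of this receive/categorise/determine flow, the following Python sketch uses a simple end-of-contact time window as the predetermined parameter value criterion (the data layout, default window and choice of the last accepted value are assumptions for the example):

    def determine_contact_position(scriber_data, contact_period_ms, end_abs_ms=10.0):
        # scriber_data: time-ordered (t_ms, x_mm) pairs for one touch gesture (receive)
        accepted = [x for (t, x) in scriber_data
                    if t <= contact_period_ms - end_abs_ms]    # categorise
        return accepted[-1]                                    # determine: last accepted value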
The scriber may, for example, comprise a finger, a finger nail, a thumb or a stylus.
The scriber data may comprise discrete data points or a continuous variable. The touch screen may use stress, strain, conductivity, surface acoustic waves, capacitance and/or heat to determine the position and/or the pressure of the scriber with respect to the touch screen.
A touch gesture may comprise, for example, tapping the touch screen with the scriber, dragging the scriber across the touch screen and/or exerting a pressure on the touch screen. Scriber input may comprise single-touch or multi-touch input.
A touch gesture may comprise a contact action on a virtual keyboard (or other input interface). The contact action may be considered to comprise an initial contact location, a path along which contact with the touch-sensitive screen continues, and a final contact location at which contact of the scriber with the touch-sensitive screen is removed, forming an input stroke pattern according to the said recorded contact action. The parameter values may relate to, for example, one or more of a time aspect, a velocity aspect, a speed aspect, a pressure aspect and/or a rate of change of pressure of the touch gesture.
The accepted values may comprise one or more values. The rejected values may comprise one or more values.
The contact position may be considered to give a measure of the location of the scriber within the touch screen area. The contact position may be given in terms of distances with respect to a reference point (e.g. 3.4cm along an x-axis and 2.5 cm along a y-axis), or with respect to a user interface element (e.g. the contact position is within (or corresponds to) the 'a' key). The contact position may be used to select (or otherwise interact with) user interface elements corresponding to the determined contact position.
The received position values may be considered to give a measure of the location of the scriber within the touch screen area during the contact period. The received position values may be given in terms of distances with respect to a reference point (e.g. 3.4cm along an x-axis and 2.5 cm along a y-axis), or with respect to a user interface element (e.g. the contact position is within (or corresponds to) the 'a' key).
The position values may be categorised based on whether the corresponding parameter values fall within one or more corresponding parameter ranges (or satisfy one or more criteria). The corresponding parameter range may be bounded by a single value (e.g. the corresponding speed being greater than 0.01 m/s) or by multiple values. For example, multiple values could define a range:
by giving an upper and lower bound (e.g. the corresponding pressure being between 100Pa and 500Pa);
by giving a single value where the second value is implied (e.g. a speed less than 0.01 m/s, as speed would be at least 0 m/s); or
by giving a single value with an appropriate tolerance (e.g. a time of 0.1 ±0.05s).
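As an illustrative aside (not from the disclosure), these three ways of defining a range map onto simple predicates; the numeric bounds below are taken from the examples above:

```python
# Illustrative predicates for the three ways of defining a range above.

def in_bounded_range(pressure_pa):            # upper and lower bound
    return 100.0 <= pressure_pa <= 500.0

def below_implied_bound(speed_m_s):           # second bound implied (speed >= 0)
    return speed_m_s < 0.01

def within_tolerance(time_s, centre=0.1, tol=0.05):   # value +/- tolerance
    return abs(time_s - centre) <= tol

print(in_bounded_range(250.0), below_implied_bound(0.005), within_tolerance(0.12))
# -> True True True
```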
The one or more corresponding parameter ranges may comprise, for example:
an end absolute time range of predetermined duration (e.g. between 10 and 100ms) at the end of the contact period;
a beginning absolute time range (e.g. between 5 and 500ms) of predetermined duration at the beginning of the contact period;
an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion (e.g. between 5 and 30%) of the contact period;
a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion (e.g. between 15 and 40%) of the contact period;
a predetermined scriber velocity range (e.g. less than 0.1 m/s); and
a predetermined scriber pressure range (e.g. between 10Pa and 1500Pa), the scriber pressure being the pressure exerted by the scriber on the touch screen during the contact period.
The apparatus may be configured to categorise position values as rejected values which represent positions having:
corresponding time parameter values within an end absolute time range of predetermined duration (e.g. between 10 and 100ms) at the end of the contact period;
corresponding time parameter values within a beginning absolute time range (e.g. between 5 and 500ms) of predetermined duration at the beginning of the contact period;
corresponding time parameter values within an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion (e.g. between 5 and 30%) of the contact period;
corresponding time parameter values within a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion (e.g. between 15 and 40%) of the contact period;
corresponding velocity parameter values within a predetermined scriber velocity range (e.g. less than 0.1 m/s); and/or
corresponding pressure parameter values within a predetermined scriber pressure range (e.g. between 10Pa and 1500Pa), the scriber pressure being the pressure exerted by the scriber on the touch screen during the contact period.

The apparatus may be configured to categorise, as rejected values, scriber data values which represent positions which the scriber
traverses in a first direction at the beginning of the contact period; and
traverses in a second direction at the end of the contact period.

The apparatus may be configured to:
calculate derived corresponding parameter values from the received scriber data. The calculation may comprise any mathematical operator (e.g. time integral) and/or function (e.g. sine, cosine function) applied to values of the received scriber data. For example, corresponding parameter values representing velocity may be derived from position values and corresponding time values (e.g. by differentiating the position values with respect to time). The calculation of derived corresponding parameter values may occur before or after the categorisation of position values as accepted or rejected values.
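For illustration only (the function name and units are assumptions), one possible implementation of such a derivation, a finite-difference velocity computed from the received position and time values as suggested above, might look like this:

```python
# Hypothetical sketch: deriving velocity values from received position
# and time values by finite differences.

def derive_velocities(times_s, positions_m):
    """v_i = (x_{i+1} - x_i) / (t_{i+1} - t_i) for each consecutive pair."""
    return [(x1 - x0) / (t1 - t0)
            for (t0, x0), (t1, x1)
            in zip(zip(times_s, positions_m), zip(times_s[1:], positions_m[1:]))]

print(derive_velocities([0.0, 0.002, 0.004], [0.0020, 0.0026, 0.0028]))
# -> [0.3..., 0.1...] (m/s)
```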
The determined contact position may be, for example, at least one of:
the average position value of the accepted values;
the most common position value of the accepted values;
the middle position value of the accepted values, ordered by at least one corresponding parameter; and
the last position value of the accepted values, ordered by time.
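Purely as an illustration (not part of the disclosure; the sample data is invented), these four options might be implemented as follows:

```python
# Illustrative implementations of the four contact-position options above.
from statistics import mean, multimode

samples = [(0, 2.0), (2, 2.6), (4, 2.8), (6, 2.8), (8, 2.8)]  # (time_ms, x_mm)
xs = [x for _, x in samples]                       # accepted position values

average     = mean(xs)                             # average position value
most_common = multimode(xs)[0]                     # most common position value
middle      = xs[len(xs) // 2]                     # middle value, ordered by time
last        = samples[-1][1]                       # last value, ordered by time
print(average, most_common, middle, last)          # -> 2.6 2.8 2.8 2.8
```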
The apparatus may be configured to:
calculate a respective contact position for each subset of the accepted values representing the position of the scriber over a continuous period of time.
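A minimal sketch of such subset handling (assuming discrete samples indexed in time order; all names are hypothetical and not from the disclosure):

```python
def contiguous_subsets(indexed_samples):
    """Group accepted samples whose indices are consecutive, i.e. which
    represent the scriber position over a continuous period of time."""
    runs, run = [], []
    for i, s in indexed_samples:
        if run and i != run[-1][0] + 1:        # gap: a rejected sample between
            runs.append(run)
            run = []
        run.append((i, s))
    if run:
        runs.append(run)
    return runs

accepted = [(0, 2.0), (1, 2.1), (4, 5.0), (5, 5.0)]   # indices 2-3 rejected
print([[s for _, s in r] for r in contiguous_subsets(accepted)])
# -> [[2.0, 2.1], [5.0, 5.0]]
```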
The display screen may comprise user interface elements, each user interface element defined by a position value range, the apparatus configured to:
associate at least one of the determined contact positions with a respective user interface element by correlating the determined contact position and the position value range for a particular user interface element; and
select the user interface element corresponding to the correlated contact position. The determined contact position may itself be expressed as a particular user interface element.
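Illustratively (the key ranges below are invented, not from the disclosure), associating a determined contact position with a user interface element could be a simple interval lookup:

```python
# Illustrative association of a determined contact position with a user
# interface element defined by a position value range (values hypothetical).

UI_ELEMENTS = {            # element -> (x_min_mm, x_max_mm)
    "q": (0.0, 2.4),
    "w": (2.4, 3.2),
    "e": (3.2, 4.0),
}

def select_element(contact_position_mm):
    for element, (lo, hi) in UI_ELEMENTS.items():
        if lo <= contact_position_mm < hi:
            return element
    return None

print(select_element(2.8))   # -> 'w'
```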
The user interface elements may comprise, for example, one or more of a letter input key, a punctuation mark input key, a character input key, a numeric input key, a slider, an input region, an icon (e.g. an application icon), a menu item, a function key and a soft key. Menu items may correspond to one or more of single or multiple software applications, contacts, website favourites, previous calls, TV channels, settings, games, names, addresses and any other items.
The apparatus may be configured such that the respective user interface elements (e.g. keys) provide for scriber input of characters (e.g. textual input). The characters may include, for example, numbers, punctuation marks and/or letters of the Roman, Greek, Arabic and/or Cyrillic alphabets. The apparatus may be configured to enable input of Chinese or Japanese characters, either directly or via transcription methods such as Pinyin and/or Bopomofo (Zhuyin Fuhao). The apparatus may provide for a key input area in one or more of English, Chinese, Japanese, Greek, Arabic, Indo-European, Oriental and Asiatic languages.
The apparatus may enable user interface elements to be scrolled or moved (e.g. drag and drop). The apparatus may enable drawing or creating user-scribed delineations (e.g. handwriting, drawing Chinese characters and/or drawing pictures). The apparatus may enable textual input methods wherein a user enters words by sliding a finger or stylus from letter to letter, lifting between words and/or phrases (e.g. Swype®).
The display may comprise a plurality of user interface elements wherein the apparatus is configured such that the respective user interface elements provide for scriber input of characters, with each indicium indicating the particular character which can be input.
The display may comprise a plurality of user interface elements, wherein the apparatus is configured such that the user interface elements represent an alphabet key input area. The display may comprise a plurality of user interface elements, wherein the apparatus is configured such that the respective user interface elements represent an alphanumeric key input area. The display may comprise a plurality of user interface elements, wherein the apparatus is configured such that at least one of the user interface elements represents a single function (e.g. a QWERTY key input area) or a plurality of functions (e.g. a reduced keypad such as an ITU-T E.161 keypad). The apparatus may be used for predictive text messaging (such as iTap), for disambiguating ambiguous key sequences (such as T9 or other predictive text on an ITU-T keypad) and/or multi-tap text messaging.
The apparatus may be configured such that the respective user interface elements provide for scriber input indicating the selection of an icon, with each indicium indicating the particular icon (e.g. relating to a corresponding software application) which can be selected.
The apparatus may be configured such that a series of user interface elements can be selected by contacting the screen with the scriber in one continuous motion, wherein, during the continuous motion, the scriber touches each of the user interface elements corresponding to the series of user interface elements. It will be appreciated that other example embodiments may comprise physical buttons in place of or in addition to touch buttons/keys provided on a touch-screen. The apparatus may be an electronic device or a module for an electronic device. The apparatus, processor and/or memory may be incorporated into an electronic device. The apparatus may comprise a touch screen interface for an electronic device or a module for an electronic device/display or a display of an electronic device. Examples of an electronic device include a portable electronic device, a laptop computer, a Satellite Navigation console, a desktop computer, a monitor, a mobile phone, a TV, a Smartphone, a personal digital assistant (PDA) and a digital camera.
Transmission of data may be via a network (by signalling). The network may be, for example, the internet, a mobile phone network, a wireless network, a LAN or Ethernet. The apparatus may comprise a transmitter and/or receiver to interact with a network. The transmitter/receiver may comprise, for example, an antenna, an Ethernet port, a LAN connection, a USB port, a radio antenna, a Bluetooth connector, an infrared port and/or a fibre optic detector/transmitter.

In a second aspect, there is provided a method, the method comprising:
receiving scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
categorising the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
determining at least one contact position for the touch gesture using the accepted values.
In a third aspect, there is provided a computer program, the computer program comprising computer code configured to:
enable reception of scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
enable categorisation of the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
enable determination of at least one contact position for the touch gesture using the accepted values.

The computer program may be stored on a recordable medium (e.g. DVD, CD, USB stick or other non-transitory computer-readable medium).
In a fourth aspect, there is provided an apparatus comprising:
a means for receiving configured to receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
a means for categorising configured to categorise the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
a means for determining configured to determine at least one contact position for the touch gesture using the accepted values.
The invention may make tapping and typing on a touch screen more accurate and less prone to errors caused by unintentional gestures. As a result, typing may be faster, and there may be less erroneous selection of adjacent Ul elements. This may also allow smaller touch screens, screens with higher resolution and more content, and/or smaller touch screen devices, to be made without sacrificing usability.
The present disclosure includes one or more corresponding aspects, example embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation. Corresponding means for performing one or more of the discussed functions are also within the present disclosure. Corresponding computer programs for implementing one or more of the methods disclosed are also within the present disclosure and encompassed by one or more of the described example embodiments. The above summary is intended to be merely exemplary and non-limiting.
Brief Description of the Figures
A description is now given, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 illustrates an example embodiment comprising a number of electronic components, including memory, a processor and a communication unit.
Figure 2 depicts an example embodiment with a touch screen and physical buttons.
Figure 3 illustrates a user interacting with the touch-screen as the user inputs a message.
Figure 4 depicts raw scriber data for a touch gesture.
Figure 5a illustrates the raw scriber data in graphical form.
Figure 5b illustrates the accepted data values.
Figure 6a depicts scriber data and calculated derived corresponding parameter values.
Figure 6b illustrates the accepted data values.
Figure 7a illustrates scriber data comprising position values, corresponding time values and corresponding pressure values.
Figure 7b illustrates the accepted data values.
Figure 8a depicts scriber data and derived parameter values representing the user interface elements corresponding to the position values.
Figure 8b illustrates the accepted data values.
Figure 9a depicts scriber data for an example embodiment configured to determine a plurality of contact positions based on velocity.
Figure 9b illustrates the accepted data values.
Figure 10a illustrates a user interacting with an example embodiment to enter text.
Figure 11a depicts scriber data for an example embodiment configured to detect movement along the same path in different directions.
Figure 11b illustrates the accepted data values.
Figure 12a depicts a flow diagram describing the method used to determine at least one contact position from scriber data.
Figure 13a illustrates schematically a computer readable medium providing a program according to an example embodiment of the present invention.

Description of Example Aspects/Example embodiments
Other example embodiments depicted in the figures have been provided with reference numerals that correspond to similar features of earlier described example embodiments. For example, feature number 1 can also correspond to numbers 101, 201, 301 etc. These numbered features may appear in the figures but may not have been directly referred to within the description of these particular example embodiments. These have still been provided in the figures to aid understanding of the other example embodiments, particularly in relation to the features of similar described example embodiments.
When a user interacts with a touch screen, it may be, for example, to select or otherwise interact with an object displayed on the screen. To do this the user generally has to position the scriber (e.g. his finger) on the displayed object to select the displayed object. However, as the user is interacting with the screen the scriber may move around unintentionally. This movement may result in, for example, an unintentional selection of an adjacent displayed object. This is particularly an issue when the displayed objects are small or the unintentional movements are large (e.g. for users with limited coordination).
Example embodiments described herein may be considered to provide a way of filtering out unwanted or unintentional gestures (e.g. at the end of a single touch gesture (or 'tap-up')). Example embodiments may be configured to record the position of the scriber throughout a single touch gesture, then ignore certain touched positions based on one or more predetermined criteria.
Figure 1 depicts an example embodiment (101) of an apparatus, such as a mobile phone or personal digital assistant, comprising a touch screen (105) such as, for example, a Projected Capacitive Touch Screen. In other example embodiments, the apparatus (101) may comprise a module for a mobile phone (or PDA, audio/video player or other suitable device), and may just comprise a suitably configured memory 107 and processor 108 (see below).
The apparatus (101) of Figure 1 is configured such that it may receive, include, and/or otherwise access data. For example, this example embodiment (101) comprises a communications unit (103), such as a receiver, transmitter, and/or transceiver, in communication with an antenna (102) for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of networks. This example embodiment comprises a memory (107) that stores data, possibly after being received via an antenna (102) or port or after being generated at the touch screen user interface (105b). The touch screen user interface allows the user to interact with the touch screen (105) to generate scriber data, the scriber data representing scriber input (that is, scriber input is received at the touch screen from the scriber). The processor (108) may receive the data representing scriber input from the touch screen user interface (105b), from the memory (107), or from the communication unit (103). Regardless of the origin of the data, these data may be outputted to a user of the apparatus (101) via the touch screen (105) display device (105a), and/or any other output devices provided with the apparatus. The processor (108) may also store the data for later use in the memory (107). The memory (107) may store computer program code and/or applications which may be used to instruct/enable the processor (108) to perform functions (e.g. generate, delete, read, write and/or process data). It will be appreciated that other example embodiments may comprise additional displays (e.g. CRT screen, LCD screen and/or plasma screen) and/or user interfaces (e.g. physical keys and/or buttons). It will be appreciated that references to a memory or a processor may encompass a plurality of memories or processors.

Figure 2 depicts the outward appearance of the example embodiment of figure 1 comprising a portable electronic device (101), e.g. a mobile phone, with a user interface comprising a touch-screen (105), a physical keypad (106) comprising buttons, a memory (not shown) and a processor (not shown). The portable electronic device is configured to allow the user to interact with the portable electronic device with his/her finger on the touch screen (105). It will be appreciated that in other suitably adapted example embodiments the user may interact with the touch screen using a stylus.
Figure 3 depicts a user interacting with the touch screen (105) of the example embodiment of figure 2. This example embodiment is configured to enable text entry. To facilitate text entry (e.g. for an email, Twitter post, SMS message, MMS message or text based message), this example embodiment has a conventional text entry mode wherein the touch-screen (105) is divided into two regions. The bottom (keyboard) region (112) of the screen is where the user can select letters by pressing his finger (i.e. the user's finger (120) is the scriber in this example) on the screen. The top (text entry) region of the screen (111) is configured to display the letters entered into the device using the scriber. In this example embodiment the bottom (keyboard) region comprises touch sensitive user interface elements (keys) which are fixed with respect to the screen. It will be appreciated that for other example embodiments determining at least one contact position corresponding to the scriber interaction may enable user interface elements to be scrolled or moved (e.g. drag and drop). It will be appreciated that for other example embodiments, determining at least one contact position corresponding to the scriber interaction may facilitate drawing (e.g. handwriting, drawing Chinese characters, drawing pictures).
In this case the scriber (123) is the user's finger. As the scriber is interacting with the touch screen (105), the touch screen (105) generates scriber data. In this case the scriber data comprises both the position of the scriber (measured in millimetres) on the screen and the corresponding time (measured in milliseconds), wherein the time is measured from the initial contact with the screen for this touch gesture. The position of the scriber (x, y) (122) is given with respect to a reference point (121) on the surface of the touch screen (along two axes). The x-axis and y-axis lie parallel to the bottom edge (124) of the touch screen and the side edge (125) of the touch screen respectively. It will be appreciated that other coordinate systems, including 2D and 3D coordinate systems, may be used to define the position of the scriber on the screen. For this example embodiment, the scriber data for a touch gesture is transmitted to the processor of the device. When the touch gesture has been completed the scriber data is processed by the processor of the example embodiment.
An example of the raw scriber data received by the processor of figure 3 is shown in figure 4 and in figure 5a. Figure 4 depicts the raw scriber data in tabulated form and figure 5a shows the same scriber data (540) in graphical form. In this case the position values (432) relate to the x-axis only. However it will be appreciated that multi-dimensional positional data could be processed in a similar way. In this case the scriber data (540) gives the position values, x, (432) and the corresponding time values, t, (431). The position values are measured at discrete intervals (in this case the intervals are 2 ms). It will be appreciated that other example embodiments may use one or more discrete and/or continuous variables. In this case, the user has moved slightly as he initiated the touch gesture (541), coming to rest at a particular position some time later. The user holds this particular position for a period before unintentionally moving the scriber as he lifts his finger and completes the touch gesture (contact release (542)). The touch gesture has a contact period which is the duration of time from contact initiation to contact release. For the touch gesture scriber data shown in figure 4 the contact period (543) is 52ms. For this example embodiment, the apparatus is configured to categorise the position values, as either accepted values or rejected values, based on the (corresponding parameter values of) time (431) within the contact period. A graphical form of the accepted values and associated time values is shown in figure 5b. In particular, if the corresponding parameter (time) value is within an end absolute time range (545) (e.g. 10 ms) of the end of the touch contact period (543), the corresponding positional value is categorised as a rejected value for the touch gesture. If the time is not within an end absolute time range (544) (e.g. 10 ms) of the end of the contact period (543) then the corresponding positional data is categorised as an accepted value.
It will be appreciated that the end absolute time range could be any predetermined absolute duration of time (e.g. between 5 and 50ms). It will be appreciated that the scriber data could be categorised according to an end relative time range. An end relative time range could be any fraction of the contact period. For example, any positional data corresponding to a time within 10% (or other value such as between 1 and 30%) of the contact period at the end of the contact period may be categorised as rejected values. It will be appreciated that a corresponding parameter (time) range (used as a criterion for categorisation) may not be at the end of the contact period (e.g. it may be at the beginning or in the middle of the contact period).
It will be appreciated that, although this example embodiment uses a single criterion, other example embodiments may use a plurality of criteria. This example embodiment is configured to determine a contact position (546) for the touch gesture using the accepted (position) values. For this example embodiment, the contact position is determined to be the last (temporally ordered) accepted position value. In this case the contact position is determined to be 2.8mm. It will be appreciated that for other example embodiments the determined contact position may be calculated by taking the mean, the mode or the median of the accepted scriber data, or by some other method.
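A sketch of this figure 4/5 processing follows (illustrative only; the sample data is invented to mimic the described gesture, while the 2 ms interval, 52 ms contact period, 10 ms end range and 2.8 mm result come from the text above):

```python
# Sketch of the figure 4/5 processing: reject position values whose time is
# within the last 10 ms of the contact period, then take the last accepted
# value (ordered by time) as the contact position.

END_RANGE_MS = 10

def contact_position(samples):                    # samples: (t_ms, x_mm)
    cutoff = samples[-1][0] - END_RANGE_MS        # end of accepted window
    accepted = [(t, x) for t, x in samples if t <= cutoff]
    return accepted[-1][1]                        # last accepted, by time

xs = [2.0]*4 + [2.4]*4 + [2.8]*14 + [3.2, 3.4, 3.6, 3.6, 3.4]   # invented
samples = list(zip(range(0, 54, 2), xs))          # 2 ms sampling, t = 0..52
print(contact_position(samples))                  # -> 2.8 (as in the text)
```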
Figure 6 shows the same scriber data input (640) into a further example embodiment. In this case the scriber data (640) is represented by the top graph. The scriber data (640) in this case, as in the previous example, was generated by the touch screen and comprises values for the (x-axis) position and the corresponding (parameter) time. Unlike the previous example embodiment, which used criteria based directly on the received scriber data (position values and corresponding time values), this example embodiment is configured to calculate derived corresponding (velocity) parameter values (649) and categorise the position values (632) (as accepted or rejected values) based on these derived corresponding (velocity) parameter values. In this case each position value (632), $x_i$, has a corresponding time value (631), $t_i$. The labelling variable, $i$, denotes the order of the (time or position) value in the sequence ordered by time. That is, the time value (631), $t_{i+1}$, is the next time value in the (temporal) sequence directly after the time value (631), $t_i$. This example embodiment calculates a corresponding (velocity) parameter value (633), $v_i$, corresponding to the time value (631), $t_i$, and position value (632), $x_i$, for each $i$ as:

$$v_i = \frac{x_{i+1} - x_i}{t_{i+1} - t_i}$$
The calculated velocity values (633) corresponding to the scriber data (640) are shown in figure 6 on the bottom graph. The top (position versus time) and the bottom (velocity versus time) graphs have equivalent time axis scales.
The example embodiment is configured to categorise the position values (632) according to criteria based on the corresponding derived (velocity) parameter values (633). In this example embodiment, each position value (632) is categorised as either an accepted value or a rejected value on the basis of whether the corresponding velocity value (633) is greater than, or not greater than, a predetermined velocity threshold, $v_{thresh}$ (650). In this case, each position value (632), $x_i$, which has a corresponding derived velocity value (633), $v_i$, greater than $v_{thresh}$ (650) (that is, if $v_i > v_{thresh}$) is categorised as a rejected value. Each position value (632), $x_i$, which has a corresponding derived velocity value (633), $v_i$, not greater than $v_{thresh}$ (650) (that is, if $v_i \le v_{thresh}$) is categorised as an accepted value. The accepted values and corresponding time values are depicted in graphical form in figure 6b. As velocity is a vector quantity, it will be appreciated that different velocity thresholds may be used for movements in different directions.
Using the accepted values, this example embodiment is configured to determine a contact position for the touch gesture. For this example embodiment, the contact position (646) is determined by determining the middle accepted (position) value ordered by time (or by the labelling variable, $i$). That is, if there are N accepted values, the contact position is the position value corresponding to the (N/2)th accepted value. In this case, as the time interval between successive time values is the same, this means that the time period before the middle accepted (position) value (647) is of the same duration as the time period after the middle (accepted) position value (648). In this case the middle accepted (position) value (646) corresponds to a determined contact position of 2.8 mm. It will be appreciated that, for other example embodiments, the time interval may vary as a function of time, in which case the middle accepted value or the accepted value corresponding to the middle time may be used.
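A sketch of this figure 6 flow (the threshold value and sample data are assumptions; the finite-difference derivation and the (N/2)th-value selection follow the text):

```python
# Sketch of the figure 6 processing: derive velocities, reject samples whose
# derived velocity exceeds a threshold, then take the middle accepted value.

V_THRESH = 0.05    # mm per ms, hypothetical threshold

def middle_accepted(times, xs):
    v = [(xs[i + 1] - xs[i]) / (times[i + 1] - times[i])
         for i in range(len(xs) - 1)]
    v.append(v[-1])                               # reuse last velocity at the end
    accepted = [x for x, vi in zip(xs, v) if abs(vi) <= V_THRESH]
    return accepted[len(accepted) // 2]           # (N/2)th accepted value

times = list(range(0, 54, 2))
xs = [2.0, 2.2, 2.4, 2.6] + [2.8] * 18 + [3.0, 3.2, 3.4, 3.6, 3.8]
print(middle_accepted(times, xs))                 # -> 2.8
```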
It is not uncommon for there to be significant pressure changes during a touch gesture. For example, during a tap touch gesture (e.g. a touch of short duration) the pressure of the scriber on the touch screen may quickly decrease during tap-up, changing the touch position. It may be desirable to categorise as rejected data those position values which correspond to, for example, low pressure contact.
Figure 7a shows scriber data, for a touch gesture with a contact period (743), input (740) into such a further example embodiment. In this case the scriber data (740) is represented by the top (position versus time) graph and the bottom (pressure versus time) graph. The scriber data (740) in this case, as in the previous example, was generated by the touch screen. However, for this example embodiment, the touch screen is configured to generate parameter values relating to a plurality of corresponding parameters. The scriber data in this case comprises (corresponding parameter) time values (731), position values of the scriber on the touch screen (732), and (corresponding parameter) pressure values of the scriber on the screen (734). In this case both the position and pressure are measured continuously (i.e. the time values (731) represent a continuous variable).
In this case the example embodiment is configured to categorise the position data values (732) based on a plurality of criteria. That is, the example embodiment is configured to categorise the position data values (732) based on both the corresponding (pressure) parameter values (734) and the corresponding (time) parameter values (731). A position data value, x, is categorised as a rejected value if:
the corresponding pressure value, P(x), is less than the threshold pressure value, $P_{thresh}$ (751); or
the corresponding time value, t(x), is within a relative time range of the end of the contact period (745).
Otherwise the position data (732) is categorised as an accepted value. The accepted data (749) corresponding to the accepted (position) values is shown in figure 7b. From the accepted (position) values this example embodiment is configured to determine a contact position (746). In this case the example embodiment determines the accepted (position) value which corresponds to the highest applied pressure, $P_{max}$ (756). In this example, the contact position is determined to be 2.8mm. The example embodiment may determine which graphical user interface element corresponds to this determined contact position (744) and enable the performance of the corresponding function. In this case the end relative time range is 10% of the total contact period. It will be appreciated that other relative time range values could be used (e.g. between 2 and 30%). The criterion related to the time ensures that movements, at the end of the contact period, which are of short duration in comparison to the duration of the complete touch from touch initiation to touch release are categorised as rejected values.
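A sketch combining the two criteria of this figure 7 example (the threshold and sample values are invented; the 10% end relative range and max-pressure selection follow the text):

```python
# Sketch of the figure 7 processing: reject samples whose pressure is below
# a threshold OR whose time lies in the last 10% of the contact period, then
# return the accepted position with the highest applied pressure.

P_THRESH = 100.0        # Pa, hypothetical threshold

def contact_position(samples):                    # samples: (t_ms, x_mm, p_pa)
    t0, t_end = samples[0][0], samples[-1][0]
    cutoff = t_end - 0.10 * (t_end - t0)          # end relative time range
    accepted = [s for s in samples
                if s[2] >= P_THRESH and s[0] <= cutoff]
    return max(accepted, key=lambda s: s[2])[1]   # position at highest pressure

samples = [(0, 2.0, 50.0), (10, 2.4, 300.0), (20, 2.8, 800.0),
           (30, 2.8, 600.0), (40, 3.0, 200.0), (50, 3.4, 60.0)]
print(contact_position(samples))                  # -> 2.8
```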
It will be appreciated that other example embodiments may be configured to detect or calculate (derive) the rate of change of pressure. The rate of change of pressure may also be used as a corresponding parameter, which along with a corresponding parameter range may be used to categorise the position values as accepted values or rejected values. Thus, for example, high rates of change in pressure may be associated with rejection. In other uses they may be associated with acceptance.
In order to enable the user to interact with a user interface element, the determined contact position may denote a range of positions corresponding to a user interface element, rather than a specific position.
Figure 8a shows scriber data input into a further example embodiment, the example embodiment configured to determine a contact position corresponding to a user interface element. In this case the scriber data (840) is represented by the top (position versus time) graph. The scriber data (840) in this case, as in the previous example, was generated by the touch screen.
This example embodiment is configured to recognise a range of positions relating to a user interface element. For example, an x-position value between 2.4cm and 3.2cm (from a reference point) may correspond to an 'a' key user interface element. In this example embodiment the user interface elements make up a QWERTY keyboard for textual entry. In this example (depicted in figure 8a) the user has touched three distinct user interface elements (or buttons (B1, B2 and B3)) during the contact period. In this case, x values between:
x0 and x1 correspond to user interface element B1 (in this example B1 corresponds to the letter 'q'); x1 and x2 correspond to user interface element B2 (in this example B2 corresponds to the letter 'w'); and
x2 and x3 correspond to user interface element B3 (in this example B3 corresponds to the letter 'e').
In this way the apparatus can calculate (or derive) the corresponding button (position) value (835), wherein the corresponding button (position) value (835) denotes which virtual button (or other user interface element) the scriber is interacting with. In this case the example embodiment is configured to categorise the position data values (832) based on the corresponding time values (831). That is, a position data value is categorised as a rejected value if the corresponding (discrete) time value, $t_i$ (831), is within a relative time range (845) of the end of the contact period. Otherwise the position value (832) is categorised as an accepted value. In this case the relative time range is 20% of the total contact period. It will be appreciated that other relative time range values could be used. It will be appreciated that, in this example, the categorising of the position data values could be performed before or after deriving the corresponding button (position) values (835). From the accepted (position) values (shown in figure 8b) this example embodiment is configured to determine a contact position. However, unlike the previous case where the determined contact position was determined to be an absolute position value, in this example embodiment, the contact position is determined to be a button position value. That is, this example embodiment is configured to determine the last button position value corresponding to the accepted position values. In this example the contact position is determined to correspond to user interface element B2. This example embodiment is then configured to activate the function corresponding to the determined contact position. In this case, the letter 'w' is entered into the text message. By rejecting data within a (relative or absolute) time range at the end of the contact period, scriber user interface element changes during touch release (e.g. immediately before the finger loses touch with the screen) may be ignored.
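A sketch of this figure 8 flow (the button ranges and samples are invented; the 20% end relative range and last-button selection follow the text):

```python
# Sketch of the figure 8 processing: map each position value to a button,
# reject samples in the last 20% of the contact period, and return the last
# accepted button value (key ranges in cm are hypothetical).

BUTTONS = [("q", 1.6, 2.4), ("w", 2.4, 3.2), ("e", 3.2, 4.0)]

def button_at(x_cm):
    return next(b for b, lo, hi in BUTTONS if lo <= x_cm < hi)

def last_button(samples):                          # samples: (t_ms, x_cm)
    t0, t_end = samples[0][0], samples[-1][0]
    cutoff = t_end - 0.20 * (t_end - t0)           # end relative time range
    accepted = [s for s in samples if s[0] <= cutoff]
    return button_at(accepted[-1][1])

samples = [(0, 2.0), (10, 2.6), (20, 2.8), (30, 2.9), (45, 3.5), (50, 3.6)]
print(last_button(samples))                        # -> 'w'
```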
It will be appreciated that other example embodiments may enable selection of all of the user interface elements corresponding to accepted position values. In this example, as each of B1, B2 and B3 corresponds to accepted position values, all of B1, B2 and B3 would be selected. It will be appreciated that other example embodiments may be configured to recognise turning points (or stationary points such as maxima, minima and/or points of inflection) in the parameter values. For example, for the accepted data depicted in figure 8b, there are a number of local maxima (839) (and local minima (838)) within the (accepted) position versus time graph. The turning points (local maxima (839), points of inflection and/or local minima (838)) may be used to determine a contact position (e.g. the contact position could be determined to correspond to the position (or positions) at which there is a turning point). In this example, as the local maxima (839) correspond to button positions B2 and B3 (but not B1), this determination may result in only buttons B2 and B3 being selected for the example scriber data. It will be appreciated that turning points in other corresponding parameter values (e.g. pressure, velocity, speed) may also be determined (e.g. to determine one or more contact positions).

Figure 9a depicts the scriber data generated by a user interacting with a further example embodiment configured to enable text entry. This example embodiment is configured to enable a series of characters (e.g. letters) to be input using a single touch gesture. The scriber data (940) in this case comprises position values (932) and (discrete) time values (931) (as for the example embodiment of figure 3). From this scriber data (940), this example embodiment is configured to calculate corresponding (scalar) speed values (937), $s_i$, using the equation:

$$s_i = \left| \frac{x_{i+1} - x_i}{t_{i+1} - t_i} \right|$$
In this case, the example embodiment is configured to categorise, as rejected values, those position values which have corresponding speed values exceeding a threshold speed value, $s_{thresh}$ (952). Those position values which do not have corresponding speed values exceeding the threshold speed value, $s_{thresh}$ (952), are categorised as accepted values (shown in figure 9b). For the example depicted in figure 9b, the accepted values comprise two distinct and consecutive sets of accepted (position) values (971 and 972). The apparatus is configured to determine a contact position for each consecutive set of accepted (position) values (971 and 972). Each consecutive set of accepted position values (971 and 972) consists of position values which correspond to a successive and uninterrupted series of time values (ordered by time) representing a continuous period of time. For each consecutive set of accepted (position) values, the apparatus determines the most common position value. This example embodiment then determines the user interface element (character) corresponding to each of the determined contact positions. The example embodiment enables the performance of the function (e.g. selecting a letter for text entry) associated with each of the corresponding user interface elements.
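A sketch of this figure 9 flow (the threshold and data are invented; the speed derivation, run splitting and most-common selection follow the text):

```python
# Sketch of the figure 9 processing: derive speeds, accept slow samples,
# split the accepted samples into consecutive runs, and take the most
# common position value of each run (one contact position per run).

from statistics import multimode

S_THRESH = 0.05   # mm per ms, hypothetical threshold

def contact_positions(times, xs):
    speeds = [abs((xs[i + 1] - xs[i]) / (times[i + 1] - times[i]))
              for i in range(len(xs) - 1)]
    speeds.append(speeds[-1])                  # reuse last speed for final sample
    accepted = [i for i, s in enumerate(speeds) if s <= S_THRESH]
    runs, run = [], [accepted[0]]
    for i in accepted[1:]:
        if i == run[-1] + 1:                   # still the same continuous period
            run.append(i)
        else:                                  # gap: rejected samples in between
            runs.append(run)
            run = [i]
    runs.append(run)
    return [multimode([xs[i] for i in r])[0] for r in runs]

times = list(range(0, 22, 2))
xs = [2.0, 2.0, 2.0, 2.8, 3.6, 4.4, 5.2, 5.2, 5.2, 5.2, 5.2]
print(contact_positions(times, xs))            # -> [2.0, 5.2]
```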
It will be appreciated that, for other example embodiments, the user interface elements may correspond to menu items, or icons. It will be appreciated that when a plurality of user interface elements are selected, the corresponding functions may be performed in the order in which they were selected (e.g. selecting characters for text entry) or not in the order in which they were selected (e.g. a user running a plurality of independent programs by selecting them using a single touch gesture).
For example, if the touch screen was configured to display a keyboard (e.g. a QWERTY keyboard) for textual input, the user could select a series of letters, in order, by moving the scriber (whilst touching the touch screen) sufficiently quickly between the desired keys (in the desired order), and sufficiently slowly (e.g. by pausing) when the scriber is in contact with the desired key.

Figure 10 depicts the user of the example embodiment of figure 9 entering text into a textual message (e.g. email, SMS message, MMS message or text message). To facilitate text entry, this example embodiment has a text entry mode wherein the touch screen (1005) is divided into two regions. The bottom (keyboard) region (1012) of the screen is where the user can select a series of user interface elements (or letters) on the screen. The user's finger (1023) is the scriber in this example. The top (text entry) region of the screen (1011) is configured to display the letters entered into the device using the scriber.
Figure 10 depicts the position of the user's finger (or scriber (1023)) at the end of the contact action (1090) (or touch gesture) corresponding to the word 'map'. The user initiated contact with the screen at the key corresponding to the (letter) character 'm'. After a pause (or sufficiently slow movement) lasting at least long enough to obtain a velocity measurement, the user moved his finger across the keys to the (letter) character 'a'. As the movement would be sufficiently rapid, all of the position values (and corresponding user interface elements (1011)) touched between the (letter) characters 'm' and 'a' would be categorised as rejected (e.g. position values corresponding to the letters 'n', 'b', 'v', 'c', 'd' and 's'). The user would then pause (or move slowly) on the (letter) character 'a' such that a sufficiently low velocity would be recorded (thereby causing the position values to be categorised as accepted values), before moving (sufficiently) rapidly to the (letter) character 'p'. After pausing on the (letter) character 'p' the user could release contact between the scriber and the touch screen. By moving the scriber in such a way, the user could ensure that position data corresponding to the desired letters would be categorised as accepted data and position data corresponding to undesired letters would be categorised as rejected data. In this way words (and even phrases or sentences) could be entered without removing the scriber from the touch screen. This would be useful where rapid text entry is desired.
It will be appreciated that the method described above could be used in conjunction with error-detection and error-correction technologies to mitigate the effects of inadvertently activating the wrong key (e.g. dictionary based word-checking). It will also be appreciated that this example embodiment may be used for predictive text messaging (such as iTap), for disambiguating ambiguous key sequences (such as T9 or other predictive text on an ITU-T keypad) and/or multi-tap text messaging. It will be appreciated that such error-correction technologies could be activated (e.g. for each word) by completing the touch gesture (contact release), and/or by selecting a particular user interface element (e.g. a space bar).
Figure 11a depicts the scriber data (1180) for an example embodiment configured to categorise, as rejected, positions corresponding to the scriber moving back and forth along the same path during (and/or after) touch initiation and during (and/or before) touch release. In this case the desired touch position may be where the motion reverses direction. This kind of motion may be detected when typing quickly with the blunt, soft tip of a finger.
Figure 11a depicts the scriber data (1140) for a touch gesture input into an example embodiment configured to reject position data values (1181) corresponding to a set of positions which the scriber traverses in a first direction at the beginning of the contact period and traverses in a second direction at the end of the contact period. Figure 11a depicts both the x and y positions corresponding to a touch gesture. During the touch gesture, the scriber is first moved through a set of position values (1181) in a first direction. Then the scriber pauses on a particular position for a period of time. Then the scriber traverses the same set of position values (1181) in a second direction immediately prior to touch release. In this case the scriber traverses the set of position values in the second direction at a greater velocity than when traversing the set of position values in the first direction. The example embodiment is configured to compare the series of position values at (or just after) contact initiation and at (or just before) contact release and categorise as rejected those series of position values which have been traversed in different directions at these times. From the remaining accepted position values (figure 11b) at least one contact position is determined. For this example embodiment the contact position is determined to be the mean position of the accepted position values.
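One possible (editorial) sketch of this back-and-forth rejection; matching retraced positions by exact value is an assumption of the sketch, and a real implementation would likely use tolerances rather than equality:

```python
# Sketch of the figure 11 idea: reject the positions that the scriber
# traverses in one direction at the start and re-traverses in the opposite
# direction at the end; determine the mean of the remaining positions.
from statistics import mean

def contact_position(xs):                      # xs: positions ordered by time
    start, end = [], []
    i = 1
    while i < len(xs) and xs[i] != xs[i - 1]:  # initial motion (first direction)
        start.append(xs[i - 1])
        i += 1
    j = len(xs) - 2
    while j >= 0 and xs[j] != xs[j + 1]:       # final motion (second direction)
        end.append(xs[j + 1])
        j -= 1
    retraced = set(start) & set(end)           # same positions, both directions
    accepted = [x for x in xs if x not in retraced]
    return mean(accepted)

xs = [2.0, 2.2, 2.4, 2.8, 2.8, 2.8, 2.8, 2.4, 2.2, 2.0]
print(contact_position(xs))                    # -> 2.8
```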
By filtering out unintentional gestures and changes in the touch position, the desired touch position may be more accurately determined.
As described in the aforementioned example embodiments, when using a scriber (e.g. stylus, finger or thumb) to interact with a touch screen, the apparatus may ignore unintentional touches to the key/menu item/icon adjacent to the intended key by considering the parameter values of the received scriber data. For example, when typing on a virtual keyboard, the user's finger may rotate or slide while tapping down and when lifting the finger, causing the tap-up position, which may be used to detect the touch position, to change unintentionally. By categorising the position data as accepted or rejected, these unintentional touches may be ignored for the purposes of calculating a contact position.
Figure 12 shows a flow diagram illustrating how a contact position is determined using scriber data for a touch gesture.
Figure 13 illustrates schematically a computer/processor readable medium 1200 providing a program according to an example embodiment of the present invention. In this example, the computer/processor readable medium is a disc such as a digital versatile disc (DVD) or a compact disc (CD). In other example embodiments, the computer readable medium may be any medium that has been programmed in such a way as to carry out an inventive function.
It will be appreciated by the skilled reader that any mentioned apparatus/device and/or other features of particular mentioned apparatus/device may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched off) state and only load the appropriate software in the enabled (e.g. on) state. The apparatus may comprise hardware circuitry and/or firmware. The apparatus may comprise software loaded onto memory. Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/functional units. In some example embodiments, a particular mentioned apparatus/device/server may be pre-programmed with the appropriate software to carry out desired operations, wherein the appropriate software can be enabled for use by a user downloading a "key", for example, to unlock/enable the software and its associated functionality. Advantages associated with such example embodiments can include a reduced requirement to download data when further functionality is required for a device, and this can be useful in examples where a device is perceived to have sufficient capacity to store such pre-programmed software for functionality that may not be enabled by a user.
It will be appreciated that any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and that these functions may be performed by the same apparatus/circuitry/elements/processor. One or more disclosed aspects may encompass the electronic distribution of associated computer programs and computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal).
It will be appreciated that any "computer" described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same device. In some example embodiments one or more of any mentioned processors may be distributed over a plurality of devices. The same or different processor/processing elements may perform one or more functions described herein.
It will be appreciated that the term "signalling" may refer to one or more signals transmitted as a series of transmitted and/or received signals. The series of signals may comprise one, two, three, four or even more individual signal components or distinct signals to make up said signalling. Some or all of these individual signals may be transmitted/received simultaneously, in sequence, and/or such that they temporally overlap one another. With reference to any discussion of any mentioned computer and/or processor and memory (e.g. including ROM, CD-ROM etc), these may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way to carry out the inventive function.
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole, in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that the disclosed aspects/example embodiments may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the disclosure. While there have been shown and described and pointed out fundamental novel features of the invention as applied to example embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices and methods described may be made by those skilled in the art without departing from the spirit of the invention. For example, it is expressly intended that all combinations of those elements and/or method steps which perform substantially the same function in substantially the same way to achieve the same results are within the scope of the invention. Moreover, it should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or example embodiment of the invention may be incorporated in any other disclosed or described or suggested form or example embodiment as a general matter of design choice. Furthermore, in the claims means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents, but also equivalent structures. Thus although a nail and a screw may not be structural equivalents in that a nail employs a cylindrical surface to secure wooden parts together, whereas a screw employs a helical surface, in the environment of fastening wooden parts, a nail and a screw may be equivalent structures.

Claims

1. An apparatus comprising:
at least one processor; and
at least one memory including computer program code,
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
categorise the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
determine at least one contact position for the touch gesture using the accepted values.
2. The apparatus of claim 1, wherein the parameter values relate to one or more of a time aspect, a velocity aspect, a speed aspect, a pressure aspect, and a rate of change of pressure aspect of the touch gesture.
3. The apparatus of claim 1 or claim 2, wherein the position values are categorised based on whether the corresponding parameter values fall within one or more corresponding parameter ranges.
4. The apparatus of claim 3, wherein the one or more corresponding parameter ranges comprise a combination of one or more of:
an end absolute time range of predetermined duration at the end of the contact period;
a beginning absolute time range of predetermined duration at the beginning of the contact period;
an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion of the contact period;
a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion of the contact period;
a predetermined scriber velocity range; and
a predetermined scriber pressure range, the scriber pressure being the pressure exerted by the scriber on the touch screen during the contact period.
5. The apparatus of any preceding claim, wherein the apparatus is configured to categorise, as rejected values, scriber data values which represent positions which the scriber
traverses in a first direction at the beginning of the contact period; and
traverses in a second direction at the end of the contact period.
6. The apparatus of any preceding claim, wherein the apparatus is configured to:
calculate derived corresponding parameter values from the received scriber data.
7. The apparatus of any preceding claim, wherein the determined contact position is at least one of:
the average position value of the accepted values;
the most common position value of the accepted values;
the middle position value of the accepted values, ordered by at least one corresponding parameter; and
the last position value of the accepted values, ordered by time.
8. The apparatus of claim 1, wherein the one or more corresponding parameters comprise time, the apparatus configured to
calculate a respective contact position for each subset of the accepted values representing the position of the scriber over a continuous period of time.
9. The apparatus of any preceding claim, wherein the display screen comprises user interface elements, each user interface element defined by a position value range, the apparatus configured to:
associate at least one of the determined contact positions with a respective user interface element by correlating the determined contact position and the position value range for a particular user interface element; and
select the user interface element corresponding to the correlated contact position.
10. The apparatus of claim 9 wherein at least one of the user interface elements comprises one or more of a letter input key, a punctuation mark input key, a character input key, a numeric input key, a slider, an input region, an application icon, a menu item, a function key and a soft key.
11. The apparatus of any preceding claim, wherein the scriber comprises a finger, a stylus or a finger nail.
12. The apparatus of any preceding claim wherein the apparatus is an electronic device, a display for an electronic device, a module for an electronic device, or a module for a display of an electronic device.
13. An electronic device comprising the apparatus of claim 12, wherein the electronic device is a portable electronic device, a laptop computer, a desktop computer, a mobile phone, a personal digital assistant, a music player, a camera or a telephone.
14. A method, the method comprising:
receiving scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
categorising the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
determining at least one contact position for the touch gesture using the accepted values.
15. The method of claim 14, wherein the parameter values relate to one or more of a time aspect, a velocity aspect, a speed aspect, a pressure aspect, and a rate of change of pressure aspect of the touch gesture.
16. The method of claim 14, wherein the categorising of the position values is based on whether the corresponding parameter values fall within one or more corresponding parameter ranges.
17. The method of claim 16, wherein the one or more corresponding parameter ranges comprise a combination of one or more of:
an end absolute time range of predetermined duration at the end of the contact period;
a beginning absolute time range of predetermined duration at the beginning of the contact period;
an end relative time range at the end of the contact period, the duration of the end relative time range being a predetermined proportion of the contact period;
a beginning relative time range at the beginning of the contact period, the duration of the beginning relative time range being a predetermined proportion of the contact period;
a predetermined scriber velocity range; and
a predetermined scriber pressure range, the scriber pressure being the pressure exerted by the scriber on the touch screen during the contact period.
18. The method of any preceding method claim, wherein scriber data values which represent positions which the scriber
traverses in a first direction at the beginning of the contact period; and
traverses in a second direction at the end of the contact period
are categorised as rejected values.
19. The method of any preceding method claim, wherein the method comprises:
calculating derived corresponding parameter values from the received scriber data.
20. The method of any preceding method claim, wherein determining the contact position comprises at least one of:
determining the average position value of the accepted values;
determining the most common position value of the accepted values;
determining the middle position value of the accepted values, ordered by at least one corresponding parameter; and
determining the last position value of the accepted values, ordered by time.
21. The method of any preceding method claim, wherein the one or more corresponding parameters comprise time, the method comprising:
calculating a respective contact position for each subset of the accepted values representing the position of the scriber over a continuous period of time.
22. The method of any preceding method claim, wherein the display screen comprises user interface elements, each user interface element defined by a position value range, the method comprising:
associating at least one of the determined contact positions with a respective user interface element by correlating the determined contact position and the position value range for a particular user interface element; and
selecting the user interface element corresponding to the correlated contact position.
23. The method of claim 22, wherein at least one of the user interface elements comprises one or more of a letter input key, a punctuation mark input key, a character input key, a numeric input key, a slider, an input region, an application icon, a menu item, a function key and a soft key.
24. The method of any preceding method claim, wherein the scriber comprises a finger, a stylus or a fingernail.
25. A computer program, the computer program comprising computer code configured to:
enable reception of scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
enable categorisation of the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
enable determination of at least one contact position for the touch gesture using the accepted values.
26. An apparatus comprising:
a means for receiving configured to receive scriber data for a touch gesture, the scriber data comprising position values and one or more corresponding parameter values for the touch gesture, the position values representing the position of a scriber on a touch screen over a contact period, wherein the contact period is the period of time during which the scriber is continuously in contact with the touch screen between contact initiation and contact release of the touch gesture;
a means for categorising configured to categorise the position values as either accepted values or rejected values based on whether the corresponding parameter values of the received scriber data meet a predetermined parameter value criterion; and
a means for determining configured to determine at least one contact position for the touch gesture using the accepted values.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/050983 WO2012072853A1 (en) 2010-12-01 2010-12-01 Receiving scriber data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/050983 WO2012072853A1 (en) 2010-12-01 2010-12-01 Receiving scriber data

Publications (1)

Publication Number Publication Date
WO2012072853A1 (en) 2012-06-07

Family

ID=46171228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2010/050983 WO2012072853A1 (en) 2010-12-01 2010-12-01 Receiving scriber data

Country Status (1)

Country Link
WO (1) WO2012072853A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050146508A1 (en) * 2004-01-06 2005-07-07 International Business Machines Corporation System and method for improved user input on personal computing devices
US20080105470A1 (en) * 2004-08-02 2008-05-08 Koninklijke Philips Electronics, N.V. Touch Screen Slider for Setting Floating Point Value
US20080165160A1 (en) * 2007-01-07 2008-07-10 Kenneth Kocienda Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Gesture on a Touch Screen Display
US20090284482A1 (en) * 2008-05-17 2009-11-19 Chin David H Touch-based authentication of a mobile device through user generated pattern creation
US20100073329A1 (en) * 2008-09-19 2010-03-25 Tiruvilwamalai Venkatram Raman Quick Gesture Input
US20100194701A1 (en) * 2008-10-28 2010-08-05 Hill Jared C Method of recognizing a multi-touch area rotation gesture

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130106700A1 (en) * 2011-11-02 2013-05-02 Kabushiki Kaisha Toshiba Electronic apparatus and input method
CN103713205A (en) * 2012-09-28 2014-04-09 名硕电脑(苏州)有限公司 Capacitive touch screen automatic testing method
CN103713205B (en) * 2012-09-28 2016-10-26 名硕电脑(苏州)有限公司 Capacitance touch screen automatic test approach

Similar Documents

Publication Publication Date Title
AU2008100502A4 (en) List scrolling in response to moving contact over list of index symbols
US9201510B2 (en) Method and device having touchscreen keyboard with visual cues
US8059101B2 (en) Swipe gestures for touch screen keyboards
AU2006330740B2 (en) Scrolling list with floating adjacent index symbols
US10331219B2 (en) Identification and use of gestures in proximity to a sensor
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
EP2653955B1 (en) Method and device having touchscreen keyboard with visual cues
KR101636705B1 (en) Method and apparatus for inputting letter in portable terminal having a touch screen
US20050046621A1 (en) Method and device for recognizing a dual point user input on a touch based user input device
JP2018049657A (en) Classifying intent of user inputs
US20130285926A1 (en) Configurable Touchscreen Keyboard
US20120092278A1 (en) Information Processing Apparatus, and Input Control Method and Program of Information Processing Apparatus
CA2812457C (en) Method and device having touchscreen keyboard with visual cues
EP2506122B1 (en) Character entry apparatus and associated methods
WO2007076205A2 (en) Continuous scrolling list with acceleration
EP2405329A1 (en) Input control method and electronic device for a software keyboard
WO2014062583A1 (en) Character deletion during keyboard gesture
US8081170B2 (en) Object-selecting method using a touchpad of an electronic apparatus
KR20140105691A (en) Apparatus and Method for handling object in a user device having a touch screen
EP2660692A1 (en) Configurable touchscreen keyboard
US20120169607A1 (en) Apparatus and associated methods
JP2011243157A (en) Electronic apparatus, button size control method, and program
WO2012072853A1 (en) Receiving scriber data
KR20110053014A (en) Apparatus and method for offering user interface of electric terminal having touch-screen
WO2014128573A1 (en) Capturing diacritics on multi-touch devices

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10860315

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10860315

Country of ref document: EP

Kind code of ref document: A1