WO2017120522A1 - Method of character identification that uses time dependent button presses and time independent swipe gestures - Google Patents


Info

Publication number
WO2017120522A1
Authority
WO
WIPO (PCT)
Prior art keywords
value
button
swipe
character
cpu
Application number
PCT/US2017/012605
Other languages
French (fr)
Inventor
Michael William MURPHY
Original Assignee
Murphy Michael William
Application filed by Murphy Michael William
Publication of WO2017120522A1

Classifications

    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F40/242: Dictionaries
    • G06F40/274: Converting codes to words; Guess-ahead of partial word inputs

Definitions

  • This description generally relates to the field of electronic devices and, more particularly, to user interfaces of electronic devices.
  • a computer processor-implemented method may be summarized as including detecting, by at least one computer processor, a swipe gesture from a first button of a plurality of selection buttons toward a second button of the plurality of selection buttons; and selecting, by at least one computer processor, a character based on assigned values of the first and second selection buttons.
  • the character may be a member of a menu, the character may be identified by a value, and the assigned values of the first and second selection buttons may mathematically combine to equal the character's identifying value.
  • the method may further include in response to the detection of the swipe gesture, adding, by at least one computer processor, assigned values of the first and second buttons; and selecting the character based on a result of the adding.
  • the method may further include in response to the detection of the swipe gesture, subtracting, by at least one computer processor, assigned values of the first and second buttons; and selecting the character based on a result of the subtracting.
  • a first character of the at least two characters may occupy a position of the one-dimensional array identified by an assigned value of the first button and a second character of the at least two characters may occupy a position of the one-dimensional array identified by a sum of the assigned value of the first button and an assigned value of the second button.
  • the first button may be assigned one of the button press values -3, -2, +2, and +3.
  • the at least two characters may occupy positions of a one-dimensional array.
  • the one-dimensional array may be a menu of 13 positions, each position may be populated by a character, and the plurality of buttons may include at least four buttons.
  • a computer processor-implemented method may be summarized as including identifying, by at least one computer processor, selection of at least a first button of a plurality of selection buttons and a second button of the plurality of selection buttons in response to a swipe gesture; and in response to the identification of at least the first button of the plurality of selection buttons and the second button of the plurality of selection buttons in response to the swipe gesture, selecting, by at least one computer processor, a character based on a mathematical combination of assigned values of the first and second selection buttons and a value assigned to the character.
  • the mathematical combination may be a subtraction or addition of the assigned values of the first and second selection buttons based on the direction of the swipe gesture.
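The value-combination step described above can be sketched as follows. This is a minimal illustration only: the button values, the menu contents, and the direction flag are assumptions chosen for the example, not the patent's concrete assignments.

```python
# Hypothetical sketch: each selection button carries an assigned value; a
# swipe from a first button toward a second combines the two values, and
# the result identifies a character by its assigned value in a menu.

# Menu keyed by identifying value (characters are invented for illustration).
MENU = {-5: 'a', -3: 'b', -2: 'c', -1: 'd',
         1: 'e',  2: 'f',  3: 'g',  5: 'h'}

def select_character(first: int, second: int, away_from_reference: bool) -> str:
    """Combine the assigned values of the two buttons touched by a swipe.

    `away_from_reference` models the swipe direction: addition when the
    swipe moves away from the reference button, subtraction otherwise.
    """
    if away_from_reference:
        value = first + second
    else:
        value = first - second
    return MENU[value]

# A swipe from the +2 button toward the +3 button (away from the reference):
# 2 + 3 = 5, which identifies 'h' in this invented menu.
```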
  • a computer processor-implemented method may be summarized as detecting a button press by a user.
  • the button press engages one of a plurality of buttons, each button having a unique assigned value.
  • the assigned value of one button of the plurality of buttons is zero (e.g., a reference button)
  • the assigned value of at least two buttons of the plurality of buttons is a positive integer (e.g., +2 and +3)
  • the assigned value of at least two other buttons of the plurality of buttons is a negative integer (e.g., -2 and -3).
  • In response to detection of the button press, a processor starts an elapsed time counter and stores a first value that corresponds to the assigned value of the pressed button.
  • a second value is calculated from the first value based on whether the button is released before or after a given elapsed time period expires and based on whether a positional movement of the button press is completed before or after the elapsed time period expires.
  • In response to release of the pressed button before expiration of a given elapsed time period, the second value is set equal to the first value.
  • In response to completion of a positional movement prior to expiration of the given elapsed time period, the second value is set equal to a combination of the first value and a third value.
  • the third value corresponds to the assigned value of a second button positioned adjacent to the first pressed button.
  • the second value is set equal to the first value plus the third value, (e.g., +2 (assigned value of first pressed button) plus +3 (assigned value of button adjacent to the first button)).
  • the second value is set equal to the first value minus the third value, which is the assigned value of the button adjacent to the first pressed button but closer to the reference (e.g., +3 (assigned value of the first pressed button) minus +2).
  • the second value is set equal to the first value plus the third value, (e.g., -2 (assigned value of first pressed button) plus -3 (assigned value of button adjacent to the first pressed button but farther from the reference)).
  • the second value is set equal to the first value minus the third value, which is the assigned value of the button adjacent to the first pressed button but closer to the reference (e.g., -3 (assigned value of first pressed button) minus -2).
  • In response to detection that the first button press extends beyond the given elapsed time period without completion of a positional movement, the second value is set equal to twice the first value. But in response to completion of a positional movement after the expiration of the given elapsed time period, the second value is set equal to a combination of a fourth value with the second value, where the second value equals twice the first value. In some embodiments, the fourth value is equal to one.
  • In some embodiments, where the first value is positive and the direction of the positional movement is away from the reference button, the second value is set equal to twice the first value plus the fourth value (e.g., (2 * 2) + 1). In some other embodiments, where the first value is positive but the positional movement is in the direction toward the reference button, the second value is set equal to twice the first value minus the fourth value (e.g., (2 * 2) - 1). In other embodiments, where the first value is negative and the direction of the positional movement is away from the reference button, the second value is set equal to twice the first value minus the fourth value (e.g., (2 * (-2)) - 1). In yet other embodiments, where the first value is negative but the direction of the positional movement is toward the reference button, the second value is set equal to twice the first value plus the fourth value (e.g., (2 * (-2)) + 1).
  • each of a plurality of characters is identified by a unique second identifying value.
  • a character from the plurality of characters is selected based on the selected character's identifying value and the calculated second value.
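The case analysis above (early release, early swipe, long press, late swipe) can be collected into one function. This is a hedged sketch: the parameter names, the 0.2-second threshold, and the sign convention for the fourth value are assumptions drawn from the examples in the preceding bullets, not a literal implementation.

```python
def second_value(first, duration, moved, move_time, adjacent, away,
                 threshold=0.2):
    """Compute the second value from a button press.

    first     - assigned value of the pressed button (positive or negative)
    duration  - press length in seconds, onset to release
    moved     - whether a positional movement (swipe) completed
    move_time - when the movement completed, in seconds (if moved)
    adjacent  - assigned value of the adjacent button (the third value)
    away      - True if the movement is away from the reference button
    """
    if not moved:
        # short tap keeps the button's own value; a long press doubles it
        return first if duration < threshold else 2 * first
    if move_time < threshold:
        # early swipe: combine with the adjacent button's value; add when
        # moving away from the reference button, subtract when moving toward it
        return first + adjacent if away else first - adjacent
    # late swipe: double the first value, then adjust by the fourth value (1);
    # moving away from the reference grows the magnitude, toward it shrinks it
    fourth = 1 if first > 0 else -1
    return 2 * first + fourth if away else 2 * first - fourth

# Examples matching the bullets above:
# tap on +2                 -> 2
# long press on +2          -> 4
# early swipe +2 toward +3  -> 5
# late swipe on -2, away    -> (2 * -2) - 1 = -5
```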
  • a user slides a cursor from the middle position of a menu row to a desired character. To do so, a user presses the selection button that corresponds with the row and position of a color block that contains the desired character.
  • Each color block identifies two characters. By default, the cursor selects the first character it encounters. To select the second character, a user types the first character and lets a correction algorithm exchange the first character for the second one, if needed.
  • a user drags the cursor one position beyond its associated color block by including a swipe gesture with the button press.
  • the previously mentioned correction algorithm launches.
  • the algorithm analyzes the previously entered word, exchanges first characters for second ones, as needed, to identify the intended word in a dictionary. If an intended word is not in the dictionary, a user manually enters the second characters by pressing the button that selects the first character, but maintains the button press for longer than a pre-determined time threshold, such as 0.2 seconds.
  • After each character selection, the cursor returns to the menu's middle position. Within a few minutes of use, a user recognizes the pattern for selecting each character. From that point on, the cursor is a redundant affordance and can be disabled.
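The dictionary-driven correction step described above can be sketched as a search over first-character/second-character substitutions. The pairings, dictionary, and fallback behavior here are invented for illustration; the patent does not specify the search order.

```python
# Each color block identifies two characters; typing defaults to the first
# one. The corrector tries exchanging first characters for their second
# counterparts until a dictionary word appears.
from itertools import product

PAIRS = {'a': 'b', 'c': 'd', 'e': 'f', 'o': 'p', 't': 'u'}  # first -> second (invented)

def correct(word: str, dictionary: set) -> str:
    """Return the typed word, or a dictionary word reachable by swapping
    first characters for their second counterparts."""
    if word in dictionary:
        return word
    # For each position, allow the typed character and (if paired) its alternate.
    options = [(ch, PAIRS[ch]) if ch in PAIRS else (ch,) for ch in word]
    for candidate in map(''.join, product(*options)):
        if candidate in dictionary:
            return candidate
    return word  # not found: the user falls back to long-press entry

# correct('cat', {'dat'}) swaps 'c' for 'd' and returns 'dat'
```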
  • robustness refers to the level of dexterity that an interface requires of its users for them to achieve the output they expect.
  • robustness is judged in terms of the positional and temporal accuracy required.
  • the proposed interface has the requirement of temporal accuracy due to the time-dependent button presses that distinguish characters within a color block.
  • the interface provides fewer but larger selection buttons.
  • the large button size compared with a 26-button interface reduces the rate of positional selection errors compared with typical interfaces.
  • the interface provides a novel system for interpreting input gestures.
  • the system includes logic that enables button presses to be time-dependent while keeping swipe gestures time-independent, and an algorithm that corrects time-dependent button press errors very accurately.
  • the system identifies selection buttons and menu positions with values.
  • Menu positions of each row are numbered consecutively with '0' at the middle position.
  • Selection buttons corresponding to each menu row have assigned values; in one embodiment the assigned values are -5, -2, 0, 2, and 5, with '0' assigned to the middle button.
  • BPV: button press value
  • BPT: button press type
  • the system interprets an input gesture and identifies its BPT.
  • the BPT is interpreted from the input gestures that move the cursor.
  • the system applies one or more simple calculations to the BPV.
  • the system specifies the character of the menu that corresponds with the row of the pressed selection button and the value of the BPV variable.
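The numbering scheme above can be made concrete with a small sketch. The 13-character row is invented for illustration; the button values follow the embodiment stated above (-5, -2, 0, 2, 5 with 0 in the middle).

```python
# A 13-position menu row numbered consecutively with '0' at the middle,
# so positions run -6..+6. The total BPV names a position directly.

ROW = list("abcdefghijklm")           # 13 positions, invented characters
BUTTON_VALUES = (-5, -2, 0, 2, 5)     # per-row selection button assignments

def char_at(total_bpv: int) -> str:
    """Map a total BPV in -6..+6 to the character at that menu position."""
    assert -6 <= total_bpv <= 6
    return ROW[total_bpv + 6]         # shift so position 0 is the middle

# char_at(0) is the middle character 'g'; char_at(-5) is 'b'; char_at(5) is 'l'
```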
  • a gesture interpreter monitors two button press metrics: button press duration and swipe distance. Duration is a measure of the length of a button press from onset to release. Swipe distance is the length of any positional displacement during the button press.
  • a combination of measurements for duration and distance over the course of a character selection cycle can be represented as a response curve and plotted in a two-dimensional plot.
  • button press duration is plotted on one axis and swipe distance on another.
  • Each curve represents the progress of a button press as it unfolds. Onset of a button press occurs at the plot's origin. Its release occurs at a curve's terminus. The path that a curve follows through the plot reflects the duration and swipe distance of a received button press.
  • a threshold value is an arbitrary value that enables the analog output from the metric to be recast as a binary output, i.e., a high or low value.
  • Applying threshold values for each metric to the plot divides the plot into four regions. Each region represents a unique combination of the binary output values from the metrics.
  • the gesture interpreter determines a BPT based on the combination of binary values it interprets. Because the metrics may cross threshold values during a character selection cycle, the BPT may evolve during the selection cycle. The final determination for BPT occurs when the button press is lifted. The region of the plot in which a curve terminates identifies the final BPT for the selection cycle. Examples of typical threshold values are 200 msec for the time response and 25 pixels for the distance response.
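The two-metric classification above reduces to comparing duration and swipe distance against their thresholds and reading off one of four regions. The BPT labels below are invented for illustration; the threshold values follow the examples given above (200 msec and 25 pixels).

```python
# Each metric is recast as a binary value against its threshold; the pair
# of binary values identifies the region, i.e. the button press type (BPT),
# at the moment the button press is released.

DURATION_THRESHOLD_MS = 200
DISTANCE_THRESHOLD_PX = 25

def classify_bpt(duration_ms: float, swipe_distance_px: float) -> str:
    long_press = duration_ms >= DURATION_THRESHOLD_MS
    swiped = swipe_distance_px >= DISTANCE_THRESHOLD_PX
    if not long_press and not swiped:
        return "tap"              # short press, no displacement
    if not long_press and swiped:
        return "early swipe"      # displacement completed before the time threshold
    if long_press and not swiped:
        return "long press"       # held past the time threshold, no displacement
    return "late swipe"           # displacement after the time threshold

# classify_bpt(120, 4) is a "tap"; classify_bpt(350, 60) is a "late swipe"
```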
  • the purpose for identifying a BPT is to specify a calculation that, in turn, specifies a total BPV and then a character.
  • the proposed interface uses a set of input gestures to specify a character based on its position in a menu.
  • this is realized as a system that uses the BPT to specify a calculation that converts the assigned value of a pressed selection button to a value that in turn identifies the position of a desired character in a menu.
  • the value that identifies the position of a character in the menu is the 'total BPV'.
  • Each BPT has an associated calculation (or math operation). Whenever the path of a character selection cycle crosses a threshold, the calculation associated with the newly entered BPT region is applied to the BPV variable.
  • the number of calculations that are executed depends on the path that the character selection cycle follows.
  • the value of BPV at the end of the selection cycle is the total BPV. Once calculated, the system inputs the character of the menu that corresponds with the total BPV.
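The chain just described (each BPT region crossed applies its operation to BPV; the value at release is the total BPV, which selects the character) can be sketched as follows. The operation table, the adjacent value, and the menu row are assumptions for the example, not the patent's literal definitions.

```python
# Apply, in order, the operation of each BPT region the selection cycle
# crosses; the final BPV indexes a 13-position menu row (0 = middle).

ROW = list("abcdefghijklm")                    # invented characters

OPERATIONS = {
    "long press": lambda bpv, adj: 2 * bpv,    # doubling on a held press
    "swipe":      lambda bpv, adj: bpv + adj,  # combine with an adjacent value
}

def total_bpv(initial: int, regions, adjacent: int = 0) -> int:
    bpv = initial                              # assigned value of pressed button
    for region in regions:                     # regions crossed, in order
        bpv = OPERATIONS[region](bpv, adjacent)
    return bpv

def select(initial: int, regions, adjacent: int = 0) -> str:
    return ROW[total_bpv(initial, regions, adjacent) + 6]

# A tap on the +2 button crosses no regions: total BPV 2 -> ROW[8] == 'i'.
# A held press on +2 followed by a swipe (adjacent value +1 assumed):
# 2 * 2 = 4, then 4 + 1 = 5 -> ROW[11] == 'l'.
```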
  • Figure 1 is a schematic view of an example electronic device for input of characters with time-dependent button presses and time-independent swipe gestures according to one illustrated embodiment, the electronic device being a mobile device having a housing, a display, a graphics engine, a central processing unit (CPU), user input device(s), one or more storage mediums having various software modules thereon that are executable by the CPU, input/output (I/O) port(s), network interface(s), wireless receiver(s) and transmitter(s), a power source, an elapsed time counter, an integer value counter and a swipe gesture interpreter.
  • Figure 2 is a schematic drawing of one embodiment of the electronic device 100 for input of characters.
  • the user interface 150 was previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • Figure 3 provides graphical representations of examples of various button press types.
  • Figure 4 is a schematic drawing of a portion of one embodiment of the electronic device 100 for input of characters.
  • Figure 5 is a schematic drawing of a portion of another embodiment of the electronic device 100 for input of characters.
  • Figure 6 is a table of assignments for one embodiment of a method for specifying a character from among a plurality of characters.
  • Figure 7 is a table of example values for variables for one embodiment of a method for specifying a character from among a plurality of characters.
  • Figure 8 is a schematic drawing of a portion of another embodiment of the electronic device 100 for input of characters.
  • Figure 9 is a table of example values for variables for one embodiment of a method for specifying a character from among a plurality of characters.
  • Figure 10 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
  • Figure 11 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 12 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • Figure 13 provides flow diagrams that show variables and values for an embodiment of a method for an electronic device to interpret button presses.
  • Figure 14 is an example of an application of a method of character identification.
  • Figure 15 is another example of an application of a method of character identification.
  • Figure 16 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • Figure 17 is a table of value assignments for another embodiment of a method of character identification.
  • Figure 18 provides graphical representations of additional examples of various button press types.
  • Figure 19 is a flow diagram that shows another method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 20 is a flow diagram that shows yet another method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 21 is a flowchart that shows another method that interprets button press and swipe input gestures to select a character from a menu according to one illustrated embodiment.
  • Figure 22 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
  • Figure 23 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 24 is a table of possible values and variables for a method of interpreting input according to one set of selection buttons.
  • Figure 25 is a table of possible values and variables for a method of interpreting input according to another set of selection buttons.
  • Figure 26 is another flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 27 is a table of possible variable combinations for a method of interpreting button presses according to one illustrated embodiment.
  • Figure 28 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • Figure 29 is an example of an application of a method of character identification.
  • Figure 30 is another example of an application of a method of character identification.
  • Figure 31 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • Figure 32 is another table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • Figure 33 is a plot of graphical representations of examples of possible button input gestures as a function of time and positional displacement.
  • Figure 34 is a schematic drawing of one embodiment of the electronic device 100 for input of characters.
  • the user interface 150 was previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • Figure 35 is a plot of graphical representations of possible examples of responses of input gestures.
  • Figure 36 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
  • Figure 37 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 38 is a table of possible values and variables for a method of interpreting input according to one set of selection buttons.
  • Figure 39 is a table of possible values and variables for a method of interpreting input according to another set of selection buttons.
  • Figure 40 is another flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • Figure 41 is a table of possible variable combinations for a method of interpreting button presses according to one illustrated embodiment.
  • Figure 42 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • Figure 43 is an example of an application of a method of character identification.
  • Figure 44 is another example of an application of a method of character identification.
  • Figure 45 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • Figure 46 is another table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • Figure 1 is a schematic view of one example electronic device, in this case mobile device 100, for input of characters with optional time-dependent button presses according to one illustrated embodiment.
  • the mobile device 100 shown in Figure 1 may have a housing 102, a display 104, a graphics engine 106, a central processing unit (CPU) 108, one or more user input devices 110, one or more storage mediums 112 having various software modules 114 stored thereon comprising instructions that are executable by the CPU 108, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, one or more network interfaces 120, and a power source 122.
  • some or all of the same, similar or equivalent structure and functionality of the mobile device 100 shown in Figure 1 and described herein may be that of, part of or operably connected to a communication and/or computing system of another device or machine.
  • the mobile device 100 may be any of a large variety of devices such as a cellular telephone, a smartphone, a wearable device, a wristwatch, a portable media player (PMP), a personal digital assistant (PDA), a mobile communications device, a portable computer with built-in or add-on cellular communications, a portable game console, a global positioning system (GPS), a handheld industrial electronic device, a television, an automotive interface, an augmented reality (AR) device, a virtual reality (VR) device or the like, or any combination thereof.
  • the mobile device 100 has at least one central processing unit (CPU) 108 which may be a scalar processor, a digital signal processor (DSP), a reduced instruction set (RISC) processor, or any other suitable processor.
  • the central processing unit (CPU) 108, display 104, graphics engine 106, one or more user input devices 110, one or more storage mediums 112, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, and one or more network interfaces 120 may all be communicatively connected to each other via a system bus 124.
  • the system bus 124 can employ any suitable bus structures or architectures, including a memory bus with memory controller, a peripheral bus, and/or a local bus.
  • the mobile device 100 also includes one or more volatile and/or nonvolatile storage medium(s) 112.
  • the storage mediums 112 may be comprised of any single or suitable combination of various types of processor-readable storage media and may store instructions and data acted on by CPU 108. For example, a particular collection of software instructions comprising software 114 and/or firmware instructions comprising firmware are executed by CPU 108.
  • the software or firmware instructions generally control many of the operations of the mobile device 100, and a subset of the software and/or firmware instructions may perform functions to operatively configure hardware and other software in the mobile device 100 to provide the initiation, control and maintenance of applicable computer network and telecommunication functions.
  • the CPU 108 includes an elapsed time counter 140.
  • the elapsed time counter 140 may be implemented using a timer circuit operably connected to or as part of the CPU 108. Alternatively, some or all of the elapsed time counter 140 may be implemented in computer software as computer-executable instructions stored on volatile and/or non-volatile storage medium(s) 112 that, when executed by CPU 108 or a processor of a timer circuit, perform the functions described herein of the elapsed time counter 140.
  • the CPU 108 includes an integer value counter (also called button press value counter) 142.
  • some or all of the integer value counter 142 may be implemented in computer software as computer-executable instructions stored on volatile and/or non-volatile storage medium(s) 112 that, when executed by CPU 108, perform the functions described herein of the integer value counter 142.
  • the CPU 108 includes a swipe gesture interpreter 144.
  • some or all of the swipe gesture interpreter 144 may be implemented in computer software as computer-executable instructions stored on volatile and/or non-volatile storage medium(s) 112 that, when executed by CPU 108, perform the functions described herein of the swipe gesture interpreter 144.
  • the storage medium(s) 112 may be processor-readable storage media which may comprise any combination of computer storage media including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Combinations of any of the above should also be included within the scope of processor-readable storage media.
  • the storage medium(s) 112 may include system memory which includes computer storage media in the form of volatile and/or nonvolatile memory such as read-only memory (ROM) and random access memory (RAM).
  • RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by CPU 108.
  • Figure 1 illustrates software modules 114 including an operating system, application programs and other program modules that implement the processes and methods described herein.
  • the mobile device 100 may also include other removable/non- removable, volatile/nonvolatile computer storage media drives.
  • the storage medium(s) 112 may include a hard disk drive or solid state storage drive that reads from or writes to non-removable, nonvolatile media, an SSD that reads from or writes to a removable, nonvolatile SSD, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a DVD-RW or other optical media.
  • a user may enter commands and information into the mobile device 100 through touch screen display 104 or the one or more other input device(s) 110 such as a keypad, keyboard, tactile buttons, camera, motion sensor, position sensor, light sensor, biometric data sensor, accelerometer, or a pointing device, commonly referred to as a mouse, trackball or touch pad.
  • Other input devices of the mobile device 100 may include a microphone, joystick, thumbstick, game pad, optical scanner, other sensors, or the like.
  • the touch screen display 104 or the one or more other input device(s) 110 may include sensitivity to swipe gestures, such as a user dragging a finger tip across the touch screen display 104.
  • the sensitivity to swipe gestures may include sensitivity to direction and/or distance of the swipe gesture.
  • these and other input devices are often connected to the CPU 108 through a user input interface that is coupled to the system bus 124, but may be connected by other interface and bus structures, such as a parallel port, serial port, wireless port, game port or a universal serial bus (USB).
  • a unique software driver stored in software 114 configures each input mechanism to sense user input, and then the software driver provides data points that are acted on by CPU 108 under the direction of other software 114.
  • the display is also connected to the system bus 124 via an interface, such as the graphics engine 106.
  • the mobile device 100 may also include other peripheral output devices such as speakers, a printer, a projector, an external monitor, etc., which may be connected through one or more analog or digital I/O ports 116, network interface(s) 120 or wireless receiver(s) and transmitter(s) 118.
  • the mobile device 100 may operate in a networked environment using logical connections to one or more remote computers or devices.
  • when used in a LAN or WAN networking environment, the mobile device 100 may be connected via the wireless receiver(s) and transmitter(s) 118 and network interface(s) 120, which may include, for example, cellular receiver(s) and transmitter(s), Wi-Fi receiver(s) and transmitter(s), and associated network interface(s).
  • when used in a WAN networking environment, the mobile device 100 may include a modem or other means as part of the network interface(s) for establishing communications over the WAN.
  • the wireless receiver(s) and transmitter(s) 118 and the network interface(s) 120 may be communicatively connected to the system bus 124.
  • program modules depicted relative to the mobile device 100, or portions thereof, may be stored in a remote memory storage device of a remote system.
  • the mobile device 100 has a collection of I/O ports 116 and/or short range wireless receiver(s) and transmitter(s) 118 and network interface(s) 120 for passing data over short distances to and from the mobile device 100 or for coupling additional storage to the mobile device 100.
  • serial ports, USB ports, Wi-Fi ports, Bluetooth® ports, IEEE 1394 (i.e., FireWire) ports, and the like can be included in the mobile device 100.
  • Compact Flash (CF) or Secure Digital (SD) ports can couple a memory device to the mobile device 100 for reading and writing by the CPU 108, or can couple the mobile device 100 to other communications interfaces such as Wi-Fi or Bluetooth transmitters/receivers and/or network interfaces.
  • Mobile device 100 also has a power source 122 (e.g., a battery).
  • the power source 122 may supply energy for all the components of the mobile device 100 that require power when a traditional, wired or wireless power source is unavailable or otherwise not connected.
  • Other various suitable system architectures and designs of the mobile device 100 are contemplated and may be utilized which provide the same, similar or equivalent functionality as those described herein.
  • the various techniques, components and modules described herein may be implemented in connection with hardware, software and/or firmware or, where appropriate, with a combination of such.
  • the methods and apparatus of the disclosure, or certain aspects or portions thereof may take the form of program code (i.e., instructions) embodied in tangible media, such as various solid state memory devices, DVD-RW, RAM, hard drives, flash drives, or any other machine-readable or processor-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a processor of a computer, vehicle or mobile device, the machine becomes an apparatus for practicing various embodiments.
  • such vehicles or mobile devices generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an API, reusable controls, or the like.
  • Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system of mobile device 100.
  • the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
  • Figure 2 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1.
  • the device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 of a character menu 240, a plurality of selection buttons 110 and a spacebar button 264, which together make up a user interface 150 of the device 100.
  • Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is a reference 258 and an offset scale 260.
  • the display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1.
  • the CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1.
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
  • the positions 242 of the menu 240 are arranged in a one-dimensional array similar to the embodiment in Figure 8 of Patent No. 8,487,877, except that the menu 240 and corresponding selection buttons 110 are shown on the display 104 instead of as physical features of the user interface 150.
  • the buttons 110 are communicatively coupled with the CPU 108.
  • the menu 240 and the offset scale 260 are positioned in respective one-dimensional arrays in the user interface region 150 of the device 100.
  • the character menu 240 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another.
  • the character menu 240 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100.
  • positions 242 of the menu 240 are distributed in a one-dimensional array in evenly spaced increments.
  • values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that match the increment of the menu 240, so that by referencing the offset scale 260 to the menu 240, characters 200 in the menu are effectively numbered.
  • the reference 258 is an indicator located near or on one of the positions 242 of the menu 240.
  • the offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240. Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero. In an alternative embodiment, the increment of the offset scale 260 is 10 and positions 242 of the menu 240 are marked off in corresponding units of 10.
  • the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240, and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of the menu 240 according to how many positions a given position 242 of the menu 240 is offset from the reference 258.
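The offset-scale labeling described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the 13-character menu 'a' through 'm' and the placement of the reference 258 at the center position (so offsets run from -6 to +6) are assumptions chosen to match the embodiment of Figure 2.

```python
# Hypothetical 13-position menu; the reference (offset 0) sits at the center.
MENU = list("abcdefghijklm")
REFERENCE_INDEX = len(MENU) // 2  # the position the offset scale labels as 0

def character_at_offset(offset: int) -> str:
    """Return the menu character whose position is `offset` steps from the reference."""
    index = REFERENCE_INDEX + offset
    if not 0 <= index < len(MENU):
        raise ValueError(f"offset {offset} is outside the menu")
    return MENU[index]
```

With these assumptions, `character_at_offset(0)` returns 'g', `character_at_offset(-3)` returns 'd', and `character_at_offset(+5)` returns 'l': the offset value effectively numbers the menu positions relative to the reference.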
  • the plurality of selection buttons 110 lie on the display 104 of the user interface 150 of the device 100.
  • the buttons 110 are arranged in a row that corresponds to the physical alignment of the menu 240 on the user interface.
  • Each button is communicatively coupled with the CPU 108 and is assigned a button press value 222.
  • the assigned button press value 222 can be either positive or negative.
  • Each button 110 has the function that when the button is pressed the value 222 assigned to the button is input to the CPU 108.
  • each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is input to the CPU 108 along with identification of one or more mathematical operators and the assigned value 222 of one or more neighboring selection buttons.
  • each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is doubled, then increased by one, and then input to the CPU 108.
  • the assigned button press value 222 of each selection button is unique.
  • there are four selection buttons and the buttons' assigned values are -3, -2, +2, and +3.
  • there are four selection buttons and the buttons' assigned values are -3, -1, +1, and +3.
  • there are five selection buttons and the buttons' assigned values are -3, -2, 0, +2, and +3.
  • the one or more mathematical operators identified by the swipe gesture are addition and/or subtraction.
  • the identified math operators are specified by the direction of the swipe gesture.
  • the identified math operators are specified by the distance (or length) of the swipe gesture.
  • an assigned integer value of a neighboring button is mathematically combined with the integer value identified by the button press, wherein the neighboring button and mathematical operator are identified by the direction of the swipe.
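The combination of a pressed button's value with a neighboring button's value can be sketched as below. This is an illustrative sketch only, assuming the four-button layout with assigned values -3, -2, +2, +3, with the negative-side buttons to the left of the reference; the function name is hypothetical. The neighbor used is the one lying in the swipe direction, an 'inward' swipe subtracting and an 'outward' swipe adding, as described above.

```python
# Buttons in on-screen order, left to right; negative values sit left of the reference.
BUTTON_VALUES = [-3, -2, +2, +3]

def press_and_swipe_value(pressed_index: int, direction: str) -> int:
    """Combine the pressed button's value with the value of the neighbor
    lying in the swipe direction: 'inward' subtracts it, 'outward' adds it."""
    value = BUTTON_VALUES[pressed_index]
    on_left = value < 0                      # negative-side buttons sit left of center
    toward_center = direction == "inward"
    step = 1 if on_left == toward_center else -1  # index step toward the swiped neighbor
    neighbor = BUTTON_VALUES[pressed_index + step]
    return value - neighbor if toward_center else value + neighbor
```

For example, pressing -3 and swiping inward yields -3 - (-2) = -1, while pressing +2 and swiping outward yields +2 + 3 = +5; both are positions a single button actuation cannot reach.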
  • the spacebar 264 also lies in the user interface region 150 of the device 100, can be either a hard or soft key, and is communicatively coupled with the CPU 108.
  • the menu 240 has 13 menu positions 242 and the plurality of selection buttons includes four buttons with the assigned button press values 222: '-3, -2, +2, +3' .
  • the menu positions 242 are populated by 13 of the 26 characters 200 of the English alphabet.
  • FIG. 3 shows graphical representations of examples of four button press types (BPTs) 224.
  • Button press types are a means of classifying a received button press.
  • a button press is classified according to a measured duration 208 and a swipe distance 372.
  • button press classification is further described in U.S. Provisional Patent Application No. 62/155,372, filed April 30, 2015, entitled SYSTEMS AND METHODS FOR WORD IDENTIFICATION THAT USE BUTTON PRESS TYPE ERROR ANALYSIS.
  • a button press type is assigned a mathematical operator.
  • the interpretation of a button press type determines the character selected from a menu according to the character's position in the menu and the mathematical operator assigned to the button press type.
  • Each example button press type 224 of Figure 3 has a first and second horizontal bar 326, 331.
  • the first horizontal bar 326 represents the passage of time.
  • the second horizontal bar 331 represents the distance (or length) of a positional translation.
  • a black region 327 within the bar indicates a time period when a button is pressed.
  • a white region 328 indicates a time period when a button is not pressed.
  • a solid vertical marker 329 indicates the beginning or end of an elapsed time period (ETP) metric 330.
  • the elapsed time period 330 is of selectable length and commences with the onset of a button press.
  • the length of the black region 327 indicates the duration 208 of the button press.
  • the duration of the button press is compared with the elapsed time period metric 330. In one embodiment, the selected length of the elapsed time period 330 is 0.10 seconds.
  • the black region 327 within the bar indicates a distance (or length) 372 of a positional translation of a button press along the touch sensitive screen 104.
  • the positional translation is measured starting from the position of the button press at the onset of the press.
  • the white region 328 within the bar indicates a distance from the starting position that the positional translation of the button press has not reached.
  • the solid vertical markers 329 indicate the beginning or end of a swipe distance metric 373.
  • the swipe distance metric 373 is of selectable length and commences at the position of the button press on the screen 104 at the button press' onset.
  • the distance (or length) 372 of a positional translation of a button press is compared with the swipe distance metric 373.
  • the selected length of the swipe distance metric 373 is 20 pixels.
  • a device 100 monitors the duration 208 and swipe distance 372 of a button press concurrently. In other words, the device measures the duration 208 of the button press while the swipe gesture interpreter 144 measures its swipe distance 372.
  • the device monitors the duration 208 starting at the onset of a button press and monitors the swipe distance 372 starting at the position of the button press at the onset of the press, so that the measurement of duration and distance begin simultaneously.
  • the button press types 224 of Figure 3 are distinguished from one another by (1) the button press duration 208 with respect to the ETP metric 330 and (2) the swipe distance 372 with respect to the swipe distance metric 373.
  • both the duration 208 and the swipe distance 372 are less than their respective metrics.
  • the duration 208 exceeds its metric while the swipe distance 372 does not.
  • a second button press is received before the ETP 330 expires, while the swipe distance for neither button press exceeds the swipe distance metric 373.
  • the swipe distance 372 exceeds its metric while the duration 208 does not.
  • the rate of the positional translation must be fast enough so that the swipe distance 372 exceeds the distance metric 373 before the elapsed time period 330 expires. Otherwise, the gestures could be interpreted as the short BPT 340.
  • the elapsed time period equals 0.1 sec and the swipe distance metric equals 20 pixels. In this case, the rate of the positional translation must be at least 20 pixels / 0.1 sec, i.e., 200 pixels per second.
  • a BPT not disclosed in Figure 3, but disclosed later on, is one where both the duration 208 and the swipe distance 372 exceed their respective metrics.
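The classification scheme above can be sketched as a single comparison of the two measurements against their metrics. This is a simplified sketch: the 0.10-second and 20-pixel thresholds are the example values from the text, the constant and type names are hypothetical, the double-press type (which requires a second press event) is omitted, and the both-exceed type left undisclosed in Figure 3 is labeled 'LONG-SWIPE' here purely for illustration.

```python
ETP_SECONDS = 0.10   # elapsed time period metric 330 (example value)
SWIPE_PIXELS = 20    # swipe distance metric 373 (example value)

def classify_button_press(duration_s: float, swipe_px: float) -> str:
    """Classify one button press by its duration and its swipe distance,
    each compared against its respective metric."""
    long_press = duration_s >= ETP_SECONDS
    swiped = swipe_px >= SWIPE_PIXELS
    if swiped:
        return "LONG-SWIPE" if long_press else "SWIPE"
    return "LONG" if long_press else "SHORT"
```

A quick press with no translation classifies as SHORT; holding past the metric gives LONG; a fast translation past 20 pixels before 0.1 s gives SWIPE.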
  • Figure 4 shows an embodiment of a portion of the user interface 150.
  • the portion shown includes the selection buttons 110 implemented as on-screen buttons on the touch display screen 104.
  • An example of an executed button press is represented by a button press position 321 on a particular selection button 111. From the position 321 of the button press the CPU 108 determines (1) which of the selection buttons 110 is selected (in this example, the button with assigned integer value -3) and (2) an initial point from which any positional translation of the button press is measured. Also, as described in U.S. Patent No. 8,487,877, the onset of the button press triggers the elapsed time counter 140 to start, which the CPU 108 uses to distinguish time-dependent BPTs.
  • the CPU 108 measures the duration 208 of the button press (as shown in Figure 3). Furthermore, the swipe gesture interpreter 144 of the CPU 108 monitors a present position 323 of the button press relative to the initial position 321. In one embodiment, the gesture interpreter 144 monitors the distance (or length) 372 and a direction 374 of any positional translation that occurs.
  • the swipe gesture interpreter 144 has programmable thresholds that (1) define whether the distance 372 of a given positional translation constitutes a swipe gesture and (2) classifies the direction 374 of a positional translation, if one occurs.
  • a positional translation from the button press position 321 must occur within the bounds of a given sweep 375 to be considered a swipe gesture.
  • a positional translation must extend a minimum distance 373 from the initial position 321 to be considered a swipe gesture.
  • the two previously mentioned conditions must both be met for the positional translation to be considered a swipe gesture.
  • the threshold for the minimum translation is the boundary of the button itself, or some position relative to the button's boundary.
  • the distance 373 is measured in mm but in an alternative embodiment the distance is measured in pixels. In one embodiment the minimum distance 373 is 15 pixels, which for an example button size of 60 x 60 pixels is a translation of 25% of the button's width.
  • there are multiple sweeps 375 within which a positional translation could become classified.
  • the particular sweep 375 that a position translation falls within defines a direction 374 of the swipe gesture.
  • positional translations occurring within the opposing sweeps 376, 377 are classified as having a direction 374 opposite to one another, even if the exact positional translations that lead to their respective classification are not precisely parallel.
  • a sweep 375 spans 90 degrees, but in alternative embodiments the span of the sweep is less or more.
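The sweep-based direction test above can be sketched as follows. This is an illustrative sketch under stated assumptions: four 90-degree sweeps centered on the compass directions, the 15-pixel minimum distance from the example in the text, a conventional angle system (0° = rightward, counterclockwise positive), and hypothetical names throughout.

```python
import math
from typing import Optional

MIN_DISTANCE_PX = 15  # minimum translation 373 required to count as a swipe (example value)

def classify_swipe(dx: float, dy: float) -> Optional[str]:
    """Return the sweep ('right', 'up', 'left', 'down') that contains the
    positional translation (dx, dy), or None if it is too short to be a swipe."""
    if math.hypot(dx, dy) < MIN_DISTANCE_PX:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360   # 0 deg = right, CCW positive
    sweeps = ["right", "up", "left", "down"]         # four 90-degree sweeps
    return sweeps[int(((angle + 45) % 360) // 90)]
```

Note that two translations landing in the same sweep classify identically even when they are not precisely parallel, matching the opposing-sweep behavior described above.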
  • Figure 5 shows another embodiment of the user interface 150.
  • the embodiment includes the selection buttons 110, the menu 240 and the values of the offset scale 260.
  • the menu 240 further includes individual character positions 242 and the reference 258, all as previously described in Figure 2.
  • the menu 240 is one-dimensional. In other words, positions 242 of the menu are arranged in a row, similar to elements of a one-dimensional array. Consistent with a one-dimensional array, each menu position is identified by an index, which in one embodiment is a value of the offset scale 260.
  • a menu position 242 can be specified for selection by its index. Furthermore, a change in a selected menu position to a neighboring position of the menu is considered a movement of the selection. In one embodiment, movement of a selection from a menu position farther from the reference 258 to a neighboring menu position closer to the reference is a movement in an 'inward' direction 374. Furthermore, movement of a selection from a menu position closer to the reference 258 to a neighboring menu position farther from the reference is a movement in the 'outward' direction 374. In yet a further embodiment, the possible directions 374 of the swipe gestures of Figure 4 correspond with the possible directions 374 of movement of selections within the menu 240 of Figure 5.
  • the directions 374 of the swipe gestures of Figure 4 each have an associated math operator.
  • Figure 6 shows a table that identifies mathematical operators 181 and their correspondence to a direction 374 of a swipe gesture.
  • an 'outward' swipe corresponds to an 'addition' math operator and an 'inward' swipe corresponds to a 'subtraction' math operator.
  • more than two swipe directions are possible, in which case each direction has a corresponding math operator.
  • Figure 7 shows a table that discloses one possible relationship between (1) the assigned value 222 of the pressed selection button, (2) the direction 374 of the swipe gesture, (3) the math operator 181 specified by the swipe, and (4) the menu position 242 identified as a result of the math operation.
  • the button press that initiates a swipe gesture also specifies for the CPU 108 two other pieces of information: (1) the selection button pressed, and thereby the assigned integer value 222 that the CPU stores in the integer value counter 142, and (2) the orientation (and consequently the direction 374) of a swipe gesture if one is interpreted, i.e., 'inward' or 'outward' depending on the selection button identified and the relationships of Figure 5.
  • if the swipe gesture interpreter 144 does interpret a swipe gesture, then a math operator 181 is specified according to the interpreted input for (1) and (2) above, and the table of Figure 6.
  • a button press 321 on the selection button with assigned value -3 followed by a swipe gesture in the direction of menu position -2 according to Figure 5 is interpreted as a swipe in the 'inward' direction 374.
  • the 'inward' swipe is then interpreted according to the table of Figure 6 as specifying the 'subtraction' math operator 181.
  • a button press 321 on the selection button with assigned value +2 followed by a swipe gesture in the direction of menu position +3 according to Figure 5 is interpreted as a swipe in the 'outward' direction 374.
  • the 'outward' swipe is interpreted according to the table of Figure 6 as specifying the 'addition' math operator 181.
  • the table of Figure 7 also shows a further relationship where the value of the button that neighbors the pressed button is mathematically combined with the value of the pressed button according to (1) the direction 374 of the swipe gesture, and (2) the math operator 181 specified by the swipe gesture.
  • the integer value 226 of the button neighboring the pressed selection button is mathematically combined with the value 222 of the pressed selection button according to the math operator 181 specified by the direction of the swipe gesture.
  • a button press and swipe gesture together specify positions 242 of the menu 240 that cannot be specified by the actuations of a single button.
  • the positions -5, -1, +1 and +5 are examples where a button press and a swipe gesture identify positions that cannot be identified with the actuations of a single selection button (in this embodiment, the values -3, -2, +2 and +3).
  • Figure 7 discloses other examples of button presses and swipe gestures that can be combined to specify positions 242 of the menu for the embodiments of Figures 4-6.
  • the integer value 242 calculated from the math operation selects a character 200 of the menu 240 by specifying the offset value that identifies the character's position in the menu.
  • the first example 190 specifies the character 'f' and the second example 191 specifies the character 'l'.
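The two examples can be traced numerically. This sketch assumes a hypothetical 13-character menu 'a' through 'm' with 'g' at offset 0, which makes the arithmetic of the button-press-plus-swipe combinations land on the characters at offsets -1 and +5.

```python
MENU = list("abcdefghijklm")   # hypothetical 13-position menu
CENTER = len(MENU) // 2        # reference 258 sits at offset 0

def char_for(offset: int) -> str:
    """Look up the menu character at a given offset from the reference."""
    return MENU[CENTER + offset]

# Example 190: press -3, swipe 'inward' toward -2  ->  -3 - (-2) = -1
first = char_for(-3 - (-2))
# Example 191: press +2, swipe 'outward' toward +3 ->  +2 + 3 = +5
second = char_for(+2 + 3)
```

Under these assumptions `first` is 'f' and `second` is 'l', the characters one and five positions from the reference respectively.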
  • Figure 8 shows another embodiment of a portion of the user interface 150.
  • the portion shown includes the selection buttons 110 implemented as on-screen buttons on the touch display screen 104.
  • An example of an executed button press is represented by a button press position 321 on a particular selection button 111. From the position 321 of the button press the CPU 108 determines (1) which of the selection buttons 110 is selected (in this example, the button with assigned integer value -3) and (2) an initial point from which any positional translation of the button press is measured. Also, as described in U.S. Patent No. 8,487,877, the onset of the button press triggers the elapsed time counter 140 to start, which the CPU 108 uses to distinguish time-dependent BPTs.
  • the CPU 108 measures the duration 208 of the button press (as shown in Figure 3). Furthermore, the swipe gesture interpreter 144 of the CPU 108 monitors the present position 323 of the button press relative to the initial position 321. In an embodiment alternative to the one of Figure 4, the gesture interpreter 144 monitors a direction 374 of any positional translation that occurs and the intersection of the positional translation with any boundary 324 of any selection button 110.
  • the initial button press 321 causes the integer value counter 142 to store the value -3.
  • Intersection of the positional translation with the button -2 by an 'inward' gesture subtracts the value -2 from the stored value -3.
  • Intersection of the positional translation with the button 0 by a continued 'inward' gesture subtracts the value 0 from the stored value -1.
  • Intersection of the positional translation with the button +2 is an 'outward' gesture and adds the value +2 to the stored value -1.
  • Intersection of the positional translation with the button +3 by a continued 'outward' gesture adds the value +3 to the stored value +1.
  • the integer value counter 142 stores the value +4.
  • the assigned values of the selection buttons and the mathematical operators associated with the direction of the position translation could be different than those given in the example.
  • Figure 9 shows a table that discloses one possible set of relationships between (1) the assigned integer value 222 that an initial button press 321 selects, (2) the assigned integer value 227 of a pressed selection button 110 where a translation of the initial button press ends, (3) the math operators 181 identified by the direction 374 of the positional translation at each intersection of the translation with a subsequent button, (4) the accumulated math operations executed by the integer value counter 142, and (5) the integer value calculated as a result of the accumulated math operations, which in a further embodiment specifies a position 242 of the character selection menu 240.
  • the integer value calculated from the accumulated math operations selects a character 200 of the menu 240 by specifying the offset value that identifies the character's position in the menu.
  • the button press and swipe gesture of Figure 8 specify the character 'k', according to one embodiment.
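The running calculation described for Figure 8 can be sketched as a fold over the buttons the swipe crosses. This is a simplified sketch: the per-intersection direction logic is condensed into an explicit list of (direction, value) crossings, and names are hypothetical.

```python
def accumulate_swipe(initial_value, crossings):
    """Accumulate math operations as a swipe crosses neighboring buttons.

    `crossings` is a sequence of (direction, button_value) pairs in the order
    the swipe intersects button boundaries; an 'inward' crossing subtracts the
    button's value and an 'outward' crossing adds it.
    """
    total = initial_value
    for direction, value in crossings:
        total = total - value if direction == "inward" else total + value
    return total

# The Figure 8 example: press -3, then cross -2 and 0 inward, then +2 and +3 outward.
result = accumulate_swipe(-3, [("inward", -2), ("inward", 0),
                               ("outward", +2), ("outward", +3)])
```

Tracing it: -3 - (-2) = -1, then -1 - 0 = -1, then -1 + 2 = +1, then +1 + 3 = +4, so `result` is +4, the offset whose menu position holds 'k' in the example.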
  • Figure 10 shows a flowchart of an embodiment of a method 504 for a user to specify a character from among a plurality of characters.
  • a user views the characters 200 displayed in the menu 240.
  • the user selects a character from the menu 240 for input to the electronic device 100.
  • the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258.
  • the user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
  • the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
  • in a step 538, the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires.
  • the aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press.
  • in a step 520, the user waits for the elapsed time counter 140 to expire before, in an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals twice the assigned button press value 222 of any selection button 110.
  • in a step 540, the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires.
  • the aforementioned step 540 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a LONG press.
  • in an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • the user determines if the value that identifies the position of the selected character is the sum or the difference of the assigned button press values of any two adjacently positioned selection buttons.
  • in a step 550, the user presses one of the two selection buttons and swipes from the pressed selection button toward the other button.
  • one button is positioned closer to an associated reference, the other button is positioned farther from the associated reference, and the user presses the button closer to the associated reference and swipes in an outward direction.
  • the button assigned the smaller value is positioned closer to an associated reference, the button assigned the larger value is further from the reference, and the user swipes in an outward direction with respect to the reference.
  • in a step 552, the user presses one of the two selection buttons and swipes from the pressed selection button toward the other.
  • one button is positioned closer to an associated reference, the other button is positioned farther from the associated reference, and the user presses the button farther from the associated reference and swipes in an inward direction.
  • the button assigned the smaller value is positioned closer to an associated reference, the button assigned the larger value is further from the reference, and the user swipes in an inward direction with respect to the reference.
  • the button press that initiates the swipe gesture inputs the assigned value 222 of the pressed selection button 110 to the integer value counter 142 and triggers the CPU 108 to start the elapsed time counter 140.
  • a swipe gesture greater than some predetermined distance before the elapsed time counter expires causes the addition or subtraction operation by the integer value counter 142 to occur.
  • the math operation is determined by the direction 374 of the swipe.
  • the value added or subtracted is determined by the assigned integer value 226 of the button neighboring the pressed selection button in the direction of the swipe.
  • Interpretation of a swipe gesture indicates to the processor that the type of button press is SWIPE 357.
  • the direction 374 determines if the type is SWIPE-OUT or SWIPE-IN.
  • the CPU 108 may also terminate the elapsed time counter 140 before expiration once the BPT is known to be SWIPE 357.
  • the user views the specified character on the display 104, which is an optional step and in an alternative embodiment is bypassed.
  • the character specification method 504 described above is used iteratively to specify a series of characters from the character menu 240.
  • words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
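The swipe arithmetic of steps 550 and 552 can be sketched as follows. This is a minimal illustration assuming the four-button layout of Figure 2 (values -3, -2, +2, +3 ordered outward from a central reference); the function names are hypothetical, not part of the disclosed method.

```python
# Sketch of the swipe arithmetic of method 504 (illustrative assumptions:
# four buttons valued -3, -2, +2, +3, smaller magnitudes nearer the reference).
# A swipe-out adds the outward neighbor's value; a swipe-in subtracts the
# inward neighbor's value.

BUTTONS = [-3, -2, +2, +3]  # left-to-right; the reference sits between -2 and +2

def outward_neighbor(value):
    """Neighbor one step farther from the reference, or None at the edge."""
    idx = BUTTONS.index(value)
    if value < 0:
        return BUTTONS[idx - 1] if idx - 1 >= 0 else None
    return BUTTONS[idx + 1] if idx + 1 < len(BUTTONS) else None

def inward_neighbor(value):
    """Neighbor one step closer to the reference, or None if innermost."""
    idx = BUTTONS.index(value)
    if value < 0:
        return BUTTONS[idx + 1] if BUTTONS[idx + 1] < 0 else None
    return BUTTONS[idx - 1] if BUTTONS[idx - 1] > 0 else None

def swipe_value(pressed, direction):
    """Total button press value produced by a swipe gesture."""
    if direction == "out":
        return pressed + outward_neighbor(pressed)   # addition on swipe-out
    return pressed - inward_neighbor(pressed)        # subtraction on swipe-in
```

For example, a press on -2 swiped outward yields (-2) + (-3) = -5, while a press on -3 swiped inward yields (-3) - (-2) = -1, matching the sum/difference rule stated above.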
  • FIG. 11 shows a flowchart of an embodiment of a method 611 for the processor 108 of an electronic device to interpret button presses.
  • the CPU 108 initializes the integer value counter 142 to zero.
  • the CPU 108 initializes the elapsed time counter 140 to zero.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a first selection button press occurs, in another step 616, the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the first pressed selection button 110.
  • the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with a user chosen selectable-length time period.
  • the CPU 108 determines if the first button press is still pressed.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • the processor multiplies the value of the integer value counter 142 by two before commencing the subsequent step 624, where the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • step 626 the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the second pressed selection button. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • swipe gesture interpreter 144 determines the direction 374 of the swipe gesture. If in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • if in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • the CPU 108 reinitializes the integer value counter 142 and the elapsed time counter 140 to zero and repeats the method. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
  • math operations other than addition, subtraction and multiplication-by-two are used in steps 626, 642, 688 and 690 to identify characters by their numerical position in a menu, array or table.
  • the particular button neighboring the pressed selection button is in an alternative direction from the embodiment disclosed or is identified by an alternative input than the direction of the swipe gesture.
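The five button press types of method 611 and their math operations (identity, multiply-by-two, addition, subtraction) reduce to a small dispatch, sketched below. The signature and string labels are illustrative assumptions paraphrasing steps 624, 626, 642, 688 and 690, not the claimed implementation.

```python
# Hedged sketch of the value computation in method 611 (Figure 11).
def total_press_value(first_value, bpt, second_value=None, neighbor=None):
    """Integer value counter result for each button press type (BPT)."""
    if bpt == "short":      # released before the elapsed time period expires
        return first_value
    if bpt == "long":       # held past the ETP: multiply by two (step 642)
        return first_value * 2
    if bpt == "pair":       # second button pressed: add its value (step 626)
        return first_value + second_value
    if bpt == "swipe-out":  # add the neighbor in the swipe direction (step 688)
        return first_value + neighbor
    if bpt == "swipe-in":   # subtract that neighbor (step 690)
        return first_value - neighbor
    raise ValueError(bpt)
```

The resulting value is then used as the menu position 242 in step 624.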
  • Figure 12 shows the user interface 150 of Figure 2, a table 185 of value assignments for variables of the method 611 of Figure 11, and a list 186 of input variables for the method 611.
  • the user interface 150, table 185, and list 186 are examples used to demonstrate the embodiments of Figures 10 and 11.
  • the scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
  • the table 185 is divided into rows and columns. Rows are grouped by path as identified in Figure 11. Each column is one variable: the variable 'input gesture' 210, the variable 'duration' 208, the variable 'swipe direction' 374, the variable 'button press type' 224, the variable 'button press values' 222, the variable 'math operator' 181, a variable 'total button press value' 228 and the variable 'character' 200.
  • Path 1 is the short BPT 340
  • Path 2 is the long BPT 345
  • Path 3 is the pair BPT 350
  • Path 4 is a swipe-out BPT 358
  • Path 5 is a swipe-in BPT 359.
  • values for the variables 'input gesture' 210, 'duration' 208 and 'direction' 374 align with the values for 'BPT' 224 as dictated by the method 611 of Figure 11: when 'gesture' is 'single press' and 'duration' is '< ETP', then the BPT is short; when 'gesture' is 'single press' and 'duration' is '> ETP', then the BPT is long; when 'gesture' is 'pair press', then the BPT is pair; when 'gesture' is 'swipe' and 'direction' is 'outward', then the BPT is 'swipe out'; and when 'gesture' is 'swipe' and 'direction' is 'inward', then the BPT is 'swipe in'.
  • Values for the variable 'button press values' 222 are the assigned button press values 222 (or possible combinations of the assigned values 222) of the user interface 150.
  • the button press values 222 are single values.
  • the button press values 222 are possible combinations of two of the values 222.
  • the variable 'button press value' 222 identifies for the processor 108 the particular selection button 110 pressed along with the second button pressed if the BPT is 'pair' or the value identified by the direction of the swipe gesture, if a swipe gesture occurs.
  • values for the variable 'button press values' 222 are -3, -2, +2, +3, and possible combinations of two of those values.
  • Values for the variable 'math operator' 181 are the operators assigned to each specific BPT 234.
  • the operator is '×2'.
  • the operator is '+'.
  • the operator is '+'.
  • the operator is '-'.
  • Values for the variable 'total button press value' 228 are the values that result from the mathematical operations of steps 624, 626, 642, 688 and 690 of the method 611 of Figure 11. As the method of Figure 11 shows, the path taken determines which math operation becomes implemented. As a result, values of the variable 'total button press value' 228 depend on both the variables 'button press values' 222 and 'button press type' 224. For the short BPT 340, the 'total button press value' 228 equals the 'button press value' 222. For the long BPT 345, the 'total button press value' 228 equals two times the 'button press value' 222.
  • the 'total button press value' 228 equals the sum of the 'button press values' 222.
  • the 'total button press value' 228 equals the sum of the 'button press values' 222.
  • the 'total button press value' 228 equals the difference between the 'button press values' 222.
  • Values for the variable 'character' 200 are the characters available in the menu 240 of the user interface 150. Each character is identified by its position 242 in the menu 240, as previously disclosed in U.S. Patent No. 8,487,877. As described in step 624 of the method 611 of Figure 11, the processor 108 interprets characters 200 of the menu 240 by their position 242. Values for the variable 'total button press value' 228 identify that menu position.
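Under the Figure 2 assignments, the set of reachable 'total button press values' in table 185 can be enumerated mechanically. This sketch assumes the button values -3, -2, +2, +3 and that swipe gestures combine only adjacent buttons on the same side of the reference; both are assumptions drawn from the surrounding description.

```python
# Enumerate every total button press value reachable by the five BPTs.
from itertools import combinations

VALUES = [-3, -2, +2, +3]

def all_totals():
    totals = set(VALUES)                                   # short: value itself
    totals |= {2 * v for v in VALUES}                      # long: value doubled
    totals |= {a + b for a, b in combinations(VALUES, 2)}  # pair: sum of two
    totals |= {-2 + -3, +2 + +3}                           # swipe-out: adjacent sums
    totals |= {-3 - -2, +3 - +2}                           # swipe-in: adjacent differences
    return sorted(totals)
```

The enumeration yields exactly the thirteen values -6 through +6, consistent with the value list given for the variable 'total button press value' 228 below.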
  • the list 186 shows explicitly which variables of the method 611 of Figure 11 are input variables.
  • Input variables are variables that require input from a user.
  • the input variables are: (1) 'button press values' 222, (2) 'input gesture' 210, (3) 'duration' 208 and (4) 'swipe direction' 374.
  • Figure 13 shows a first flowchart of variables and a second flowchart of example values for each variable of the method 611 of Figure 11 and the user interface 150 of Figure 2.
  • the first flowchart shows the four input variables for the method of Figure 11: (1) 'button press values' 222, (2) 'input gesture' 210, (3) 'duration' 208 and (4) 'swipe direction' 374.
  • variables 'input gesture' 210, 'duration' 208 and 'direction' 374 together determine the 'button press type' 224, as disclosed by the method 611 of Figure 11.
  • the variables 'button press values' 222 and 'button press type' 224 together determine the 'total button press value' 228, which follows step 624, 626, 642, 688 or 690 of the method 611 of Figure 11.
  • the variable 'total button press value' 228 determines the 'character' 200 which is based on the menu 240 of the user interface 150 of Figure 2.
  • the second flowchart shows example values for each variable of the method 611 of Figure 11 and the embodiment of the user interface 150 of Figure 2.
  • the variable 'button press values' 222 has the values '-3, -2, +2 or +3' 223 (or combinations of them).
  • the variable 'input gesture' 210 has the values 'single', 'pair' or 'swipe' 211.
  • the variable 'duration' 208 has the values '< ETP' or '> ETP' 209.
  • the variable 'swipe direction' 374 has the values 'inward' or 'outward' 375.
  • the variable 'button press type' 224 has the values 'short', 'long', 'pair', 'swipe-out' or 'swipe-in' 225.
  • the variable 'total button press value' 228 has the values '-6, -5, -4, -3, -2, -1, 0, +1, +2, +3, +4, +5, or +6' 229.
  • the variable 'character' 200 has the values 'a, b, c, d, e, f, g, h, i, j, k, l, or m' 201.
  • the values of the second flowchart are examples used to demonstrate the embodiments of Figures 2 and 11. The scope of the invention is not limited by the variables and particular values shown here, but rather by the scope of the claims.
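Reading the two example value lists together suggests a direct lookup from 'total button press value' to 'character'. The sketch below assumes the thirteen totals -6 through +6 populate the menu in order with 'a' through 'm'; this ordering is an inference consistent with the Figure 14 example ('f' at -1, 'a' at -6, 'd' at -3), not an explicit statement in this excerpt.

```python
# Assumed menu: positions -6..+6 populated in order with 'a'..'m'.
MENU = dict(zip(range(-6, 7), "abcdefghijklm"))

def character_for(total_press_value):
    """Character at the menu position equal to the total button press value."""
    return MENU[total_press_value]
```

For example, a short press on -3 selects the character at position -3, and a long press on -3 selects the character at position -6.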
  • Figures 14 and 15 show examples of how characters of a word 130 derive from the variables of the method 611 of Figure 11 and the user interface 150.
  • the word 130 is 'fad' .
  • Rows of a table show values for each of the variables 'character' 200, 'menu position' 242, 'button press values' 222 and 'button press type' 224.
  • Values for the variable 'character' 200 derive directly from the word 130.
  • Values for the variable 'menu position' 242 derive from the position of each character 200 in the menu 240 according to the user interface 150.
  • Values for the variable 'button press values' 222 derive from the values for 'menu position' 242 and steps of the method 611 of Figure 11 taken in reverse.
  • the button press value can only be -3.
  • no combination of one or two button press values 222 except -3 can produce the value -3.
  • the math operation can be either (-3) - (-2) or (-3) + 2.
  • either a 'pair' BPT can be executed to specify the '+' operator or a 'swipe-in' BPT to specify the '-' operator.
  • the 'swipe-out' BPT is not a possibility because although the gesture does select the '+' operator, the required integer values to make -1 are not assigned to neighboring selection buttons.
  • Values for the variable 'button press type' 224 derive from the values for 'button press values' 222 and the steps of the method 611 of Figure 11.
  • every menu position 242 is accessible by at least one combination of the variables 'button press values' 222 and 'button press types' 224,
  • Figure 14 also shows a variable 'button press type sequence' 382 and a variable 'total number of button presses' 384.
  • the button press type sequence 382 is 'swipe-in - long - short'.
  • the total number of button presses 384 for the word 'fad' is three if a swipe gesture is used to select 'f' (or four if a pair button press is used to select 'f').
  • the word 130 is 'leg'. Rows of a table show values for each of the variables 'character' 200, 'menu position' 242, 'button press values' 222 and 'button press type' 224. Individual values for each of the variables are derived just as explained for the example of Figure 14.
  • the button press type sequence 382 is 'swipe-out - short - short'. Based on the number of button press types 224 and the number of button press values 222 per button press type 224, the total number of button presses 384 for the word 'leg' is three if a swipe gesture is used to select 'l' (or four if a pair button press is used to select 'l').
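The reverse derivation of Figures 14 and 15 (from a character back to the candidate gestures that reach it) can be sketched as a search over the button press types. The menu ordering and gesture encoding below carry the same assumptions as earlier sketches and are hypothetical.

```python
# Enumerate every (BPT, button) combination that reaches a character's position.
from itertools import combinations

VALUES = [-3, -2, +2, +3]
MENU = dict(zip(range(-6, 7), "abcdefghijklm"))  # assumed ordering

def gestures_for(char):
    target = next(p for p, c in MENU.items() if c == char)
    found = []
    for v in VALUES:
        if v == target:
            found.append(("short", v))       # short press on v
        if 2 * v == target:
            found.append(("long", v))        # long press doubles v
    for a, b in combinations(VALUES, 2):
        if a + b == target:
            found.append(("pair", (a, b)))   # pair press sums two buttons
    for inner, outer in [(-2, -3), (+2, +3)]:
        if inner + outer == target:
            found.append(("swipe-out", inner))  # press inner, swipe outward
        if outer - inner == target:
            found.append(("swipe-in", outer))   # press outer, swipe inward
    return found
```

Run on 'f', the search reproduces the two alternatives named above for position -1: the pair press (-3, +2) and the swipe-in from -3.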
  • Figure 16 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1.
  • the device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100.
  • Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is the reference 258 and the offset scale 260.
  • the display 104, and specifically the menu 240 the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1.
  • the CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1.
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
  • the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-4, -3, -2, +2, +3, +4'.
  • the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
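The same enumeration applied to the six-button interface of Figure 16 confirms that all 17 menu positions are reachable. The inward-to-outward ordering of the buttons on each side of the reference is assumed; the assignment of the Russian characters to particular positions is not modeled here.

```python
# Coverage check for the six-button interface (values -4..-2 and +2..+4).
from itertools import combinations

VALUES = [-4, -3, -2, +2, +3, +4]

def reachable_positions():
    totals = set(VALUES)                                   # short
    totals |= {2 * v for v in VALUES}                      # long
    totals |= {a + b for a, b in combinations(VALUES, 2)}  # pair
    neg, pos = [-2, -3, -4], [+2, +3, +4]                  # inward -> outward
    for side in (neg, pos):
        for inner, outer in zip(side, side[1:]):
            totals.add(inner + outer)   # swipe-out: press inner, add outer
            totals.add(outer - inner)   # swipe-in: press outer, subtract inner
    return sorted(totals)
```

The result is the 17 consecutive values -8 through +8, one per menu position.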
  • Figure 17 shows an embodiment of the table 185 of value assignments for variables of the method 611 of Figure 11 for the embodiment of the user interface 150 of Figure 16.
  • FIG 18 shows graphical representations of another example of the swipe BPT 357 shown in Figure 3 and an example alternative swipe BPT 362.
  • each example button press type 224 of Figure 18 has a first and second horizontal bar 326, 331.
  • the first horizontal bar 326 represents the passage of time.
  • the second horizontal bar 331 represents the distance (or length) of a positional translation.
  • the black region 327 within the bar indicates a time period when a button is pressed.
  • the white region 328 indicates a time period when a button is not pressed.
  • the solid vertical marker 329 indicates the beginning or end of an elapsed time period (ETP) metric 330.
  • the elapsed time period 330 is of selectable length and commences with the onset of a button press.
  • the length of the black region 327 indicates the duration 208 of the button press.
  • the duration of the button press is compared with the elapsed time period metric 330.
  • the black region 327 within the bar indicates a distance (or length) 372 of a positional translation of a button press along the touch sensitive screen 104.
  • the positional translation is measured starting from the position of the button press at the onset of the press.
  • the white region 328 within the bar indicates a distance from the starting position that the positional translation of the button press has not reached.
  • the solid vertical marker 329 indicates the beginning or end of the swipe distance metric 373.
  • the swipe distance metric 373 is of selectable length and commences at the position of the button press on the screen 104 at the button press' onset.
  • the distance (or length) 372 of a positional translation of a button press is compared with the swipe distance metric 373.
  • a difference between the swipe BPT 357 and the alternative swipe BPT 362 is that for the swipe BPT 357 the button press duration 208 does not exceed the elapsed time period metric 330, whereas for the alternative swipe BPT 362 the duration does exceed the elapsed time period metric.
  • the fact that both the duration and the swipe distance exceed their respective metrics leads to a potential ambiguity in the determination of the BPT 224, when the gesture is interpreted according to the method 611 of Figure 11.
  • the one that occurs first depends on the value selected for the elapsed time period metric 330, the value selected for the swipe distance metric 373, and the rate the user executes the positional translation.
  • the elapsed time period metric is 0.1 seconds and the swipe distance metric is 20 pixels. If the rate of the positional translation is less than 20 pixels per 0.1 seconds, then the button press is interpreted as the long BPT 345. If the rate of the positional translation is greater than 20 pixels per 0.1 seconds, then the button press is interpreted as the swipe BPT 357.
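The race between the two metrics described above can be expressed as a rate comparison, using the example thresholds of 0.1 seconds and 20 pixels. The sketch assumes a constant positional-translation rate.

```python
# Which metric is crossed first decides the button press type.
ETP_SECONDS = 0.1       # elapsed time period metric 330 (example value)
SWIPE_METRIC_PX = 20    # swipe distance metric 373 (example value)

def classify(rate_px_per_s):
    """Interpret a held, translating press as 'long' or 'swipe'."""
    if rate_px_per_s <= 0:
        return "long"                       # never reaches the swipe metric
    time_to_swipe = SWIPE_METRIC_PX / rate_px_per_s
    return "swipe" if time_to_swipe < ETP_SECONDS else "long"
```

At a translation rate above 200 pixels per second (20 px / 0.1 s) the swipe metric is crossed first and the press is a swipe BPT; at a lower rate the elapsed time period expires first and the press is a long BPT.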
  • Figure 19 is a flowchart that shows an alternate method 613 to the method 611 of Figure 11 for the processor 108 of an electronic device to interpret button presses.
  • Figure 20 is a flowchart that shows a method 700 to change an interpreted button press type 224 from the long BPT 345 to the swipe BPT 357 for the case where the swipe distance 372 does not exceed the swipe distance metric 373 until after the elapsed time period 330 expires.
  • the method 700 of Figure 20 uses the alternate method 613 of Figure 19 to enable that correction.
  • the alternate method 613 of Figure 19 differs from the method 611 of Figure 11 in where the step 610 that initializes the BP value counter to zero occurs in the method.
  • the step 610 follows the step 614 that determines if a first selection button is pressed. Furthermore the step 610 falls ahead of the step 616 that adds to the BP value counter the button press value of the pressed selection button.
  • the step 624 that identifies the character of the menu according to the input gesture received leads directly to the step 612 that initializes the elapsed time counter to zero.
  • a step 699 is included in the alternate method 613 that is not included in the method 611 of Figure 11.
  • the step 699 is a conglomeration of the steps 610, 616, 618, 620, 622, 624, 626, 640, 642, 684, 686, 688 and 690.
  • the step 699 interprets as input the character of the menu whose position corresponds to the input gestures received.
  • Figure 19 shows a flowchart of an embodiment of a method 613 for the processor 108 of an electronic device to interpret button presses.
  • the CPU 108 initializes the elapsed time counter 140 to zero.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button press occurs, in another step 610 of the method 613, the CPU 108 initializes the integer value counter 142 to zero.
  • the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the first pressed selection button 110.
  • the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with a user chosen selectable-length time period.
  • the CPU 108 determines if the first button press is still pressed.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • the processor multiplies the value of the integer value counter 142 by two before commencing the subsequent step 624, where the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • step 626 the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the second pressed selection button. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • swipe gesture interpreter 144 determines the direction 374 of the swipe gesture.
  • if in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • if in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
  • the CPU 108 re-initializes the elapsed time counter 140 to zero and repeats the method.
  • the CPU 108 displays the identified character 200 on the screen 104.
  • the steps 610, 616, 618, 620, 622, 624, 626, 640, 642, 684, 686, 688 and 690 together make up the step 699, which interprets as input the character of the menu whose position corresponds to the input gestures received.
  • math operations other than addition, subtraction and multiplication-by-two are used in steps 626, 642, 688 and 690 to identify characters by their numerical position in a menu, array or table.
  • the particular button neighboring the pressed selection button is in an alternative direction from the embodiment disclosed or is identified by an alternative input than the direction of the swipe gesture.
  • FIG 20 is a flowchart that shows a method 700 that changes a button press type initially interpreted as the long BPT 345 to the swipe BPT 357 for the case where the swipe distance 372 exceeds the swipe distance metric 373 after the elapsed time period 330 expires.
  • the CPU 108 initializes the elapsed time counter 140 to zero.
  • the CPU 108 monitors the selection buttons 110 for a first pressed selection button.
  • a first pressed selection button is a button press with an onset after the step 612 in the currently executed cycle of steps 612, 614 and 699.
  • an ongoing button press is a button press with an onset before the step 612 of the current selection cycle.
  • An ongoing button press is a button press initiated in the previous character selection cycle and that is still underway.
  • the CPU 108 determines whether the swipe distance 372 of an ongoing button press (if any) exceeds the swipe distance metric 373. Note that in the step 710 the button press could only have been an ongoing button press, because the determination in the step 614 was that a first selection button press was not found.
  • the CPU finds no button press with swipe distance 372 that exceeds the swipe distance metric 373 (i.e., if there is an ongoing button press for which the swipe distance 372 does not exceed the distance metric 373, or there is no ongoing button press at all), then in the next step 614 the CPU rechecks whether a first selection button is pressed. Alternatively, if the CPU finds that a button press is underway (i.e., carried over from the previous character selection cycle) and the swipe distance 372 of the ongoing button press exceeds the distance metric, then in a next step 712 the CPU 108 divides the value of the BP value counter by 2.
  • the swipe gesture interpreter 144 determines the direction 374 of the swipe gesture.
  • if in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the BP value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in a subsequent step 714 the CPU 108 replaces the last interpreted character with the character 200 of the menu 240 whose position 242 equals the value of the BP value counter 142.
  • if in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the BP value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 714 the CPU 108 replaces the last interpreted character with the character 200 of the menu 240 whose position 242 equals the value of the BP value counter 142. In the next step 614 the CPU re-checks whether a first selection button is pressed. Once a new selection button press is received, then in the next step 699 the CPU 108 interprets as input the character of the menu whose position corresponds to the input gestures received, as described in the method 613 of Figure 19.
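The correction performed by steps 712, 688 and 690 can be sketched as follows: the counter doubled by the earlier long-press interpretation is halved, then the neighbor value is applied per the swipe direction. Variable names are illustrative, and the counter is assumed even (it was produced by the multiply-by-two of step 642).

```python
# Sketch of method 700's long-to-swipe correction.
def correct_long_to_swipe(counter, direction, neighbor_value):
    """Re-read a long BPT as a swipe once the swipe distance metric is exceeded."""
    counter //= 2                  # step 712: undo the long-press doubling
    if direction == "outward":
        counter += neighbor_value  # step 688: add the neighbor's value
    else:
        counter -= neighbor_value  # step 690: subtract the neighbor's value
    return counter                 # step 714 replaces the last character with this position
```

For example, a long press on -2 produced a counter of -4; when a late outward swipe toward -3 is detected, the counter is corrected to (-4 / 2) + (-3) = -5 and the displayed character is replaced accordingly.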
  • Figure 21 is a flowchart that shows another method 720 that interprets button press and swipe input gestures to select a character from a menu.
  • the CPU 108 initializes the elapsed time counter 140 to zero.
  • the CPU 108 monitors the selection buttons 110 for a first pressed selection button.
  • a 'first' pressed selection button is a button press with an onset after the step 612 in the currently executed cycle of steps 612, 614 and 699.
  • an 'ongoing' button press is a button press with an onset before the step 612 of the current selection cycle.
  • an 'ongoing' button press is a button press initiated in the previous character selection cycle that is still underway.
  • the CPU 108 determines whether the swipe distance 372 of an ongoing button press (if any) exceeds the swipe distance metric 373. Note that in the step 710 the button press could only have been an ongoing button press, because the determination in the step 614 was that a first selection button press did not occur.
  • the CPU finds no button press with swipe distance 372 that exceeds the swipe distance metric 373 (i.e., the CPU finds an ongoing button press but the swipe distance 372 does not exceed the distance metric 373, or finds no ongoing button press at all), then in the next step 614 the CPU rechecks whether a first selection button is pressed. Alternatively, if the CPU finds an ongoing button press (i.e., a button press still underway from the previous character selection cycle) and the swipe distance 372 of the ongoing button press exceeds the distance metric, then in a next step 730 the CPU 108 replaces the last interpreted character with the character 200 of a menu 240 whose position 242 corresponds to the swipe gesture.
  • the correspondence is based on any spatial relationship between the selection button activated, the swipe gesture interpreted and the position of the character in the menu, not necessarily to a correspondence with a button press value of the pressed selection button.
  • step 614 the CPU rechecks whether a first selection button is pressed. Once a first selection button press occurs, in another step 618, the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with a user chosen selectable-length time period.
  • the CPU 108 determines if the first button press is still pressed.
  • a step 735 the CPU 108 interprets as input the character 200 of a menu 240 whose position 242 corresponds to the input gesture received.
  • the correspondence is based on any spatial relationship between the selection button (or buttons) pressed, the gesture interpreted and the position of the character in the menu, not necessarily to a correspondence with a button press value of the pressed selection button.
  • the CPU 108 re-initializes the elapsed time counter 140 to zero which initiates a subsequent character selection cycle. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
  • Figure 22 shows a flowchart of an embodiment of a method 505 for a user to specify a character from among a plurality of characters.
  • a user views the characters 200 displayed in the menu 240.
  • the user selects a character from the menu 240 for input to the electronic device 100.
  • the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258.
  • the user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, referencing the position to a corresponding mark or position indicator of known offset, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
  • step 516 the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
  • step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires.
  • the aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press.
  • step 520 the user waits for the elapsed time counter 140 to expire before the character selection cycle progresses. In an alternative embodiment, the user bypasses step 520 because the character selection cycle ends once the user releases the selection button. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals twice the assigned button press value 222 of any selection button 110.
  • step 540 the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires.
  • the aforementioned step 540 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a LONG press.
  • step 522 the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • step 553 the user presses the selection button 110 with the assigned value 222 that is the smaller of the two consecutive selection button values and then swipes outward before the elapsed time counter expires.
  • the aforementioned step 553 inputs the sum of the assigned values 222 of the consecutive selection buttons to the CPU 108, on initiation of the swipe triggers the CPU 108 to start the elapsed time counter 140, upon recognition of the swipe gesture triggers the CPU to end the elapsed time counter, and indicates to the processor that the type of button press is a SWIPE press.
  • step 522 the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • step 557 the user presses the selection button 110 with the assigned value 222 that equals half the value that is one less than the selected character's position, maintains the button press until the elapsed timer expires, and then swipes outward.
  • the aforementioned step inputs to the CPU 108 a value equal to twice the assigned value 222 of the pressed selection button plus one, on initiation of the swipe triggers the CPU 108 to start the elapsed time counter 140, on recognition of the swipe gesture triggers the CPU to end the elapsed time counter, and indicates to the processor that the type of button press is a SWIPE press.
  • step 522 the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • step 555 identifies the same position identified in the step 551. So the steps 551 and 555 identify a character in the same position of the menu, but offer two alternative ways for a user to recognize how to select it. Whichever of steps 551 and 555 the user chooses to follow, the character that gets selected is the same.
  • the character specification method 505 described above is used iteratively to specify series of characters from the character menu 240.
  • words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
  • Figure 23 shows a flowchart of an embodiment of a method 740 for the processor 108 of an electronic device to interpret button presses and swipes.
  • the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero.
  • the CPU initializes a variable 'button press type' (BPT) to an empty string.
  • the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero.
  • the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture while comparing the elapsed time (ET) with the selected duration of the elapsed time period (ETP).
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires, then in a subsequent step 750 the CPU adds to the variable BPV a value equal to BPV + 1 (so that BPV becomes 2 × BPV + 1). In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 determines if the button is still pressed.
  • the processor multiplies the value of the variable BPV by two.
  • the CPU 108 monitors the selection buttons 110 to determine if the selection button remains pressed and for the occurrence of a swipe gesture. If the pressed selection button is released without a swipe gesture occurring, then in a subsequent step 754 the CPU updates the variable BPT to LONG and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded, then in a subsequent step 748 the CPU adds one to the variable BPV. Then in a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
  • the CPU executes the method 740 iteratively, which selects one character from the menu for each iteration.
  • the CPU 108 displays the identified character 200 on the screen 104.
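The interpretation logic of the method 740 can be sketched as a single function. This is an illustrative sketch only: the function name, parameter names, and the precomputed event flags are assumptions, since an actual implementation would work from live touch events rather than flags.

```python
def interpret_press(assigned_value, swiped_before_etp, swiped_after_etp,
                    held_past_etp):
    """Derive the button press value (BPV) and button press type (BPT)
    for one button activation, following the flowchart of Figure 23."""
    bpv = assigned_value                  # step 616: BPV = assigned value 222
    if swiped_before_etp:
        bpv += bpv + 1                    # step 750: BPV becomes 2*BPV + 1
        return bpv, "SWIPE"               # step 756
    if not held_past_etp:
        return bpv, "SHORT"               # released before the ETP expired
    bpv *= 2                              # ETP expired while still pressed
    if swiped_after_etp:
        return bpv + 1, "SWIPE"           # step 748: add one after doubling
    return bpv, "LONG"                    # step 754
```

For a button assigned the value 2, a quick tap yields (2, 'SHORT'), a hold yields (4, 'LONG'), and a swipe yields (5, 'SWIPE') whether the swipe completes before or after the ETP expires.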
  • Figure 24 shows a portion of the user interface 150 of Figure 2 and a table.
  • the portion of the user interface includes selection buttons 110 that have assigned button press values 222 equal to 0, 2 and 3.
  • the table shows possible values for six variables of the method 740 of Figure 23.
  • Three of the variables are input variables 810, which are selectable by a user.
  • Three of the variables are output variables 815, which are determined by the device 100 according to the logic of Figure 23.
  • the input variables 810 selectable by a user are: the variable 'assigned value of pressed button' 222, a variable 'swipe threshold exceeded?' 804, and a variable 'button lifted before or after time expires?' 788.
  • the output variables 815 determined by the device are: the variable 'button press type (BPT)' 224, the calculation 790, and the 'calculated button press value (BPV)' 228.
  • Each row of the table discloses a unique combination of the three input variables 810. Assuming for a moment that the variable 'assigned value of pressed button' 222 is constant, then the remaining two input variables 'swipe threshold exceeded?' 804 and 'button lifted?' 788 have four possible unique combinations: no/before, no/after, yes/before, and yes/after.
  • Each combination specifies a unique calculation 790.
  • the specified calculation 790, together with the value of the pressed button 222, determines a value for the variable 'calculated BPV' 228.
  • the fact that swipe gestures are time-independent doesn't change the fact that button presses are not.
  • the duration of the button press still determines whether the calculated BPV 228 equals the value of the pressed button 222 (SHORT BPT) or twice the value of the pressed button (LONG BPT).
  • buttons 110, assigned button values 222 and values for the input and output variables 810, 815 are examples used to demonstrate the embodiments of Figures 2, 22 and 23.
  • the scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
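With the example button values 0, 2 and 3 of Figure 24, the three calculations x (SHORT), 2x (LONG) and 2x + 1 (SWIPE) between them reach every position of a seven-position half-menu. A small sketch (illustrative only, not part of the specification) makes the coverage explicit:

```python
def reachable_positions(button_values):
    """Menu positions reachable from the given assigned button values,
    using the three calculations x, 2*x and 2*x + 1 of Figure 24."""
    reached = set()
    for x in button_values:
        reached.update({x, 2 * x, 2 * x + 1})
    return reached

# Buttons assigned 0, 2 and 3 cover positions 0-6 of one half of the
# 13-position menu (7 is also producible but has no assigned character):
print(sorted(reachable_positions([0, 2, 3])))  # [0, 1, 2, 3, 4, 5, 6, 7]
```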
  • Figure 25 shows a portion of an alternative embodiment of the user interface 150 of Figure 2 and a corresponding table of variables 810, 815.
  • the portion of the alternative embodiment includes the selection buttons 110 and assigned button press values 222 of the embodiment of Figure 24, but also includes an additional button with an assigned value equal to 4.
  • swipe gestures are time-independent.
  • Figure 26 shows another flowchart of an embodiment of a method 780 for the processor 108 of an electronic device to interpret button presses.
  • the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero.
  • the CPU initializes a variable 'button press type' (BPT) to an empty string.
  • the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero.
  • the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
  • the CPU initializes a variable 'cycle interrupted' to FALSE.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the first pressed selection button 110.
  • the CPU determines if the variable BPV equals zero. If the BPV is not equal to zero, in a step 722 the CPU sets a local ETP variable equal to the variable 'duration of ETP' initialized in the step 746. Alternatively, if the BPV equals zero, then in an alternative step 724 the CPU sets the local ETP variable equal to zero.
  • step 618 the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture or a second button press, while also comparing the elapsed time (ET) with the local ETP variable to see if the ETP has expired.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 750, the CPU adds to the variable BPV a value equal to BPV + 1. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU interprets a second button press before a swipe occurs or the elapsed time period expires (or the first button press is lifted), then in a subsequent step 752 the CPU updates the variable BPT to SHORT, in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE, and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 determines if the first button press is still pressed.
  • the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU multiplies the value of the variable BPV by two and then in another subsequent step 754 the CPU updates the variable BPT to LONG.
  • the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture or a second button press, while also determining if the first button press is still pressed.
  • the CPU determines the first button press is no longer pressed before either a swipe or second button press occurs, then in a subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU determines that a second button occurs before a swipe occurs or the first button is released, then in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 748 the CPU adds one to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • in a step 612 subsequent to the step 758 that outputs values for the variables BPV and BPT, the CPU resets the variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. Then, in a subsequent step 778, the CPU determines the value stored in the variable 'cycle interrupted'.
  • the CPU 108 monitors the selection buttons 110 for a next pressed selection button.
  • the CPU determines the variable 'cycle interrupted' is TRUE
  • the CPU sets the variable BPV stored by the button press value counter 142 to the button press value 222 of the second pressed selection button from the previous character selection cycle.
  • the CPU updates the variable 'cycle interrupted' to FALSE.
  • the CPU determines if the variable BPV stored by the button press value counter 142 equals zero.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
  • the CPU executes the method 740 iteratively, which selects one character from the menu per iteration.
  • the CPU 108 displays the identified character 200 on the screen 104.
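Two refinements distinguish the method 780 from the method 740: a button assigned the value zero gets a zero-length ETP (steps 720-724), and a second button press both ends the current cycle and seeds the next one (steps 776-778). A hedged sketch of just those two refinements, with illustrative function and parameter names:

```python
def local_etp(assigned_value, default_etp_ms):
    """Steps 720-724: a zero-valued button uses a zero-length elapsed
    time period, since doubling zero changes nothing and the press can
    therefore resolve without waiting."""
    return 0 if assigned_value == 0 else default_etp_ms


def next_cycle_bpv(cycle_interrupted, interrupting_value, fresh_value):
    """Steps 776-778: after an interrupted cycle, the second
    (interrupting) press seeds the next cycle's BPV instead of a fresh
    button press."""
    return interrupting_value if cycle_interrupted else fresh_value
```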
  • Figure 27 shows a table 800 that lists the ways each menu position 242 can be identified using the logic of Figure 23 for the user interface 150 of Figure 2.
  • the table 800 shows only six positions plus a '0' position; due to symmetry, these six positions can be applied to either half of the 13-position menu of the user interface 150 of Figure 2.
  • the table of Figure 27 applies to the embodiment of the user interface of Figure 2, but the table could be extended to apply to a menu 240 of any length.
  • the table 800 includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'assigned value of pressed button' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808.
  • the table 800 shows that for positions accessible by a swipe gesture (Positions 1 and 5 for the embodiment of Figure 2), there are always at least two ways for a user to reach that position. Furthermore, for those positions, the variable 'ETP expired' is FALSE for at least one way and TRUE for at least one of the others. That fact guarantees that even if a user fails to complete a swipe gesture when they expect to (i.e., expected to complete the swipe before the ETP expired but completed it after, or vice-versa), the same character gets selected anyway. That fact makes swipe gestures time-independent.
  • Each row of the table has one grey box 809 that marks one or the other of the variables 'swipe threshold exceeded' 804 and 'button released' 806.
  • the grey box 809 indicates the action that signifies the end of the character selection cycle.
  • the character selection cycle terminates with a button release. In other words, if a button is released and no swipe gesture has occurred, then the selection cycle ends. (In an alternative embodiment, the selection cycle extends until the ETP expires for short presses, but that isn't necessary.)
  • the selection cycle may or may not immediately end.
  • swipes that occur before the ETP expires cause the selection cycle to immediately end, but swipes that occur after the ETP expires do not cause the selection cycle to end.
  • the button release ends the selection cycle. This enables the user to "undo" a swipe gesture, if they want, by swiping back to the position where the swipe gesture originated.
  • Figure 28 is a table 800 that lists the ways each position 242 of the menu 240 can be reached using the selection buttons 110 and the logic of Figure 26 for the user interface 150 of Figure 2.
  • the table 800 of the embodiment of Figure 28 also includes assigned characters 200 for each position of the menu 240.
  • the table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200.
  • the table of Figure 28 is just one possible embodiment of the user interface of Figure 2 and the methods of Figures 22, 23 and 26, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
  • Figures 29 and 30 show examples of how a word 130 is composed according to the method 780 of Figure 26 and the embodiment of the user interface 150 of Figure 2.
  • the composed word 130 is 'flag'.
  • Each row of Figure 29 shows one or more ways in which a particular letter 200 of the word 130 could be composed from a 'button press value' 222 and a 'button press type' 224.
  • Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 according to the method 780 of Figure 26.
  • the variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires and the same character becomes selected in either case.
  • variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 780 of Figure 26.
  • the variable 'calculated BPV' 228 is the result of applying the calculation 790 to the assigned BPV 222 selected by the user.
  • the device identifies the user's intended character 200 based on the 'calculated BPV and the assignment of the characters in the menu 240.
  • the composed word 130 is 'bike'.
  • Each row of Figure 30 shows one or more ways in which a character 200 could be composed from a 'button press value' 222 and a 'button press type' 224.
  • Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 according to the method 780 of Figure 26.
  • the variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires and the same character becomes selected in either case.
  • variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 780 of Figure 26.
  • the variable 'calculated BPV' 228 is the result of applying the calculation 790 to the assigned BPV 222 selected by the user.
  • the device identifies the user's intended character 200 based on the 'calculated BPV and the assignment of the characters in the menu 240.
  • Figure 31 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1.
  • the device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100.
  • Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is the reference 258 and the offset scale 260.
  • the display 104, and specifically the menu 240, the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1.
  • the CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1.
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
  • the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-4, -3, -2, +2, +3, +4'.
  • the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
  • Figure 32 shows a table 800 that lists the ways each position 242 of the menu 240 can be reached using the logic of the method 780 of Figure 26 for the embodiment of the user interface 150 of Figure 31.
  • the table 800 includes assigned characters 200 for each position of the menu 240.
  • the table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200.
  • the table of Figure 32 is just one possible embodiment of the user interface of Figure 31 and the methods of Figures 22, 23 and 26, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
  • Figure 33 shows a plot 820 of graphical representations of examples of possible button input gestures as a function of time and positional displacement.
  • the plot 820 includes a positional displacement axis 822 with units of pixels.
  • the plot also includes a time axis 824 with units of milliseconds.
  • An origin 826 of the axes marks the point in time and position where the onset of a button input gesture occurs.
  • the elapsed time counter 140 equals 0
  • the positional displacement measured by the swipe gesture interpreter 144 equals zero
  • the calculated button press value 228 measured by the button press value counter 142 equals zero.
  • the onset of a button input gesture is the onset of a button press.
  • the elapsed time axis 824 is divided into two segments by an elapsed time threshold 830, which in this example equals 100 msec.
  • the elapsed time threshold identifies the point in time when the elapsed time period expires.
  • the positional displacement axis 822 is divided into two segments by the swipe threshold 832, which in this example equals 25 pixels.
  • the swipe threshold identifies the point at which the positional displacement of a button press becomes classified as a SWIPE button press type.
  • the plot 820 is divided into four quadrants 840 according to four possible combinations of the variables positional displacement 822 and time 824.
  • the combinations are (1) <25 / <100, (2) <25 / >100, (3) >25 / <100, and (4) >25 / >100.
  • Each quadrant has an associated math operation 842.
  • the math operation is the calculation that the processor 108 performs on the current value of the button value counter 142, where x is the current value of the counter.
  • the value of x equals 0.
  • the plot 820 shows various paths 840 that a user input gesture could trace through the quadrants 840.
  • the paths correspond to (1) the positional
  • Each path has a termination 842, which identifies when a button press is released and the character selection cycle ends.
  • the particular path 840 that a user input gesture follows through the plot determines the one or more math operations 842 that the processer 108 takes on the value of the button press value counter 142.
  • Each possible path of Figure 33 corresponds with each of the four possible outcomes for a given pressed button value 222 of Figure 24.
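One way to read the plot: a gesture's final BPV depends only on whether the swipe threshold 832 was ever crossed and, if it was not, whether the release came before or after the elapsed time threshold 830. The operations below (x, 2x, 2x + 1) are inferred from the logic of Figures 23 and 24 rather than read off the figure, so treat this sketch as an assumption:

```python
TIME_THRESHOLD_MS = 100   # elapsed time threshold 830 in the example
SWIPE_THRESHOLD_PX = 25   # swipe threshold 832 in the example

def final_bpv(x, max_displacement_px, release_ms):
    """Calculated BPV for a gesture whose positional displacement peaked
    at max_displacement_px and whose button was released at release_ms."""
    if max_displacement_px > SWIPE_THRESHOLD_PX:
        return 2 * x + 1      # swipe: same result on either side of 830
    if release_ms > TIME_THRESHOLD_MS:
        return 2 * x          # held past the time threshold: LONG
    return x                  # quick tap: SHORT
```

Crossing the swipe threshold at 80 ms or at 150 ms both yield 2x + 1, which is the time-independence property of swipe gestures described above.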
  • Figure 34 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1.
  • the device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 of a character menu 240, a plurality of selection buttons 110 and a spacebar button 264, which together make up a user interface 150 of the device 100.
  • Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is a reference 258 and an offset scale 260.
  • the display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1.
  • the CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1.
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the
  • the positions 242 of the menu 240 are arranged in a one-dimensional array similar to the embodiment in Figure 8 of Patent No. 8,487,877, except that the menu 240 and corresponding selection buttons 110 are shown on the display 104 instead of as physical features of the user interface 150.
  • the buttons 110 are communicatively coupled with the CPU 108.
  • the menu 240 and the offset scale 260 are positioned in respective one-dimensional arrays in the user interface region 150 of the device 100.
  • the character menu 240 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another.
  • the character menu 240 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100.
  • positions 242 of the menu 240 are distributed in a one-dimensional array in evenly spaced increments.
  • values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that match the increment of the menu 240, so that by referencing the offset scale 260 to the menu 240, characters 200 in the menu are effectively numbered.
  • the reference 258 is an indicator located near or on one of the positions 242 of the menu 240.
  • the offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240. Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero. In an alternative embodiment, the increment of the offset scale 260 is 10 and positions 242 of the menu 240 are marked off in corresponding units of 10.
  • the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240, and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of the menu 240 according to how many positions a given position 242 of the menu 240 is offset from the reference 258.
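The labeling that the offset scale 260 gives the menu 240 can be sketched as follows (a minimal sketch, assuming a unit increment and a centered reference; the function name is illustrative):

```python
def offset_labels(num_positions, reference_index):
    """Signed offset of each menu position 242 from the reference 258,
    as the offset scale 260 numbers them with an increment of one."""
    return [i - reference_index for i in range(num_positions)]

# A 13-position menu with the reference at the center position:
print(offset_labels(13, 6))  # [-6, -5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5, 6]
```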
  • the plurality of selection buttons 110 lie on the display 104 of the user interface 150 of the device 100.
  • the buttons 110 are arranged in a row that corresponds to the physical alignment of the menu 240 on the user interface.
  • Each button is communicatively coupled with the CPU 108 and is assigned a button press value 222.
  • the assigned button press value 222 can be either positive or negative.
  • Each button 110 has the function that when the button is pressed the value 222 assigned to the button is input to the CPU 108.
  • each button 110 has the function that when the button is pressed for a minimum time duration the value 222 assigned to the button is doubled and then input to the CPU 108.
  • each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is doubled, then increased by one, and then input to the CPU 108.
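The three button behaviors just described reduce to one calculation per press type, and they apply unchanged to negative assigned values. A hedged sketch (the function name and press-type labels are assumptions, not the specification's own identifiers):

```python
def value_to_cpu(assigned_value, press_type):
    """Value a selection button inputs to the CPU 108: the assigned
    value for a plain press, double for a long press, and double plus
    one for a press followed by a swipe gesture."""
    if press_type == "SHORT":
        return assigned_value
    if press_type == "LONG":
        return 2 * assigned_value
    if press_type == "SWIPE":
        return 2 * assigned_value + 1
    raise ValueError(f"unknown press type: {press_type}")
```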
  • the assigned button press value 222 of each selection button is unique.
  • there are four selection buttons and the buttons' assigned values are -3, -2, +2, and +3.
  • there are four selection buttons and the buttons' assigned values are -3, -1, +1, and +3.
  • there are five selection buttons and the buttons' assigned values are -5, -2, 0, +2, and +5.
  • the spacebar 264 also lies in the user interface region 150 of the device 100, can be either a hard or soft key, and is communicatively coupled with the CPU 108.
  • the menu 240 has 13 menu positions 242 and the plurality of selection buttons includes five buttons with the assigned button press values 222: '-5, -2, 0, +2, +5'.
  • the menu positions 242 are populated by 13 of the 26 characters 200 of the English alphabet.
  • buttons 110 of the electronic device 100 of Figure 34 are receptive to two input gestures: button presses and swipe gestures.
  • a 'button press' is an activation of a button that extends for some duration of time greater than zero.
  • a 'swipe gesture' is a positional displacement of a button press along the screen 104 that occurs during a button press. For example, this may be the distance along the screen 104 a user moves their finger while it is held on the screen 104 during the button press.
  • a swipe that ends on a different key than it started is still part of the “button press”. Where a swipe gesture ends is irrelevant, except relative to its “swipe distance threshold”.
  • “Swipe distance threshold” is measured with respect to the position of the button press at its onset and nothing else.
  • a swipe gesture includes the possibility of a zero-length displacement. Based on these two definitions, any activation of one of the selection buttons 110 includes both a 'button press' and a 'swipe gesture'.
  • the duration of a button press is measured from the onset of the button press until its release.
  • the duration is typically measured in milliseconds.
  • the positional displacement (also called length or distance) of a swipe gesture is measured along the plane of the screen 104 from the point of the button press at its onset to the point of the button press at its release.
  • the swipe distance is typically measured in pixels, but can also be measured in other length units such as mm or fractional inches.
  • any button activation includes both a button press and swipe gesture (even if the swipe distance equals 0). As such, the response of each input gesture can be acquired simultaneously for any button activation.
  • Figure 35 shows a plot 820 that represents possible examples of responses for duration and swipe distance for the respective input gestures 'button press' and 'swipe gesture'.
  • a single curve 840 represents a possible combination of measurements for duration and swipe distance over the course of a character selection cycle (also referred to as button activation).
  • button press duration is plotted on the x-axis 824 and swipe distance on the y-axis 822. Duration is measured in units of milliseconds and swipe distance is measured in units of pixels.
  • Onset of a button press occurs at the plot's origin 826 and marks the point in time and distance where the onset of an input gesture occurs.
  • the release of a button is represented by a terminus 842 at the end of each curve.
  • the path that a curve 840 follows through the plot reflects the duration and swipe distance of a received button activation.
  • the response of each input gesture is converted to a binary output using a threshold value.
  • a threshold value is an arbitrary value that enables the analog output of each measured response to be recast as a binary output, i.e., a high or low value.
  • the duration axis 824 is divided into two segments by an elapsed time threshold 830, which in this example equals 200 msec.
  • the elapsed time threshold is the end of a selectable elapsed time period (ETP).
  • the swipe distance axis 822 is divided into two segments by a swipe distance threshold 832, which in this example equals 25 pixels.
  • the swipe distance threshold identifies the minimum required positional displacement for a swipe gesture to be classified as a SWIPE BPT (rather than a LONG or SHORT BPT).
  • each region represents a unique combination of the binary output values from the input gestures.
  • each region represents one possible combination of high and low values: low/low, low/high, high/low and high/high.
  • the measured responses would be segregated among the four regions as follows: <25 / <200, <25 / >200, >25 / <200 and >25 / >200.
  • Each region 838 of the plot is identified by a button press type (BPT) value.
  • the BPT is merely a label for the combination of binary values that identify a given region.
  • the current BPT value for a character selection cycle reflects the current measured responses for duration and swipe distance. Because the path that a curve 840 takes through the plot may intersect more than one region 838 of the plot during the course of a character selection cycle, the BPT may evolve during the selection cycle.
  • the final BPT value of a character selection cycle is determined when the button press is lifted, which is identified by the terminus 842 of the curve.
  • the possible BPTs are SHORT, LONG and SWIPE.
  • Each region 838 of the plot also has an associated math operation 842.
  • the math operation is a calculation that the processor 108 executes on the current value of the BPV 228 variable stored in the button value counter 142.
  • the particular path that a curve follows determines which, and how many, of the one or more math operations 842 the processor 108 applies to the BPV.
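The Figure-35 classification described above can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the function name is invented, and the thresholds (200 msec, 25 pixels) are the example values from the text.

```python
# Hypothetical sketch of the Figure-35 region logic: the two measured
# responses of a button activation are compared against thresholds, the
# resulting region is labeled with a button press type (BPT), and each BPT
# carries a math operation applied to the pressed button's assigned value.

ELAPSED_TIME_THRESHOLD_MS = 200    # example elapsed time threshold 830
SWIPE_DISTANCE_THRESHOLD_PX = 25   # example swipe distance threshold 832

def classify_bpt(duration_ms, swipe_distance_px):
    """Map measured (duration, swipe distance) responses to a BPT label."""
    if swipe_distance_px > SWIPE_DISTANCE_THRESHOLD_PX:
        return "SWIPE"   # swipe distance high: SWIPE regardless of duration
    if duration_ms > ELAPSED_TIME_THRESHOLD_MS:
        return "LONG"    # held past the elapsed time period without swiping
    return "SHORT"       # released early with no qualifying swipe

# Math operation per region: the value added to the pressed button's
# assigned value to produce the final button press value (BPV).
MATH_OPERATION = {"SHORT": 0, "LONG": 1, "SWIPE": 2}
```

For example, a 150 msec press that travels 40 pixels classifies as a SWIPE, so the final BPV would be the button's assigned value plus two.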
  • Figure 36 shows a flowchart of an embodiment of a method 560 for a user to specify a character from among a plurality of characters.
  • a user views the characters 200 displayed in the menu 240.
  • the user selects a character from the menu 240 for input to the electronic device 100.
  • the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258.
  • the user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, referencing the position to a corresponding mark, color or position indicator of known offset or correspondence with a selection button, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
  • step 516 the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
  • step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires.
  • the aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the button value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press.
  • step 522 the user views the specified character on the display 104.
  • step 522 is bypassed.
  • the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals one more than the assigned button press value 222 of any selection button 110.
  • step 564 the user presses the selection button 110 with the assigned value 222 that equals one less than the selected character's position and maintains the button press until the elapsed time counter expires.
  • the aforementioned step 564 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press. Then, once the elapsed time counter expires the CPU adds one to the button press value counter and indicates to the processor that the type of button press is a LONG press.
  • step 522 the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • step 568 the user presses the selection button 110 with the assigned value 222 that equals two less than the selected character's position and maintains the button press until the elapsed time counter expires.
  • the aforementioned step 568 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press. Then, once the elapsed time counter expires the CPU adds one to the button press value counter and indicates to the processor that the type of button press is a LONG press.
  • step 569 the user completes, in the same button activation as step 568, a swipe gesture that exceeds the swipe distance threshold.
  • the aforementioned step 569 triggers the CPU to add one to the button press value counter 142 and indicates to the processor that the type of button press is a SWIPE gesture.
  • step 522 the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • a step 570 that is an alternate to steps 568 and 569, the user presses the selection button 110 with the assigned value 222 that equals two less than the selected character's position and completes a swipe gesture that exceeds the swipe distance threshold before the elapsed time counter expires.
  • the aforementioned step 570 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press.
  • once the swipe gesture interpreter 144 recognizes the BPT as a SWIPE BPT, the CPU adds two to the button press value counter and indicates to the processor that the type of button press is a SWIPE BPT.
  • steps 568 and 569 together identify the same menu position as the step 570, and therefore identify a character in the same position of the menu. The method 560 thus offers two alternative ways for a user to select a character two positions greater than the assigned value of a selection button; whichever of steps 568/569 or 570 the user chooses to follow, the same character gets selected.
  • the character specification method 560 described above is used iteratively to specify series of characters from the character menu 240.
  • words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
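The user-facing decision steps of method 560 amount to matching the selected character's position against each button's assigned value, that value plus one, and that value plus two. A hedged Python sketch of that matching, with invented names; the button values passed in are illustrative:

```python
# Hypothetical sketch of the decision logic of method 560: for a target
# menu position, find every (button value, gesture) pair that reaches it.

def plan_selection(target_position, button_values):
    """Return (button_value, gesture) pairs that reach target_position."""
    plans = []
    for v in button_values:
        if target_position == v:
            plans.append((v, "SHORT"))   # press and release before the ETP expires
        elif target_position == v + 1:
            plans.append((v, "LONG"))    # hold the press until the ETP expires
        elif target_position == v + 2:
            plans.append((v, "SWIPE"))   # swipe past the swipe distance threshold
    return plans
```

With example button values 0, 2 and 5, position 2 can be reached two ways (a SWIPE from the 0 button or a SHORT press of the 2 button), mirroring the redundancy the method describes.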
  • Figure 37 shows a flowchart of an embodiment of a method 741 for the processor 108 of an electronic device to interpret button presses and swipes.
  • the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero.
  • the CPU initializes a variable 'button press type' (BPT) to an empty string.
  • the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero.
  • the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a selection button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140. In a pair of steps 622, 684 the swipe gesture interpreter 144 monitors the selection button pressed in the step 614 for the occurrence of a swipe gesture. At the same time, the elapsed time counter 140 compares the elapsed time (ET) with the selected duration of the elapsed time period (ETP). The step 622 corresponds with the comparison of the curve 842 with the threshold value 830 of Figure 35. The step 684 corresponds with the comparison of the curve 842 with the threshold value 832 of Figure 35.
  • ETP: the elapsed time period.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires, in a subsequent step 760, the CPU adds two to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 determines if the button is still pressed.
  • the CPU 108 monitors the selection buttons 110 to determine if the pressed selection button remains pressed and for the occurrence of a SWIPE BPT.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded, then in a subsequent step 748 the CPU adds one to the variable BPV. Then in a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758. According to a further embodiment of the invention, the CPU executes the method 741 iteratively, which selects one character from the menu per iteration. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
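The branch structure of method 741 can be condensed into a function over the observable outcome of one button activation. This is a hypothetical condensation with invented names; the patent describes an event-driven loop using the elapsed time counter 140 and the swipe gesture interpreter 144:

```python
# Hypothetical condensation of method 741 (Figure 37): given what happened
# during one button activation, compute the output (BPV, BPT) pair.

def interpret_activation(assigned_value, swipe_before_etp,
                         released_before_etp, swipe_after_etp=False):
    """Return (BPV, BPT) for a completed button activation."""
    bpv = assigned_value
    if swipe_before_etp:
        return bpv + 2, "SWIPE"      # step 760: add two, output SWIPE
    if released_before_etp:
        return bpv, "SHORT"          # button lifted before the ETP expired
    bpv += 1                         # ETP expired while pressed: LONG so far
    if swipe_after_etp:
        return bpv + 1, "SWIPE"      # step 748: add one more, output SWIPE
    return bpv, "LONG"
```

Note that a swipe completed before the ETP expires and a swipe completed after it both yield the assigned value plus two, which is the time independence of the SWIPE BPT.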
  • Figure 38 shows a portion of the user interface 150 of Figure 34 and a table.
  • the portion of the user interface includes selection buttons 110 that have assigned button press values 222 equal 0, 2 and 5.
  • the table shows possible values for six variables of the method 741 of Figure 37.
  • Three of the variables are input variables 810, which are selectable by a user.
  • Three of the variables are output variables 815, which are determined by the device 100 according to the logic of Figure 37.
  • the input variables 810 selectable by a user are: the variable 'assigned value of pressed button' 222, a variable 'swipe threshold exceeded?' 804, a variable 'button lifted before or after time expires?' 788.
  • the output variables 815 determined by the device are: the variable 'button press type (BPT)' 224, the calculation 790, and the 'calculated button press value (BPV)' 228.
  • Each row of the table discloses a unique combination of the three input variables 810. Assuming for a moment that the variable 'assigned value of pressed button' 222 is constant, then the remaining two input variables 'swipe threshold exceeded?' 804 and 'button lifted?' 788 have four possible unique combinations: no/before, no/after, yes/before and yes/after.
  • Each combination specifies a unique calculation 790.
  • the specified calculation 790, together with the value of the pressed button 222, determines a value for the variable 'calculated BPV' 228.
  • button activations that are SWIPE BPT are time-independent.
  • button activations that are SWIPE BPT are time independent, button activations that are not SWIPE BPT (i.e., SHORT and LONG BPTs) are not.
  • the duration of the button press still determines whether the calculated BPV 228 equals the value of the pressed button 222 (SHORT BPT) or one more than the value of the pressed button (LONG BPT).
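The time independence of the SWIPE BPT is visible if the Figure-38 table rows are enumerated: both swipe rows produce the same calculated BPV whether the button is lifted before or after the time expires. A hedged sketch with invented names:

```python
# Hypothetical enumeration of the Figure-38 table for one pressed button:
# each row is (swipe threshold exceeded?, lifted before time expires?,
# BPT, calculated BPV).

def table_rows(assigned_value):
    rows = []
    for swipe in (False, True):
        for lifted_before in (False, True):
            if swipe:
                bpt, calc = "SWIPE", 2   # time-independent: +2 either way
            elif lifted_before:
                bpt, calc = "SHORT", 0   # released inside the time period
            else:
                bpt, calc = "LONG", 1    # held past the time period
            rows.append((swipe, lifted_before, bpt, assigned_value + calc))
    return rows
```

For a button assigned the value 2, the two SWIPE rows both calculate a BPV of 4, while the SHORT and LONG rows calculate 2 and 3 respectively.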
  • buttons 110, assigned button values 222 and values for the input and output variables 810, 815 are examples used to demonstrate the embodiments of Figures 34, 35, 36 and 37.
  • the scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
  • Figure 39 shows a portion of an alternative embodiment of the user interface 150 of Figure 34 and a corresponding table of variables 810, 815.
  • the portion of the alternative embodiment includes the selection buttons 110 and assigned button press values 222 of the embodiment of Figure 38, but also includes an additional button with an assigned value equal 8.
  • Figure 40 shows another flowchart of an embodiment of a method 781 for the processor 108 of an electronic device to interpret button presses.
  • the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero.
  • the CPU initializes a variable 'button press type' (BPT) to an empty string.
  • the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero.
  • the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
  • the CPU initializes a variable 'cycle interrupted' to FALSE.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button is pressed, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the first pressed selection button 110.
  • the CPU determines if the variable BPV equals zero. If the BPV is not equal zero, in a step 722 the CPU sets a local ETP variable equal to the variable 'duration of ETP' initialized in the step 746. Alternatively, if the BPV equals zero, then in an alternative step 724 the CPU sets the local ETP variable equal zero.
  • the CPU 108 starts the elapsed time counter 140.
  • the CPU 108 monitors the selection buttons 110 for a swipe gesture that exceeds the swipe distance threshold or for a second button press, while also comparing the elapsed time (ET) with the local ETP variable to see if the ETP has expired.
  • the step 622 corresponds with the comparison of the curve 842 with the threshold value 830 of Figure 35.
  • the step 684 corresponds with the comparison of the curve 842 with the threshold value 832 of Figure 35.
  • the swipe gesture interpreter 144 recognizes that the swipe distance threshold is exceeded before (a) the elapsed time period expires or (b) a second button press occurs, then in a subsequent step 760, the CPU adds two to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU interprets a second button press before (a) the swipe distance threshold is exceeded, (b) the elapsed time period expires, or (c) the first button press is lifted, then in a subsequent step 752 the CPU updates the variable BPT to SHORT, in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE, and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 determines if the first button press is still pressed.
  • the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU 108 monitors the selection buttons 110 for (a) a swipe gesture that exceeds the swipe distance threshold or (b) a second button press, while also determining if the first button press is still pressed.
  • the CPU determines the first button press is no longer pressed before either (a) a swipe gesture that exceeds the swipe distance threshold or (b) a second button press, then in a subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the CPU determines that a second button press occurs before the swipe distance threshold is exceeded or the first button is released, then in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 748 the CPU adds one to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
  • in a step 612 subsequent to the step 758 that outputs values for the variables BPV and BPT, the CPU resets the variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. Then, in a subsequent step 778, the CPU determines the value stored in the variable 'cycle interrupted'.
  • ET: the variable 'elapsed time'.
  • the CPU 108 monitors the selection buttons 110 for a next pressed selection button.
  • the CPU determines the variable 'cycle interrupted' is TRUE
  • the CPU sets the variable BPV stored by the button press value counter 142 to the button press value 222 of the second pressed selection button in the previous character selection cycle.
  • the CPU updates the variable 'cycle interrupted' to FALSE.
  • the CPU determines if the variable BPV stored by the button press value counter 142 equals zero.
  • the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
  • the CPU executes the method 741 iteratively, which selects one character from the menu per iteration.
  • the CPU 108 displays the identified character 200 on the screen 104.
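Two refinements distinguish method 781 from method 741: a zero-valued button press collapses the elapsed time period to zero (step 724), and a second button press interrupts the current cycle (step 776), with the second button's value seeding the next cycle. A hypothetical sketch of both, with invented names; the 200 msec default is the example value from the text:

```python
# Hypothetical sketch of two refinements in method 781 (Figure 40).

def local_etp(assigned_value, default_etp_ms=200):
    """Steps 720/722/724: a zero-valued button gets a zero-length ETP."""
    return 0 if assigned_value == 0 else default_etp_ms

class SelectionCycle:
    """Minimal model of one character selection cycle with interruption."""

    def __init__(self, assigned_value):
        self.bpv = assigned_value
        self.interrupted_by = None   # assigned value of an interrupting press

    def second_press(self, other_assigned_value):
        # Step 776: a second press marks the cycle interrupted; the current
        # cycle outputs immediately as a SHORT press, and the interrupting
        # button's value seeds the next cycle.
        self.interrupted_by = other_assigned_value
        return self.bpv, "SHORT"
```

The zero-length ETP means a press of the zero-valued button behaves as if the time period had already expired, while interruption lets a fast typist begin the next character before finishing the current one.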
  • Figure 41 shows a table 801 that lists the ways each menu position 242 can be identified using the logic of Figure 37 for the user interface 150 of Figure 34. Although the table 801 shows only six positions plus a '0' position, due to symmetry about the value 0, these six positions can be applied to either half of the 13-position menu of the user interface 150 of Figure 34. Although the table of Figure 41 applies to the embodiment of the user interface of Figure 34, the table could be extended to apply to any length menu 240.
  • the table 801 includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'assigned value of pressed button' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, and variable 'ETP expired' 808.
  • the table 801 shows that for positions accessible using a SWIPE BPT (Positions 1 and 4 for the embodiment of Figure 34), there are always at least two ways for a user to reach that position. Furthermore, for those positions, the variable 'ETP expired' is FALSE for at least one way and TRUE for at least one of the others. That fact guarantees that even if a user fails to exceed the swipe distance when they expect to (i.e., expected to exceed the swipe threshold before the ETP expired but completed it after, or vice-versa), the same character gets selected anyway. That fact makes the SWIPE BPT time-independent.
  • Each row of the table has one grey box 809 that marks one or the other of the variables 'swipe threshold exceeded' 804 and 'button released' 806.
  • the grey box 809 indicates the action that signifies the end of the character selection cycle.
  • the character selection cycle terminates with a button release. In other words, if a button is released and a swipe threshold is not exceeded, then the selection cycle ends. (In an alternative embodiment, the selection cycle extends until the ETP expires for short presses, but that isn't necessary.)
  • the selection cycle may or may not immediately end.
  • swipes that exceed the swipe threshold before the ETP expires cause the selection cycle to immediately end, but swipes that exceed the swipe threshold after the ETP expires do not cause the selection cycle to end.
  • the button release ends the selection cycle. This enables the user to "undo" a SWIPE BPT, if they want, by swiping back to the position where the swipe gesture originated.
  • Figure 42 is a table 801 that lists the ways each position 242 of the menu 240 can be reached using the selection buttons 110 and the logic of Figure 40 for the user interface 150 of Figure 34.
  • the table 801 of the embodiment of Figure 42 also includes assigned characters 200 for each position of the menu 240.
  • the table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200.
  • the table of Figure 42 is just one possible embodiment of the user interface of Figure 34 and the methods of Figures 35, 36, 37 and 40, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
  • Figures 43 and 44 show examples of how a word 130 is composed according to the method 781 of Figure 40 and the embodiment of the user interface 150 of Figure 34.
  • the composed word 130 is 'flag'.
  • Each row of Figure 43 shows one or more ways in which a particular letter 200 of the word 130 could be composed from a 'button press value' 222 and a 'button press type' 224.
  • Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 according to the method 781 of Figure 40.
  • the variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires, and the same character becomes selected in either case.
  • variable 'calculation' 790 (sometimes referred to as 'math operation') is specified based on the BPT 224 according to the logic of the method 781 of Figure 40.
  • the variable 'calculated BPV' 228 (sometimes also referred to as 'total BPV') is the result of the calculation 790 and the assigned BPV 222 selected by the user.
  • the device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
  • the composed word 130 is 'bake'.
  • Each row of Figure 44 shows one or more ways in which a character 200 could be composed from a 'button press value' 222 and a 'button press type' 224.
  • Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 (sometimes referred to as math operations) according to the method 781 of Figure 40.
  • the variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe distance threshold may be exceeded before or after the ETP expires, and the same character becomes selected in either case.
  • variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 781 of Figure 40.
  • the variable 'calculated BPV' 228 is the result of the calculation 790 and the assigned BPV 222 selected by the user.
  • the device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
  • Figure 45 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1.
  • the device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100.
  • Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part of, or within proximity to, the menu 240 are the reference 258 and the offset scale 260.
  • the display 104, and specifically the menu 240, the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1.
  • the CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1.
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
  • the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-5, -2, 0, +2, +5, +8'.
  • the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
  • Figure 46 shows a table 801 that lists the ways each position 242 of the menu 240 can be reached using the logic of the method 781 of Figure 40 for the embodiment of the user interface 150 of Figure 45.
  • the table 801 includes assigned characters 200 for each position of the menu 240.
  • the table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200.
  • the table of Figure 46 is just one possible embodiment of the user interface of Figure 45 and the methods of Figures 35, 36, 37 and 40, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.

Abstract

Systems, devices and methods are disclosed for input of characters using button presses and swipe gestures, in particular time-dependent button presses and time-independent swipe gestures. In one embodiment, a swipe gesture identifies a selected character as the character in a menu position equal to the assigned value of a pressed selection button, plus the assigned value of the pressed selection button plus one. In a further embodiment, a time dependent button press incorporated in the swipe gesture identifies the same character, but identifies the selected character as the character in the menu position equal to twice the assigned value of the pressed selection button, plus one.

Description

METHOD OF CHARACTER IDENTIFICATION THAT USES
TIME DEPENDENT BUTTON PRESSES AND TIME INDEPENDENT SWIPE GESTURES
CROSS-REFERENCE TO RELATED APPLICATIONS This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/276,729, filed January 8, 2016, U.S. Provisional Patent Application No. 62/318,125, filed April 4, 2016 and U.S. Provisional Patent Application No.
62/334,702 filed May 11, 2016, the contents of which are incorporated herein by reference in their entireties. BACKGROUND
Technical Field
This description generally relates to the field of electronic devices and, more particularly, to user interfaces of electronic devices.
BRIEF SUMMARY
A computer processor-implemented method may be summarized as including detecting, by at least one computer processor, a swipe gesture from a first button of a plurality of selection buttons toward a second button of the plurality of selection buttons; and selecting, by at least one computer processor, a character based on assigned values of the first and second selection buttons. The character may be a member of a menu, the character may be identified by a value, and the assigned values of the first and second selection buttons may mathematically combine to equal the character's identifying value.
The method may further include in response to the detection of the swipe gesture, adding, by at least one computer processor, assigned values of the first and second buttons; and selecting the character based on a result of the adding.
The method may further include in response to the detection of the swipe gesture, subtracting, by at least one computer processor, assigned values of the first and second buttons; and selecting the character based on a result of the subtracting.
A first character of the at least two characters may occupy a position of the one-dimensional array identified by an assigned value of the first button and a second character of the at least two characters may occupy a position of the one-dimensional array identified by a sum of the assigned value of the first button and an assigned value of the second button. The first button may be assigned one of the button press values -3, -2, +2, and +3. The at least two characters may occupy positions of a one-dimensional array. The one-dimensional array may be a menu of 13 positions, each position may be populated by a character, and the plurality of buttons may include at least four buttons.
A computer processor-implemented method may be summarized as including identifying, by at least one computer processor, selection of at least a first button of a plurality of selection buttons and a second button of the plurality of selection buttons in response to a swipe gesture; and in response to the identification of at least the first button of the plurality of selection buttons and the second button of the plurality of selection buttons in response to the swipe gesture, selecting, by at least one computer processor, a character based on a mathematical combination of assigned values of the first and second selection buttons and a value assigned to the character. The mathematical combination may be a subtraction or addition of the assigned values of the first and second selection buttons based on the direction of the swipe gesture.
A computer processor-implemented method may be summarized as detecting a button press by a user. The button press engages one of a plurality of buttons, each button having a unique assigned value. In some embodiments, the assigned value of one button of the plurality of buttons is zero (e.g., a reference button), the assigned value of at least two buttons of the plurality of buttons is a positive integer (e.g., +2 and +3), and the assigned value of at least two other buttons of the plurality of buttons is a negative integer (e.g., -2 and -3).
In response to detection of the button press, a processor starts an elapsed time counter and stores a first value that corresponds to the assigned value of the pressed button. A second value is calculated from the first value based on whether the button is released before or after a given elapsed time period expires and based on whether a positional movement of the button press is completed before or after the elapsed time period expires. In response to release of the pressed button before expiration of a given elapsed time period, the second value is set equal to the first value. In response to completion of a positional movement prior to expiration of the given elapsed time period, the second value is set equal to a combination of the first value and a third value. The third value corresponds to the assigned value of a second button positioned adjacent to the first pressed button.
In some embodiments, where the first value is positive and the direction of the positional movement is away from a reference button (e.g., zero), the second value is set equal to the first value plus the third value (e.g., +2 (assigned value of first pressed button) plus +3 (assigned value of button adjacent to the first button)). In other embodiments, where the first value is positive but the direction of the positional movement is toward the reference button, the second value is set equal to the first value minus the third value, which is the assigned value of the button adjacent to the first pressed button but closer to the reference (e.g., +3 (assigned value of the first pressed button) minus +2). In yet another embodiment, where the first value is negative and the direction of the positional movement is away from the reference button, the second value is set equal to the first value plus the third value (e.g., -2 (assigned value of first pressed button) plus -3 (assigned value of button adjacent to the first pressed button but farther from the reference)). In yet another embodiment, where the first value is negative but the direction of the positional movement is toward the reference button, the second value is set equal to the first value minus the third value, which is the assigned value of the button adjacent to the first pressed button but closer to the reference (e.g., -3 (assigned value of first pressed button) minus -2).
In response to detection that the first button press extends beyond the given elapsed time period without completion of a positional movement, the second value is set equal to twice the first value. But in response to completion of a positional movement after the expiration of the given elapsed time period, the second value is set equal to a combination of a fourth value with the second value, where the second value equals twice the first value. In some embodiments, the fourth value is equal to one.
In some embodiments, where the first value is positive and the direction of the positional movement is away from the reference button, the second value is set equal to twice the first value plus the fourth value (e.g., (2 * 2) + 1). In some other embodiments, where the first value is positive but the positional movement is in the direction toward the reference button, the second value is set equal to twice the first value minus the fourth value (e.g., (2 * 2) - 1). In other embodiments, where the first value is negative and the direction of the positional movement is away from the reference button, the second value is set equal to twice the first value minus the fourth value (e.g., (2 * (-2)) - 1). In yet other embodiments, where the first value is negative but the direction of the positional movement is toward the reference button, the second value is set equal to twice the first value plus the fourth value (e.g., (2 * (-2)) + 1).
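The sign and direction cases above can be collected into one hedged sketch. The function and argument names are mine, not the disclosure's; `third` is the assigned value of the button adjacent to the pressed button in the swipe direction, and it plays no role once the elapsed time period has expired:

```python
def second_value(first, third, fourth, expired, moved, away):
    """Compute the second (identifying) value from a button press.

    first   - assigned value of the pressed button
    third   - assigned value of the adjacent button in the swipe direction
    fourth  - small constant combined after the time period expires (e.g., 1)
    expired - True if the elapsed time period expired before release
    moved   - True if a positional movement (swipe) was completed
    away    - True if the movement was away from the reference button
    """
    if not expired:
        if not moved:
            return first                          # plain short press
        # Swipe completed before the time period expired.
        return first + third if away else first - third
    if not moved:
        return 2 * first                          # long press, no swipe
    # Swipe completed after the time period expired: combine the fourth
    # value so that moving away from the reference grows the magnitude.
    increment = fourth if first >= 0 else -fourth
    return 2 * first + (increment if away else -increment)
```

For example, `second_value(2, 3, 1, False, True, True)` returns 5, matching the "+2 plus +3" case, and `second_value(-2, 0, 1, True, True, False)` returns -3, matching "(2 * (-2)) + 1".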
In one embodiment, each of a plurality of characters is identified by a unique second identifying value. In a further embodiment, in response to release of the first button press or completion of the positional movement, a character from the plurality of characters is selected based on the selected character's identifying value and the calculated second value.
Mobile text input is notoriously slow, inaccurate and inconvenient. To make text input easier, a novel computer-processor-implemented method and interface are proposed that reduce the dexterity needed to type. The interface eases text input by providing large selection buttons and enabling selection gestures that are resistant to input errors but still intuitive.
To select a character, a user slides a cursor from the middle position of a menu row to a desired character. To do so, a user presses the selection button that corresponds with the row and position of a color block that contains the desired character.
Each color block identifies two characters. By default, the cursor selects the first character it encounters. To select the second character, a user types the first character and lets a correction algorithm exchange the first character for the second one, if needed.
To select a character outside a color block, a user drags the cursor one position beyond its associated color block by including a swipe gesture with the button press.
When the user presses the spacebar, the previously mentioned correction algorithm launches. The algorithm analyzes the previously entered word, exchanging first characters for second ones as needed to identify the intended word in a dictionary. If an intended word is not in the dictionary, a user manually enters the second characters by pressing the button that selects the first character and maintaining the button press for longer than a pre-determined time threshold, such as 0.2 seconds.
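The correction step can be sketched as a search over keep-or-exchange choices for each entered character. This is only one plausible realization; the `pairs` map (first character to its color-block partner) and the `dictionary` set are hypothetical placeholders, not part of the original disclosure:

```python
from itertools import product

def correct_word(typed, pairs, dictionary):
    """Exchange first characters for their color-block partners, as
    needed, until the entered word matches a dictionary word."""
    if typed in dictionary:
        return typed
    # For each character, try both the typed character and its partner.
    options = [(ch, pairs[ch]) if ch in pairs else (ch,) for ch in typed]
    for candidate in product(*options):
        word = "".join(candidate)
        if word in dictionary:
            return word
    return typed   # no dictionary match; keep the input as entered
```

The search is exponential in word length, which is acceptable here because typed words are short and each position has at most two choices.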
After each character selection, the cursor returns to the menu's middle position. Within a few minutes of use, a user recognizes the pattern for selecting each character. From that point on, the cursor is a redundant affordance and can be disabled.
In the context of text input, robustness refers to the level of dexterity that an interface requires of its users for them to achieve the output they expect. In the case of the proposed method, robustness is judged in terms of the positional and temporal accuracy required. The proposed interface requires temporal accuracy because time-dependent button presses distinguish characters within a color block.
To avert errors due to positional inaccuracy, the interface provides fewer but larger selection buttons. The larger buttons reduce the rate of positional selection errors compared with a typical 26-button interface. To avert errors due to temporal inaccuracy, the interface provides a novel system for interpreting input gestures. The system includes logic that enables button presses to be time-dependent while keeping swipe gestures time-independent, and an algorithm that corrects time-dependent button press errors very accurately.
To enable the desired logic, the system identifies selection buttons and menu positions with values. Menu positions of each row are numbered consecutively with '0' at the middle position. Selection buttons corresponding to each menu row have assigned values; in one embodiment the assigned values are -5, -2, 0, 2, and 5, with '0' assigned to the middle button.
The system uses two variables: button press value (BPV) and button press type (BPT). BPV is an integer variable that stores the value of a pressed selection button. BPT is a string variable that represents one of three possible classifications of a received input gesture— SHORT press, LONG press, or SWIPE gesture.
To specify a character, the system interprets an input gesture and identifies its BPT. The BPT is interpreted from the input gestures that move the cursor. Based on the BPT, the system applies one or more simple calculations to the BPV. At the end of a character selection cycle, the system specifies the character of the menu that corresponds with the row of the pressed selection button and the value of the BPV variable.
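As a concrete sketch of a single selection cycle: only the button values -5, -2, 0, 2, and 5 come from the text; the menu row contents and the calculations attached to the LONG and SWIPE types are assumptions (taken from the x + 1 and x + 2 examples later in the description), shown for buttons on the positive side of the reference only:

```python
BUTTON_VALUES = [-5, -2, 0, 2, 5]   # assigned button values from the text

# Hypothetical 15-character menu row; the middle character is position 0.
MENU_ROW = list("abcdefghijklmno")
MIDDLE = len(MENU_ROW) // 2         # index of menu position 0

def select_character(button_value, bpt):
    """Apply the calculation for the final BPT to the button press value
    (BPV), then return the menu character at the resulting position."""
    bpv = button_value
    if bpt == "LONG":
        bpv += 1          # assumed LONG calculation
    elif bpt == "SWIPE":
        bpv += 2          # assumed SWIPE calculation
    return MENU_ROW[MIDDLE + bpv]
```

With this layout, pressing the button assigned +2 yields the character at position 2 for a SHORT press, position 3 for a LONG press, and position 4 for a SWIPE.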
To determine a value for BPT, a gesture interpreter monitors two button press metrics: button press duration and swipe distance. Duration is a measure of the length of a button press from onset to release. Swipe distance is the length of any positional displacement during the button press.
A combination of measurements for duration and distance over the course of a character selection cycle can be represented as a response curve and plotted in a two-dimensional plot.
In the plot, button press duration is plotted on one axis and swipe distance on another. Each curve represents the progress of a button press as it unfolds. Onset of a button press occurs at the plot's origin. Its release occurs at a curve's terminus. The path that a curve follows through the plot reflects the duration and swipe distance of a received button press.
The response of each metric is converted to a binary output using a threshold value. A threshold value is an arbitrary value that enables the analog output from the metric to be recast as a binary output, i.e., a high or low value. Applying threshold values for each metric to the plot divides the plot into four regions. Each region represents a unique combination of the binary output values from the metrics.
The gesture interpreter determines a BPT based on the combination of binary values it interprets. Because the metrics may cross threshold values during a character selection cycle, the BPT may evolve during the selection cycle. The final determination for BPT occurs when the button press is lifted. The region of the plot in which a curve terminates identifies the final BPT for the selection cycle. Examples of typical threshold values are 200 msec for the time response and 25 pixels for the distance response.
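A minimal sketch of the interpreter's final classification at button release, using the example threshold values from the text (200 msec, 25 pixels); the function name is mine:

```python
TIME_THRESHOLD_MS = 200   # example time threshold from the text
DIST_THRESHOLD_PX = 25    # example swipe distance threshold from the text

def classify_gesture(duration_ms, swipe_distance_px):
    """Recast each analog metric as a binary value against its threshold,
    then map the combination to a final BPT at button release."""
    swiped = swipe_distance_px >= DIST_THRESHOLD_PX
    long_press = duration_ms >= TIME_THRESHOLD_MS
    if swiped:
        return "SWIPE"    # a qualifying swipe wins regardless of duration
    return "LONG" if long_press else "SHORT"
```

The two regions of the plot in which the swipe distance exceeds its threshold both map to SWIPE, which is what makes swipe input time-independent.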
The purpose for identifying a BPT is to specify a calculation that, in turn, specifies a total BPV and then a character.
At a high level, the proposed interface uses a set of input gestures to specify a character based on its position in a menu. In practice, that's realized as a system that uses BPT to specify a calculation that converts the assigned value of a pressed selection button to a value that in turn identifies the position of a desired character in a menu. The value that identifies the position of a character in the menu is the 'total BPV'.
Each BPT has an associated calculation (or math operation). Whenever the path of a character selection cycle crosses a threshold, the calculation associated with the newly entered BPT region is applied to the BPV variable.
As the plot shows, the number of calculations that are executed depends on the path that the character selection cycle follows. The number of calculations ranges from one (BPT = SHORT, so BPV = x) to three (BPT = SWIPE via LONG, so BPV = x + 1 + 1), where x is the assigned value of the pressed selection button.
The value of BPV at the end of the selection cycle is the total BPV. Once the total BPV is calculated, the system inputs the character of the menu that corresponds with it.
An intentional result of the system's design is that the total BPV of a swipe gesture that exceeds the swipe distance threshold is the same irrespective of when the user completes the gesture.
This can be seen from the math operations associated with each BPT and by following the curves that lead to the SWIPE BPT in a two-dimensional plot of the curves. For example, along one path the cumulative calculation for BPV may equal x + 2. Along another path the cumulative calculation for BPV may equal x + 1 + 1, an outcome that is mathematically the same as the first path.
Because a button activation that exceeds the swipe distance threshold yields the same total BPV along either path, input that exceeds the swipe distance threshold is immune from errors due to when the input gesture is completed.
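The path independence can be checked with a small event-driven sketch. The per-crossing increments (+1 when the time threshold is crossed, +2 for a swipe from the SHORT region, +1 for a swipe from the LONG region) are taken from the x + 2 and x + 1 + 1 examples; the function itself is an illustration, not the disclosed implementation:

```python
def total_bpv(x, crossings):
    """Accumulate the BPV as a selection cycle crosses threshold
    boundaries.  x is the assigned value of the pressed button;
    crossings is the ordered list of thresholds crossed."""
    bpv, in_long = x, False
    for crossing in crossings:
        if crossing == "time":
            bpv += 1          # entered the LONG region
            in_long = True
        elif crossing == "distance":
            bpv += 1 if in_long else 2   # entered the SWIPE region
    return bpv

# A fast swipe and a slow swipe reach the same total BPV.
assert total_bpv(2, ["distance"]) == total_bpv(2, ["time", "distance"]) == 4
```

Because both swipe paths converge on the same total, the character selected by a swipe does not depend on how long the user takes to complete it.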
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
Figure 1 is a schematic view of an example electronic device for input of characters with time-dependent button presses and time-independent swipe gestures according to one illustrated embodiment, the electronic device being a mobile device having a housing, a display, a graphics engine, a central processing unit (CPU), user input device(s), one or more storage mediums having various software modules thereon that are executable by the CPU, input/output (I/O) port(s), network interface(s), wireless receiver(s) and transmitter(s), a power source, an elapsed time counter, an integer value counter and a swipe gesture interpreter.
Figure 2 is a schematic drawing of one embodiment of the electronic device 100 for input of characters. The user interface 150 was previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
Figure 3 is graphical representations of examples of various button press types.
Figure 4 is a schematic drawing of a portion of one embodiment of the electronic device 100 for input of characters.
Figure 5 is a schematic drawing of a portion of another embodiment of the electronic device 100 for input of characters.
Figure 6 is a table of assignments for one embodiment of a method for specifying a character from among a plurality of characters. Figure 7 is a table of example values for variables for one embodiment of a method for specifying a character from among a plurality of characters.
Figure 8 is a schematic drawing of a portion of another embodiment of the electronic device 100 for input of characters.
Figure 9 is a table of example values for variables for one embodiment of a method for specifying a character from among a plurality of characters.
Figure 10 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
Figure 11 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
Figure 12 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
Figure 13 is flow diagrams that show variables and values for an embodiment of a method for an electronic device to interpret button presses.
Figure 14 is an example of an application of a method of character identification.
Figure 15 is another example of an application of a method of character identification.
Figure 16 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
Figure 17 is a table of value assignments for another embodiment of a method of character identification.
Figure 18 is graphical representations of additional examples of various button press types.
Figure 19 is a flow diagram that shows another method for an electronic device to interpret button presses according to one illustrated embodiment.
Figure 20 is a flow diagram that shows yet another method for an electronic device to interpret button presses according to one illustrated embodiment.
Figure 21 is a flowchart that shows another method that interprets button press and swipe input gestures to select a character from a menu according to one illustrated embodiment.
Figure 22 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
Figure 23 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment. Figure 24 is a table of possible values and variables for a method of interpreting input according to one set of selection buttons.
Figure 25 is a table of possible values and variables for a method of interpreting input according to another set of selection buttons.
Figure 26 is another flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
Figure 27 is a table of possible variable combinations for a method of interpreting button presses according to one illustrated embodiment.
Figure 28 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
Figure 29 is an example of an application of a method of character identification.
Figure 30 is another example of an application of a method of character identification.
Figure 31 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
Figure 32 is another table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
Figure 33 is a plot of graphical representations of examples of possible button input gestures as a function of time and positional displacement.
Figure 34 is a schematic drawing of one embodiment of the electronic device 100 for input of characters. The user interface 150 was previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
Figure 35 is a plot of graphical representations of possible examples of responses of input gestures.
Figure 36 is a flow diagram that shows a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
Figure 37 is a flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment.
Figure 38 is a table of possible values and variables for a method of interpreting input according to one set of selection buttons.
Figure 39 is a table of possible values and variables for a method of interpreting input according to another set of selection buttons.
Figure 40 is another flow diagram that shows a method for an electronic device to interpret button presses according to one illustrated embodiment. Figure 41 is a table of possible variable combinations for a method of interpreting button presses according to one illustrated embodiment.
Figure 42 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
Figure 43 is an example of an application of a method of character identification.
Figure 44 is another example of an application of a method of character identification.
Figure 45 is a schematic drawing of another embodiment of the electronic device 100 for input of characters.
Figure 46 is another table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
DETAILED DESCRIPTION
In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with computing systems including client and server computing systems, as well as networks, including various types of telecommunications networks, have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and claims which follow, the word "comprise" and variations thereof, such as "comprises" and "comprising," are to be construed in an open, inclusive sense, that is, as "including, but not limited to."
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the content clearly dictates otherwise. It should also be noted that the term "or" is generally employed in its sense including "and/or" unless the content clearly dictates otherwise.
The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
Various embodiments are described herein that provide systems, devices and methods for input of characters with optional time-dependent button presses.
For example, Figure 1 is a schematic view of one example electronic device, in this case mobile device 100, for input of characters with optional time-dependent button presses according to one illustrated embodiment. The mobile device 100 shown in Figure 1 may have a housing 102, a display 104, a graphics engine 106, a central processing unit (CPU) 108, one or more user input devices 110, one or more storage mediums 112 having various software modules 114 stored thereon comprising instructions that are executable by the CPU 108, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, one or more network interfaces 120, and a power source 122. In some embodiments, some or all of the same, similar or equivalent structure and functionality of the mobile device 100 shown in Figure 1 and described herein may be that of, part of or operably connected to a communication and/or computing system of another device or machine.
The mobile device 100 may be any of a large variety of devices such as a cellular telephone, a smartphone, a wearable device, a wristwatch, a portable media player (PMP), a personal digital assistant (PDA), a mobile communications device, a portable computer with built-in or add-on cellular communications, a portable game console, a global positioning system (GPS), a handheld industrial electronic device, a television, an automotive interface, an augmented reality (AR) device, a virtual reality (VR) device or the like, or any combination thereof. The mobile device 100 has at least one central processing unit (CPU) 108 which may be a scalar processor, a digital signal processor (DSP), a reduced instruction set (RISC) processor, or any other suitable processor. The central processing unit (CPU) 108, display 104, graphics engine 106, one or more user input devices 110, one or more storage mediums 112, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, and one or more network interfaces 120 may all be communicatively connected to each other via a system bus 124. The system bus 124 can employ any suitable bus structures or architectures, including a memory bus with memory controller, a peripheral bus, and/or a local bus.
The mobile device 100 also includes one or more volatile and/or nonvolatile storage medium(s) 112. The storage mediums 112 may be comprised of any single or suitable combination of various types of processor-readable storage media and may store instructions and data acted on by CPU 108. For example, a particular collection of software instructions comprising software 114 and/or firmware instructions comprising firmware are executed by CPU 108. The software or firmware instructions generally control many of the operations of the mobile device 100 and a subset of the software and/or firmware instructions may perform functions to operatively configure hardware and other software in the mobile device 100 to provide the initiation, control and maintenance of applicable computer network and
telecommunication links from the mobile device 100 to other devices using the wireless receiver(s) and transmitter(s) 118, network interface(s) 120, and/or I/O ports 116.
The CPU 108 includes an elapsed time counter 140. The elapsed time counter 140 may be implemented using a timer circuit operably connected to or as part of the CPU 108. Alternately some or all of the elapsed time counter 140 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112, for example, that when executed by CPU 108 or a processor of a timer circuit, performs the functions described herein of the elapsed time counter 140.
The CPU 108 includes an integer value counter (also called button press value counter) 142. Alternately, some or all of the integer value counter 142 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112, for example, that when executed by CPU 108, performs the functions described herein of the integer value counter 142.
The CPU 108 includes a swipe gesture interpreter 144. Alternately, some or all of the swipe gesture interpreter 144 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112, for example, that when executed by CPU 108, performs the functions described herein of the swipe gesture interpreter 144.
By way of example, and not limitation, the storage medium(s) 112 may be processor-readable storage media which may comprise any combination of computer storage media including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Combinations of any of the above should also be included within the scope of processor-readable storage media.
The storage medium(s) 112 may include system memory which includes computer storage media in the form of volatile and/or nonvolatile memory such as read-only memory (ROM) and random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within mobile device 100, such as during start-up or power-on, is typically stored in ROM. RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by CPU 108. By way of example, and not limitation, Figure 1 illustrates software modules 114 including an operating system, application programs and other program modules that implement the processes and methods described herein.
The mobile device 100 may also include other removable/non-removable, volatile/nonvolatile computer storage media drives. By way of example only, the storage medium(s) 112 may include a hard disk drive or solid state storage drive that reads from or writes to non-removable, nonvolatile media, a SSD that reads from or writes to a removable, nonvolatile SSD, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a DVD-RW or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in an operating environment of the mobile device 100 include, but are not limited to, flash memory cards, other types of digital versatile disks (DVDs), micro-discs, digital video tape, solid state RAM, solid state ROM, and the like. The storage medium(s) are typically connected to the system bus 124 through a non-removable memory interface. The storage medium(s) 112 discussed above and illustrated in Figure 1 provide storage of computer readable instructions, data structures, program modules and other data for the mobile device 100. In Figure 1, for example, a storage medium may store software 114 including an operating system, application programs, other program modules, and program data. The storage medium(s) 112 may implement a file system, a flat memory architecture, a database, or any other method or combination capable for storing such information.
A user may enter commands and information into the mobile device 100 through touch screen display 104 or the one or more other input device(s) 110 such as a keypad, keyboard, tactile buttons, camera, motion sensor, position sensor, light sensor, biometric data sensor, accelerometer, or a pointing device, commonly referred to as a mouse, trackball or touch pad. Other input devices of the mobile device 100 may include a microphone, joystick, thumbstick, game pad, optical scanner, other sensors, or the like. Furthermore the touch screen display 104 or the one or more other input device(s) 110 may include sensitivity to swipe gestures, such as a user dragging a finger tip across the touch screen display 104. The sensitivity to swipe gestures may include sensitivity to direction and/or distance of the swipe gesture. These and other input devices are often connected to the CPU 108 through a user input interface that is coupled to the system bus 124, but may be connected by other interface and bus structures, such as a parallel port, serial port, wireless port, game port or a universal serial bus (USB). Generally, a unique software driver stored in software 114 configures each input mechanism to sense user input, and then the software driver provides data points that are acted on by CPU 108 under the direction of other software 114. The display is also connected to the system bus 124 via an interface, such as the graphics engine 106. In addition to the display 104, the mobile device 100 may also include other peripheral output devices such as speakers, a printer, a projector, an external monitor, etc., which may be connected through one or more analog or digital I/O ports 116, network interface(s) 120 or wireless receiver(s) and transmitter(s) 118. The mobile device 100 may operate in a networked environment using connections to one or more remote computers or devices, such as a remote computer or device.
When used in a LAN or WAN networking environment, the mobile device 100 may be connected via the wireless receiver(s) and transmitter(s) 118 and network interface(s) 120, which may include, for example, cellular receiver(s) and transmitter(s), Wi-Fi receiver(s) and transmitter(s), and associated network interface(s). When used in a WAN networking environment, the mobile device 100 may include a modem or other means as part of the network interface(s) for establishing
communications over the WAN, such as the Internet. The wireless receiver(s) and transmitter(s) 118 and the network interface(s) 120 may be communicatively connected to the system bus 124. In a networked environment, program modules depicted relative to the mobile device 100, or portions thereof, may be stored in a remote memory storage device of a remote system.
The mobile device 100 has a collection of I/O ports 116 and/or short range wireless receiver(s) and transmitter(s) 118 and network interface(s) 120 for passing data over short distances to and from the mobile device 100 or for coupling additional storage to the mobile device 100. For example, serial ports, USB ports, Wi-Fi ports, Bluetooth® ports, IEEE 1394 (i.e., FireWire), and the like can
communicatively couple the mobile device 100 to other computing apparatuses.
Compact Flash (CF) ports, Secure Digital (SD) ports, and the like can couple a memory device to the mobile device 100 for reading and writing by the CPU 108 or couple the mobile device 100 to other communications interfaces such as Wi-Fi or Bluetooth transmitters/receivers and/or network interfaces.
Mobile device 100 also has a power source 122 (e.g., a battery). The power source 122 may supply energy for all the components of the mobile device 100 that require power when a traditional, wired or wireless power source is unavailable or otherwise not connected. Other various suitable system architectures and designs of the mobile device 100 are contemplated and may be utilized which provide the same, similar or equivalent functionality as those described herein.
It should be understood that the various techniques, components and modules described herein may be implemented in connection with hardware, software and/or firmware or, where appropriate, with a combination of such. Thus, the methods and apparatus of the disclosure, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as various solid state memory devices, DVD-RW, RAM, hard drives, flash drives, or any other machine-readable or processor-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a processor of a computer, vehicle or mobile device, the machine becomes an apparatus for practicing various embodiments. In the case of program code execution on programmable computers, vehicles or mobile devices, such generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an API, reusable controls, or the like. Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system of mobile device 100. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
Figure 2 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1. The device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 of a character menu 240, a plurality of selection buttons 110 and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is a reference 258 and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1. The CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the
embodiment of Figure 1.
In the embodiment of Figure 2, the positions 242 of the menu 240 are arranged in a one-dimensional array similar to the embodiment in Figure 8 of Patent No. 8,487,877, except that the menu 240 and corresponding selection buttons 110 are shown on the display 104 instead of as physical features of the user interface 150. The buttons 110 are communicatively coupled with the CPU 108.
The menu 240 and the offset scale 260 are positioned in respective one-dimensional arrays in the user interface region 150 of the device 100. In one embodiment the character menu 240 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another. In one
embodiment, the character menu 240 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100.
In one embodiment, positions 242 of the menu 240 are distributed in a one-dimensional array in evenly spaced increments. In a further embodiment, values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that match the increment of the menu 240, so that by referencing the offset scale 260 to the menu 240, characters 200 in the menu are effectively numbered.
The reference 258 is an indicator located near or on one of the positions 242 of the menu 240. The offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240. Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero. In an alternative embodiment, the increment of the offset scale 260 is 10 and positions 242 of the menu 240 are marked off in corresponding units of 10.
In one specific embodiment, the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240, and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of the menu 240 according to how many positions a given position 242 of the menu 240 is offset from the reference 258.
The plurality of selection buttons 110 lie on the display 104 of the user interface 150 of the device 100. In one embodiment, the buttons 110 are arranged in a row that corresponds to the physical alignment of the menu 240 on the user interface. Each button is communicatively coupled with the CPU 108 and is assigned a button press value 222. The assigned button press value 222 can be either positive or negative. Each button 110 has the function that when the button is pressed the value 222 assigned to the button is input to the CPU 108. Furthermore, each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is input to the CPU 108 along with identification of one or more mathematical operators and the assigned value 222 of one or more neighboring selection buttons. In an alternative embodiment, each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is doubled, then increased by one, and then input to the CPU 108.
In one embodiment, the assigned button press value 222 of each selection button is unique. In another embodiment there are four selection buttons and the buttons' assigned values are -3, -2, +2, and +3. In another embodiment there are four selection buttons and the buttons' assigned values are -3, -1, +1, and +3. In yet another embodiment there are five selection buttons and the buttons' assigned values are -3, -2, 0, +2, and +3.
In one embodiment, the one or more mathematical operators identified by the swipe gesture are addition and/or subtraction. In a further embodiment, the identified math operators are specified by the direction of the swipe gesture. In another embodiment, the identified math operators are specified by the distance (or length) of the swipe gesture. In a further embodiment, an assigned integer value of a neighboring button is mathematically combined with the integer value identified by the button press, wherein the neighboring button and mathematical operator are identified by the direction of the swipe.
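The press-plus-swipe combination described above can be illustrated in code. The following is a sketch only, not text from the specification: the four-button layout is the disclosed '-3, -2, +2, +3' embodiment, while the function name, the direction convention ('inward' meaning toward the center of the button row), and the operator assignments (subtraction for an inward swipe, addition for an outward swipe) are assumptions drawn from the embodiments described here.

```python
# Hypothetical sketch: combine a pressed button's assigned value 222 with a
# neighboring button's value 226, where the swipe direction selects both the
# neighbor and the math operator. Layout and names are assumptions.

BUTTON_VALUES = [-3, -2, +2, +3]  # one disclosed four-button embodiment

def combine_with_neighbor(pressed_value, direction):
    i = BUTTON_VALUES.index(pressed_value)
    # 'inward' points toward the center of the button row: rightward for the
    # negative-valued buttons, leftward for the positive-valued ones.
    step = 1 if pressed_value < 0 else -1
    if direction == 'outward':
        step = -step
    j = i + step
    if not 0 <= j < len(BUTTON_VALUES):
        raise ValueError("no neighboring button in that direction")
    neighbor = BUTTON_VALUES[j]
    if direction == 'inward':
        return pressed_value - neighbor   # 'inward' -> subtraction operator
    return pressed_value + neighbor       # 'outward' -> addition operator
```

Under these assumptions, pressing the -3 button and swiping inward yields -3 - (-2) = -1, and pressing the +2 button and swiping outward yields +2 + (+3) = +5.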
The spacebar 264 also lies in the user interface region 150 of the device 100, can be either a hard or soft key, and is communicatively coupled with the CPU 108.
In one embodiment of Figure 2, the menu 240 has 13 menu positions 242 and the plurality of selection buttons includes four buttons with the assigned button press values 222: '-3, -2, +2, +3'. In a further embodiment, the menu positions 242 are populated by 13 of the 26 characters 200 of the English alphabet.
Figure 3 shows graphical representations of examples of four button press types (BPTs) 224. Button press types are a means of classifying a received button press. In one embodiment a button press is classified according to a measured duration 208 and a swipe distance 372. As disclosed in U.S. Provisional Patent Application No. 62/155,372, filed April 30, 2015, entitled SYSTEMS AND METHODS FOR WORD IDENTIFICATION THAT USE BUTTON PRESS TYPE ERROR ANALYSIS
(Attorney Docket No. 680065.406P1), which is hereby incorporated by reference in its entirety, in one embodiment a button press type is assigned a mathematical operator. In a further embodiment the interpretation of a button press type determines the character selected from a menu according to the character's position in the menu and the mathematical operator assigned to the button press type.
Each example button press type 224 of Figure 3 has a first and second horizontal bar 326, 331. The first horizontal bar 326 represents the passage of time. The second horizontal bar 331 represents the distance (or length) of a positional translation. In the case of a pair BPT 350, there is a first and second horizontal bar 326, 331 for each button press of the pair.
For the first bar 326, a black region 327 within the bar indicates a time period when a button is pressed. A white region 328 indicates a time period when a button is not pressed. A solid vertical marker 329 indicates the beginning or end of an elapsed time period (ETP) metric 330. The elapsed time period 330 is of selectable length and commences with the onset of a button press. The length of the black region 327 indicates the duration 208 of the button press. To classify a button press, the duration of the button press is compared with the elapsed time period metric 330. In one embodiment, the selected length of the elapsed time period 330 is 0.10 seconds.
For the second bar 331, the black region 327 within the bar indicates a distance (or length) 372 of a positional translation of a button press along the touch sensitive screen 104. In a further embodiment, the positional translation is measured starting from the position of the button press at the onset of the press. The white region 328 within the bar indicates a distance from the starting position that the positional translation of the button press has not reached. The solid vertical markers 329 indicate the beginning or end of a swipe distance metric 373. The swipe distance metric 373 is of selectable length and commences at the position of the button press on the screen 104 at the button press' onset. The distance (or length) 372 of a positional translation of a button press is compared with the swipe distance metric 373. In one embodiment, the selected length of the swipe distance metric 373 is 20 pixels.
In one embodiment, a device 100 monitors the duration 208 and swipe distance 372 of a button press concurrently. In other words the elapsed time
counter 140 measures the duration 208 of the button press and the swipe gesture interpreter 144 measures its swipe distance 372. In a further embodiment, the device monitors the duration 208 starting at the onset of a button press and monitors the swipe distance 372 starting at the position of the button press at the onset of the press, so that the measurement of duration and distance begin simultaneously.
The button press types 224 of Figure 3 are distinguished from one another by (1) the button press duration 208 with respect to the ETP metric 330 and (2) the swipe distance 372 with respect to the swipe distance metric 373.
For example, for a short BPT 340 both the duration 208 and the swipe distance 372 are less than their respective metrics. In another example, for a long BPT 345 the duration 208 exceeds its metric while the swipe distance 372 does not.
For the example of the pair BPT 350, a second button press is received before the ETP 330 expires, while the swipe distance of neither button press exceeds the swipe distance metric 373.
For the example of a swipe BPT 357, the swipe distance 372 exceeds its metric while the duration 208 does not. Note that for the example of the swipe BPT, the rate of the positional translation must be fast enough that the swipe distance 372 exceeds the distance metric 373 before the elapsed time period 330 expires. Otherwise, the gesture could be interpreted as the short BPT 340. For the example swipe BPT 357 of Figure 3, the elapsed time period equals 0.1 sec and the swipe distance metric equals 20 pixels. In this case, the rate of the positional translation must be at least 20 pixels / 0.1 seconds (200 pixels per second) for the gesture to avoid being interpreted as a short BPT 340 and instead be interpreted as the swipe BPT 357.
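The classification rules above can be sketched as a single function. This is an illustrative sketch, not the patent's own code: the metric values (0.10 s, 20 pixels) are the example values given in the text, while the function and constant names are assumptions. The case where both duration and swipe distance exceed their metrics is disclosed separately below and is folded into LONG here for simplicity.

```python
# Hypothetical sketch of the four button press type (BPT) classifications of
# Figure 3, using the example metric values given in the text.

ETP_SECONDS = 0.10    # elapsed time period metric 330
SWIPE_PIXELS = 20     # swipe distance metric 373

def classify_bpt(duration, swipe_distance, second_press_before_etp=False):
    if second_press_before_etp:
        return 'PAIR'                 # second press arrives before ETP expires
    if swipe_distance >= SWIPE_PIXELS and duration < ETP_SECONDS:
        return 'SWIPE'                # translation beats the distance metric in time
    if duration >= ETP_SECONDS:
        return 'LONG'                 # held past the ETP
    return 'SHORT'                    # released early, no qualifying translation
```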
A BPT not disclosed in Figure 3, but disclosed later on, is one where both the duration 208 and the swipe distance 372 exceed their respective metrics.
Figure 4 shows an embodiment of a portion of the user interface 150.
The portion shown includes the selection buttons 110 implemented as on-screen buttons on the touch display screen 104.
An example of an executed button press is represented by a button press position 321 on a particular selection button 111. From the position 321 of the button press the CPU 108 determines (1) which of the selection buttons 110 is selected (in this example, the button with assigned integer value -3) and (2) an initial point from which any positional translation of the button press is measured. Also, as described in U.S. Patent No. 8,487,877, the onset of the button press triggers the elapsed time counter 140 to start, which the CPU 108 uses to distinguish time-dependent BPTs.
Once the elapsed time counter starts, the CPU 108 measures the duration 208 of the button press (as shown in Figure 3). Furthermore, the swipe gesture interpreter 144 of the CPU 108 monitors a present position 323 of the button press relative to the initial position 321. In one embodiment, the gesture interpreter 144 monitors the distance (or length) 372 and a direction 374 of any positional translation that occurs.
In one embodiment, the swipe gesture interpreter 144 has programmable thresholds that (1) define whether the distance 372 of a given positional translation constitutes a swipe gesture and (2) classify the direction 374 of a positional translation, if one occurs. In one embodiment, a positional translation from the button press position 321 must occur within the bounds of a given sweep 375 to be considered a swipe gesture. In another embodiment, a positional translation must extend a minimum distance 373 from the initial position 321 to be considered a swipe gesture. In yet another embodiment, the two previously mentioned conditions must both be met for the positional translation to be considered a swipe gesture. In still another embodiment, the threshold for the minimum translation is the boundary of the button itself, or some position relative to the button's boundary.
In one embodiment, the distance 373 is measured in mm but in an alternative embodiment the distance is measured in pixels. In one embodiment the minimum distance 373 is 15 pixels, which for an example button size of 60 x 60 pixels is a translation of 25% of the button's width.
In one embodiment, there are multiple sweeps 375 within which a positional translation could become classified. In a further embodiment, the particular sweep 375 that a position translation falls within defines a direction 374 of the swipe gesture. In another embodiment, there are two possible sweeps 376, 377 for which a positional translation could be classified, one on each side of the initial press position 321. In a further embodiment, positional translations occurring within the opposing sweeps 376, 377 are classified as having a direction 374 opposite to one another, even if the exact positional translations that lead to their respective classification are not precisely parallel. In one embodiment, a sweep 375 spans 90 degrees, but in alternative embodiments the span of the sweep is less or more.
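The two-sweep classification described above can be sketched geometrically. This is a speculative illustration: the 15-pixel minimum distance and the 90-degree sweep span come from the text, but the angle convention, the axis parameter, and all names are assumptions, not the patent's own definitions.

```python
# Hypothetical sketch: classify a positional translation from the press
# position 321 as an 'inward' or 'outward' swipe using a minimum distance
# and two opposing 90-degree sweeps, as in Figure 4.
import math

MIN_DISTANCE = 15          # pixels, the example threshold from the text
SWEEP_HALF_ANGLE = 45.0    # a 90-degree sweep spans +/-45 deg about its axis

def classify_swipe(dx, dy, inward_axis_degrees=0.0):
    """dx, dy: translation from the initial press position.
    inward_axis_degrees: center axis of the 'inward' sweep (assumed)."""
    if math.hypot(dx, dy) < MIN_DISTANCE:
        return None                        # too short: not a swipe gesture
    angle = math.degrees(math.atan2(dy, dx))
    # angular difference to the inward axis, wrapped to [-180, 180)
    diff = (angle - inward_axis_degrees + 180.0) % 360.0 - 180.0
    if abs(diff) <= SWEEP_HALF_ANGLE:
        return 'inward'
    if abs(abs(diff) - 180.0) <= SWEEP_HALF_ANGLE:
        return 'outward'                   # the opposing sweep
    return None                            # outside both sweeps
```

A translation that is long enough but falls between the two sweeps (for example, straight up from the press position with a horizontal inward axis) is classified as neither direction.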
Figure 5 shows another embodiment of the user interface 150. The embodiment includes the selection buttons 110, the menu 240 and the values of the offset scale 260. The menu 240 further includes individual character positions 242 and the reference 258, all as previously described in Figure 2.
In the embodiment of Figure 5, the menu 240 is one-dimensional. In other words, positions 242 of the menu are arranged in a row, similar to elements of a one-dimensional array. Consistent with a one-dimensional array, each menu position is identified by an index, which in one embodiment is a value of the offset scale 260.
In still a further embodiment, a menu position 242 can be specified for selection by its index. Furthermore, a change in a selected menu position to a neighboring position of the menu is considered a movement of the selection. In one embodiment, movement of a selection from a menu position farther from the reference 258 to a neighboring menu position closer to the reference is a movement in an 'inward' direction 374. Furthermore, movement of a selection from a menu position closer to the reference 258 to a neighboring menu position farther from the reference is a movement in the 'outward' direction 374. In yet a further embodiment, the possible directions 374 of the swipe gestures of Figure 4 correspond with the possible directions 374 of movement of selections within the menu 240 of Figure 5.
In a further embodiment of the invention, the directions 374 of the swipe gestures of Figure 4 each have an associated math operator. Figure 6 shows a table that identifies mathematical operators 181 and their correspondence to a direction 374 of a swipe gesture. In one embodiment, an 'outward' swipe corresponds to an 'addition' math operator and an 'inward' swipe corresponds to a 'subtraction' math operator. In alternative embodiments, more than two swipe directions are possible and the mathematical operators associated with each swipe direction are different from those shown for the embodiment of Figure 6.
Figure 7 shows a table that discloses one possible relationship between
(1) the assigned integer value 222 of a pressed selection button 110, (2) the direction 374 of a swipe gesture, and (3) the math operator 181 that gets specified as a result of (1) and (2).
As previously described, the button press that initiates a swipe gesture also specifies for the CPU 108 two other pieces of information: (1) the selection button pressed, and thereby the assigned integer value 222 that the CPU stores in the integer value counter 142, and (2) the orientation (and consequently the direction 374) of a swipe gesture if one is interpreted, i.e., 'inward' or 'outward' depending on the selection button identified and the relationships of Figure 5. In the event that the swipe gesture interpreter 144 does interpret a swipe gesture, then a math operator 181 is specified according to the interpreted input for (1) and (2) above, and the table of Figure 6.
For a first example 190, in an embodiment consistent with the embodiments of Figures 4-6, a button press 321 on the selection button with assigned value -3 followed by a swipe gesture in the direction of menu position -2 according to Figure 5 is interpreted as a swipe in the 'inward' direction 374. The 'inward' swipe is then interpreted according to the table of Figure 6 as specifying the 'subtraction' math operator 181. In another example 191, a button press 321 on the selection button with assigned value +2 followed by a swipe gesture in the direction of menu position +3 according to Figure 5 is interpreted as a swipe in the 'outward' direction 374. The 'outward' swipe is interpreted according to the table of Figure 6 as specifying the 'addition' math operator 181.
The table of Figure 7 also shows a further relationship where the value of the button that neighbors the pressed button is mathematically combined with the value of the pressed button according to (1) the direction 374 of the swipe gesture, and (2) the math operator 181 specified by the swipe gesture. In one embodiment, the integer value 226 of the button neighboring the pressed selection button (on the side corresponding to the direction of the swipe gesture) is mathematically combined with the value 222 of the pressed selection button according to the math operator 181 specified by the direction of the swipe gesture.
In an extension of the first example 190 above, for a button press 321 on the selection button with assigned integer value -3 and a swipe gesture interpreted in the 'inward' direction 374, the integer value -2 (the integer value 226 of the selection button inward from the pressed selection button) is subtracted from the integer value -3, yielding the value -1. In an extension of the second example 191 above, for a button press 321 on the selection button with assigned integer value +2 and a swipe gesture interpreted in the 'outward' direction 374, the integer value +3 (the integer value 226 of the selection button outward from the pressed selection button) is added to the value +2, yielding the value +5.
Said another way, in a further embodiment of the invention, a button press and swipe gesture together specify positions 242 of the menu 240 that cannot be specified by the actuations of a single button. For the embodiments discussed so far, the positions -5, -1, +1 and +5 are examples where a button press and a swipe gesture identify positions that cannot be identified with the actuations of a single selection button (in this embodiment, the values -3, -2, +2 and +3). Figure 7 discloses other examples of button presses and swipe gestures that can be combined to specify positions 242 of the menu for the embodiments of Figures 4-6.
In a further embodiment, the integer value calculated from the math operation selects a character 200 of the menu 240 by specifying the offset value that identifies the character's position 242 in the menu. According to one embodiment, the first example 190 specifies the character 'f' and the second example 191 specifies the character 'l'.
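The final lookup step can be sketched for the 13-position embodiment. The specification does not spell out the menu's contents; the sketch below assumes the menu holds the letters 'a' through 'm' with the reference 258 at the center position (offset 0), an arrangement inferred from the worked examples in which offsets -1, +4 and +5 select 'f', 'k' and 'l'.

```python
# Hypothetical sketch: resolve a computed offset value to a character of the
# 13-position menu 240. The 'a'..'m' layout with the reference at 'g' is an
# assumption inferred from the worked examples, not stated in the text.

MENU = 'abcdefghijklm'              # 13 characters, offsets -6 .. +6
REFERENCE_INDEX = MENU.index('g')   # the offset scale's zero value

def character_at(offset):
    return MENU[REFERENCE_INDEX + offset]
```

Under that assumption, the first example's computed value -1 selects 'f' and the second example's value +5 selects 'l'.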
Figure 8 shows another embodiment of a portion of the user interface 150. The portion shown includes the selection buttons 110 implemented as on-screen buttons on the touch display screen 104.
An example of an executed button press is represented by a button press position 321 on a particular selection button 111. From the position 321 of the button press the CPU 108 determines (1) which of the selection buttons 110 is selected (in this example, the button with assigned integer value -3) and (2) an initial point from which any positional translation of the button press is measured. Also, as described in U.S. Patent No. 8,487,877, the onset of the button press triggers the elapsed time counter 140 to start, which the CPU 108 uses to distinguish time-dependent BPTs.
Once the elapsed time counter starts, the CPU 108 measures the duration 208 of the button press (as shown in Figure 3). Furthermore, the swipe gesture interpreter 144 of the CPU 108 monitors the present position 323 of the button press relative to the initial position 321. In an embodiment alternative to the one of Figure 4, the gesture interpreter 144 monitors a direction 374 of any positional translation that occurs and the intersection of the positional translation with any boundary 324 of any selection button 110.
In a further embodiment, the integer value counter 142 mathematically combines the assigned integer values 222 of each selection button 110 intersected during the course of the positional translation. In a further embodiment, the
mathematical operator applied for each math combination is selected according to the direction 374 of the swipe as the swipe enters a previously un-intersected selection button. In a third example 192, for the embodiment of Figure 8, the initial button press 321 causes the integer value counter 142 to store the value -3. Intersection of the positional translation with the button -2 by an 'inward' gesture subtracts the value -2 from the stored value -3. Intersection of the positional translation with the button 0 by a continued 'inward' gesture subtracts the value 0 from the stored value -1. Intersection of the positional translation with the button +2 is an 'outward' gesture and adds the value +2 to the stored value -1. Intersection of the positional translation with the button +3 by a continued 'outward' gesture adds the value +3 to the stored value +1. At the conclusion of the swipe gesture, the integer value counter 142 stores the value +4.
In alternative embodiments, the assigned values of the selection buttons and the mathematical operators associated with the direction of the position translation could be different than those given in the example.
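The accumulating swipe of Figure 8 can be sketched as follows. The five-button layout, the direction-dependent operators, and the arithmetic of example 192 come from the text; the function and variable names, and the rule that 'inward' travel is travel toward the center button, are illustrative assumptions.

```python
# Hypothetical sketch of the accumulating swipe of Figure 8 / example 192:
# the counter starts at the pressed button's value, and each newly
# intersected button is subtracted (entered 'inward') or added (entered
# 'outward') according to the direction of travel at its boundary.

BUTTONS = [-3, -2, 0, +2, +3]   # the five-button embodiment of Figure 8

def accumulate(pressed_index, path_indices):
    """path_indices: indices of buttons intersected, in order of intersection."""
    total = BUTTONS[pressed_index]
    prev = pressed_index
    center = len(BUTTONS) // 2
    for i in path_indices:
        # 'inward' travel moves toward the center button, 'outward' away
        inward = abs(i - center) < abs(prev - center)
        if inward:
            total -= BUTTONS[i]     # subtraction for an 'inward' entry
        else:
            total += BUTTONS[i]     # addition for an 'outward' entry
        prev = i
    return total
```

Replaying example 192, a press on the -3 button followed by a swipe across the -2, 0, +2 and +3 buttons leaves the counter at +4.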
Figure 9 shows a table that discloses one possible set of relationships between (1) the assigned integer value 222 that an initial button press 321 selects, (2) the assigned integer value 227 of a pressed selection button 110 where a translation of the initial button press ends, (3) the math operators 181 identified by the direction 374 of the positional translation at each intersection of the translation with a subsequent button, (4) the accumulated math operations executed by the integer value counter 142, and (5) the integer value calculated as a result of the accumulated math operations, which in a further embodiment specifies a position 242 of the character selection menu 240.
The third example 192 described in Figure 8 is disclosed in the fourth line of the table of Figure 9.
In a further embodiment, the integer value calculated from the accumulated math operations selects a character 200 of the menu 240 by specifying the offset value that identifies the character's position in the menu. In the third example 192, the button press and swipe gesture of Figure 8 specify the character 'k', according to one embodiment.
Figure 10 shows a flowchart of an embodiment of a method 504 for a user to specify a character from among a plurality of characters. In one step 510 of the method 504, a user views the characters 200 displayed in the menu 240. In another step 512, the user selects a character from the menu 240 for input to the electronic device 100. In another step 514, the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258. The user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position. In another step 516, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
If so, in another step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires. The aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press. In a subsequent step 520, the user waits for the elapsed time counter 140 to expire before, in an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
However, if the value that identifies the selected character's position 242 in the menu 240 is not equal to the assigned value of any selection button, then in an alternate step 536, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals twice the assigned button press value 222 of any selection button 110.
If so, in another step 540 the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires. The aforementioned step 540 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a LONG press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
However, if none of the values 222 assigned to the selection buttons 110 equals the selected character's position 242 or is half the selected character's position, in an alternate step 548 the user determines if the value that identifies the position of the selected character is the sum or the difference of the assigned button press values of any two adjacently positioned selection buttons.
If the user determines that the sum of the assigned values of two selection buttons identifies the position of the selected character, in a step 550 the user presses one of the two selection buttons and swipes from the pressed selection button toward the other button. In one embodiment, the user presses the button with the smaller assigned value and swipes toward the button with the larger assigned value. In an alternative embodiment, one button is positioned closer to an associated reference, the other button is positioned farther from the associated reference, and the user presses the button closer to the associated reference and swipes in an outward direction.
In still a further embodiment, the button assigned the smaller value is positioned closer to an associated reference, the button assigned the larger value is further from the reference, and the user swipes in an outward direction with respect to the reference.
If the user determines that the difference of the assigned values of two selection buttons identifies the position of the selected character, in a step 552 the user presses one of the two selection buttons and swipes from one selection button toward the other. In one embodiment, the user presses the button with the larger assigned value and swipes toward the button with the smaller assigned value. In an alternative embodiment, one button is positioned closer to an associated reference, the other button is positioned farther from the associated reference, and the user presses the button farther from the associated reference and swipes in an inward direction.
In still a further embodiment, the button assigned the smaller value is positioned closer to an associated reference, the button assigned the larger value is further from the reference, and the user swipes in an inward direction with respect to the reference.
In either of the aforementioned steps 550 or 552, the button press that initiates the swipe gesture inputs the assigned value 222 of the pressed selection button 110 to the integer value counter 142 and triggers the CPU 108 to start the elapsed time counter 140. A swipe gesture greater than some predetermined distance before the elapsed time counter expires causes the addition or subtraction operation by the integer value counter 142 to occur. The math operation is determined by the direction 374 of the swipe. The value added or subtracted is determined by the assigned integer value 226 of the button neighboring the pressed selection button in the direction
corresponding to the direction 374 of the swipe gesture.
Interpretation of a swipe gesture indicates to the processor that the type of button press is SWIPE 357. The direction 374 determines if the type is SWIPE-OUT or SWIPE-IN. Optionally, as part of either step 550 or 552, the CPU 108 may also terminate the elapsed time counter 140 before expiration once the BPT is known to be SWIPE 357. In a subsequent step 522 the user views the specified character on the display 104, which is an optional step and in an alternative embodiment is bypassed.
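The decision sequence of method 504 (steps 516, 536 and 548) can be sketched as a planning function that maps a target menu offset to a keystroke. The button values are the four-button embodiment; the function name, return encoding and the adjacency used for swipe pairs are assumptions for illustration.

```python
# Hypothetical sketch of the user's decision steps in Figure 10: given the
# offset of the selected character, choose a SHORT press, a LONG press, or a
# press-plus-swipe toward an adjacent button.

BUTTON_VALUES = [-3, -2, +2, +3]
ADJACENT = list(zip(BUTTON_VALUES, BUTTON_VALUES[1:]))  # neighboring pairs

def plan_keystroke(target):
    if target in BUTTON_VALUES:                    # step 516 -> step 538
        return ('SHORT', target)
    if target % 2 == 0 and target // 2 in BUTTON_VALUES:
        return ('LONG', target // 2)               # step 536 -> step 540
    for a, b in ADJACENT:                          # step 548 -> 550 / 552
        if a + b == target:
            return ('SWIPE-SUM', a, b)             # press one, swipe to other
        if a - b == target or b - a == target:
            return ('SWIPE-DIFF', a, b)
    return None                                    # not reachable this way
```

For instance, an offset of -3 calls for a SHORT press, -4 for a LONG press on the -3 button's neighbor -2, and +5 for a press and swipe spanning the +2 and +3 buttons.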
According to another embodiment of the invention, the character specification method 504 described above is used iteratively to specify series of characters from the character menu 240. In one embodiment, words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
Figure 11 shows a flowchart of an embodiment of a method 611 for the processor 108 of an electronic device to interpret button presses. In one step 610 of the method 611, the CPU 108 initializes the integer value counter 142 to zero. In another step 612 the CPU 108 initializes the elapsed time counter 140 to zero. In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a first selection button press occurs, in another step 616, the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the first pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140.
In a trio of steps 620, 684, 622, the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with the selectable-length elapsed time period.
If the elapsed time counter 140 exceeds the duration of the elapsed time period (i.e., expires) before an additional selection button press occurs, in a subsequent step 640 the CPU 108 determines if the first button is still pressed.
If the first button is no longer pressed when the elapsed time period expires, then in a subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, the first button is still pressed when the elapsed time period expires, then in an alternate subsequent step 642 the processor multiplies the value of the integer value counter 142 by two before commencing the subsequent step 624, where the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, in steps 620 and 622 a second selection button press occurs before the elapsed time counter 140 expires, in another step 626 the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the second pressed selection button. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, in steps 684 and 622 a swipe gesture occurs before the elapsed time counter 140 expires, in a step 686, the swipe gesture interpreter 144 determines the direction 374 of the swipe gesture. If in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
According to one embodiment of the method 611, the CPU 108 reinitializes the integer value counter 142 and the elapsed time counter 140 to zero and repeats the method. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
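The five paths of method 611 can be condensed into a short sketch. The following is an illustrative sketch only: the function signature and event names are assumptions made for this example, and a real implementation would be event-driven as the flowchart shows.

```python
# Sketch of the decision logic of method 611 (Figure 11); illustrative only.

def interpret_gesture(first_value, event, second_value=None,
                      still_pressed=False, swipe_direction=None):
    """Return the menu position 242 selected by one input gesture.

    first_value    -- assigned value 222 of the first pressed button
    event          -- what happened next: 'timeout' (elapsed time period
                      expired), 'second_press', or 'swipe'
    second_value   -- assigned value of the second pressed button (pair
                      press) or of the neighboring button (swipe)
    still_pressed  -- True if the first button is still held at timeout
    swipe_direction-- 'outward' or 'inward', for swipes
    """
    counter = first_value                    # steps 610 and 616
    if event == 'timeout':
        if still_pressed:                    # step 640 -> Path 2 (long BPT)
            counter *= 2                     # step 642
        # else Path 1 (short BPT): counter is unchanged
    elif event == 'second_press':            # Path 3 (pair BPT)
        counter += second_value              # step 626
    elif event == 'swipe':
        if swipe_direction == 'outward':     # Path 4 (swipe-out BPT)
            counter += second_value          # step 688
        else:                                # Path 5 (swipe-in BPT)
            counter -= second_value          # step 690
    return counter                           # step 624: menu position 242
```

For example, a press of the button assigned the value -3 held past the elapsed time period yields position -6, while the same press followed by a press of the +2 button before the period expires yields position -1.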
In alternative embodiments, math operations other than addition, subtraction and multiplication-by-two are used in steps 626, 642, 688 and 690 to identify characters by their numerical position in a menu, array or table. Furthermore, in alternative embodiments the particular button neighboring the pressed selection button is in an alternative direction from the embodiment disclosed or is identified by an alternative input than the direction of the swipe gesture. Although the method 611 of Figure 11 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims.
Figure 12 shows the user interface 150 of Figure 2, a table 185 of value assignments for variables of the method 611 of Figure 11, and a list 186 of input variables for the method 611. The user interface 150, table 185, and list 186 are examples used to demonstrate the embodiments of Figures 10 and 11. The scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
The table 185 is divided into rows and columns. Rows are grouped by path as identified in Figure 11. Each column is one variable: the variable 'input gesture' 210, the variable 'duration' 208, the variable 'swipe direction' 374, the variable 'button press type' 224, the variable 'button press values' 222, the variable 'math operator' 181, the variable 'total button press value' 228 and the variable 'character' 200.
Values for the variable 'button press type (BPT)' 224 conform with the path, as defined by the method 611 of Figure 11: Path 1 is the short BPT 340, Path 2 is the long BPT 345, Path 3 is the pair BPT 350, Path 4 is the swipe-out BPT 358 and Path 5 is the swipe-in BPT 359.
Values for the variables 'input gesture' 210, 'duration' 208 and
'direction' 374 align with the values for 'BPT' 224 as dictated by the method 611 of Figure 11: when 'gesture' is 'single press' and 'duration' is '< ETP', then the BPT is short; when 'gesture' is 'single press' and 'duration' is '> ETP', then the BPT is long; when 'gesture' is 'pair press', then the BPT is pair; when 'gesture' is 'swipe' and 'direction' is 'outward', then the BPT is 'swipe out'; and when 'gesture' is 'swipe' and 'direction' is 'inward', then the BPT is 'swipe in'.
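The mapping just described, from the input variables to the BPT, can be written as a small classification function. This is a sketch: the string labels follow the table 185, but the function signature is an assumption of this example.

```python
def classify_bpt(gesture, duration=None, etp=None, direction=None):
    """Classify an input gesture into a button press type (BPT).

    gesture   -- 'single press', 'pair press', or 'swipe'
    duration  -- press duration, compared against the elapsed time
                 period (etp) for single presses
    direction -- 'outward' or 'inward', for swipes
    """
    if gesture == 'single press':
        return 'short' if duration < etp else 'long'
    if gesture == 'pair press':
        return 'pair'
    if gesture == 'swipe':
        return 'swipe-out' if direction == 'outward' else 'swipe-in'
    raise ValueError('unknown gesture: %r' % gesture)
```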
Values for the variable 'button press values' 222 are the assigned button press values 222 (or possible combinations of the assigned values 222) of the user interface 150. For the short and long BPTs, the button press values 222 are single values. For the pair, swipe-out and swipe-in BPTs, the button press values 222 are possible combinations of two of the values 222. The variable 'button press value' 222 identifies for the processor 108 the particular selection button 110 pressed, along with the second button pressed if the BPT is 'pair', or the value identified by the direction of the swipe gesture if a swipe gesture occurs. For the user interface 150 of Figure 2, values for the variable 'button press values' 222 are -3, -2, +2, +3, and possible combinations of any two of those values. In alternative embodiments, other values for the variable 'button press values' 222 are possible.
Values for the variable 'math operator' 181 are the operators assigned to each specific BPT 224. For the 'short' BPT, the operator is '='. For the 'long' BPT, the operator is '×2'. For the 'pair' BPT, the operator is '+'. For the 'swipe-out' BPT, the operator is '+'. For the 'swipe-in' BPT, the operator is '-'.
Values for the variable 'total button press value' 228 are the values that result from the mathematical operations of steps 624, 626, 642, 688 and 690 of the method 611 of Figure 11. As the method of Figure 11 shows, the path taken determines which math operation becomes implemented. As a result, values of the variable 'total button press value' 228 depend on both the variables 'button press values' 222 and 'button press type' 224. For the short BPT 340, the 'total button press value' 228 equals the 'button press value' 222. For the long BPT 345, the 'total button press value' 228 equals two times the 'button press value' 222. For the pair BPT 350, the 'total button press value' 228 equals the sum of the 'button press values' 222. For the swipe-out BPT, the 'total button press value' 228 equals the sum of the 'button press values' 222. For the swipe-in BPT, the 'total button press value' 228 equals the difference between the 'button press values' 222.
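The five operations just listed can be tabulated directly. The dictionary below is a sketch whose keys mirror the BPT labels of the table 185; one value is supplied for the short and long BPTs, two for the others.

```python
# Total button press value 228 as a function of BPT 224 and the
# button press values 222 of the gesture.
TOTAL_VALUE = {
    'short':     lambda a, b=None: a,        # '='  : the value itself
    'long':      lambda a, b=None: a * 2,    # 'x2' : the value doubled
    'pair':      lambda a, b: a + b,         # '+'  : sum of the two values
    'swipe-out': lambda a, b: a + b,         # '+'  : add neighbor's value
    'swipe-in':  lambda a, b: a - b,         # '-'  : subtract neighbor's value
}
```

For example, `TOTAL_VALUE['long'](-3)` evaluates to -6 and `TOTAL_VALUE['pair'](-3, 2)` evaluates to -1.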
Values for the variable 'character' 200 are the characters available in the menu 240 of the user interface 150. Each character is identified by its position 242 in the menu 240, as previously disclosed in U.S. Patent No. 8,487,877. As described in step 624 of the method 611 of Figure 11, the processor 108 interprets characters 200 of the menu 240 by their position 242. Values for the variable 'total button press value' 228 identify that menu position.
The list 186 shows explicitly which variables of the method 611 of Figure 11 are input variables. Input variables are variables that require input from a user. The input variables are: (1) 'button press values' 222, (2) 'input gesture' 210,
(3) 'duration' 208, and (4) 'direction' 374. The remaining variables of the table 185 ('button press type' 224, 'math operator' 181, 'total button press value' 228, and
'character' 200) all follow as a result of the input variables and the user interface 150.
Figure 13 shows a first flowchart of variables and a second flowchart of example values for each variable of the method 611 of Figure 11 and the user interface 150 of Figure 2.
The first flowchart shows the four input variables for the method of
Figure 11: (1) 'button press values' 222, (2) 'input gesture' 210, (3) 'duration' 208 and
(4) 'direction' 374. Next in the flowchart, the variables 'input gesture' 210, 'duration' 208 and 'direction' 374 together determine the 'button press type' 224, as disclosed by the method 611 of Figure 11. Next, the variables 'button press values' 222 and 'button press type' 224 together determine the 'total button press value' 228, which follows step 624, 626, 642, 688 or 690 of the method 611 of Figure 11. Finally, the variable 'total button press value' 228 determines the 'character' 200 which is based on the menu 240 of the user interface 150 of Figure 2.
The second flowchart shows example values for each variable of the method 611 of Figure 11 and the embodiment of the user interface 150 of Figure 2. The variable 'button press values' 222 has the values '-3, -2, +2 or +3' 223 (or combinations of them). The variable 'input gesture' 210 has the values 'single', 'pair' or 'swipe' 211. The variable 'duration' 208 has the values '< ETP' or '> ETP' 209. The variable 'swipe direction' 374 has the values 'inward' or 'outward' 375. The variable 'button press type' 224 has the values 'short', 'long', 'pair', 'swipe-out' or 'swipe-in' 225. The variable 'total button press value' 228 has the values '-6, -5, -4, -3, -2, -1, 0, +1, +2, +3, +4, +5, or +6' 229. The variable 'character' 200 has the values 'a, b, c, d, e, f, g, h, i, j, k, l, or m' 201. The values of the second flowchart are examples used to demonstrate the embodiments of Figures 2 and 11. The scope of the invention is not limited by the variables and particular values shown here, but rather by the scope of the claims.
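With these example values, the final lookup from total button press value 228 to character 200 reduces to indexing into the menu. The sketch below assumes the letter-to-position assignment implied by the examples of Figures 14 and 15, where 'a' sits at position -6 and position 0 is the menu reference 258.

```python
MENU = 'abcdefghijklm'          # 13 characters at menu positions -6 .. +6

def character_at(total_value):
    # Position 0 is the menu reference 258; offsets run from -6 to +6.
    return MENU[total_value + 6]
```

So a total button press value of -1 selects 'f', and a total of -6 selects 'a'.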
Figures 14 and 15 show examples of how characters of a word 130 derive from the variables of the method 611 of Figure 11 and the user interface 150.
For the example of Figure 14, the word 130 is 'fad'. Rows of a table show values for each of the variables 'character' 200, 'menu position' 242, 'button press values' 222 and 'button press type' 224. Values for the variable 'character' 200 derive directly from the word 130. Values for the variable 'menu position' 242 derive from the position of each character 200 in the menu 240 according to the user interface 150.
Values for the variable 'button press values' 222 derive from the values for 'menu position' 242 and steps of the method 611 of Figure 11 taken in reverse. In other words, for example, in order for the method 611 of Figure 11 and the user interface 150 of Figure 2 to interpret the 'menu position' as -3, thereby leading to the character 'd', the button press value can only be -3. For the assigned button press values 222 available in the user interface 150 of Figure 2, no combination of one or two button press values 222 except -3 can produce the value -3. As another example, to interpret the 'menu position' as -1, thereby leading to the character 'f', the math operation can be either (-3) - (-2) or (-3) + 2. For the assigned button press values 222 available in the user interface 150 of Figure 2, either a 'pair' BPT can be executed to specify the '+' operator or a 'swipe-in' BPT to specify the '-' operator. The 'swipe-out' BPT is not a possibility because, although the gesture does select the '+' operator, the integer values required to make -1 are not assigned to neighboring selection buttons. To interpret the 'menu position' as -6, thereby leading to the character 'a', the '×2' operator and the assigned integer value -3 are required. That requirement dictates a 'long' BPT selection of the selection button with the assigned integer value -3 to select 'a'.
Values for the variable 'button press type' 224 derive from the values for
'menu position' 242 in the same way. In order for the method 611 of Figure 11 and the user interface 150 of Figure 2 to interpret the 'menu position' 242 as -3, the method must follow the first path 644. Review of Figure 11 confirms this, because none of the mathematical operations along the other paths of method 611 allow an outcome of -3 for the assigned selection button values 222 that are available for the user interface 150. With the restriction that Path 1 of the method 611 must be the one followed, the 'button press type' 224 must be the short BPT 340. Similar logic holds true for the 'menu position' -6. For the 'menu position' -1, either path 3 or path 5 could be followed, but no others.
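This reverse derivation can be automated. The sketch below enumerates every gesture that reaches a given menu position. It assumes a left-to-right button layout of -3, -2, +2, +3, with 'inward' neighbors pointing toward the central menu and 'outward' neighbors away from it; the layout and neighbor assignments are assumptions made for illustration, not taken from the figures.

```python
BUTTONS = (-3, -2, 2, 3)
# Assumed neighbor assignments: edge buttons have no outward neighbor.
INWARD  = {-3: -2, -2: 2, 2: -2, 3: 2}
OUTWARD = {-2: -3, 2: 3}

def ways_to_reach(position):
    """List every (BPT, button value(s)) pair that yields a menu position."""
    ways = []
    for v in BUTTONS:
        if v == position:
            ways.append(('short', v))                 # '=' operator
        if 2 * v == position:
            ways.append(('long', v))                  # 'x2' operator
        for w in BUTTONS:
            if w > v and v + w == position:           # unordered pair press
                ways.append(('pair', (v, w)))
        if v in OUTWARD and v + OUTWARD[v] == position:
            ways.append(('swipe-out', v))             # add outward neighbor
        if v - INWARD[v] == position:
            ways.append(('swipe-in', v))              # subtract inward neighbor
    return ways
```

Under these assumptions, `ways_to_reach(-3)` returns only the short press of -3, `ways_to_reach(-6)` only the long press of -3, and `ways_to_reach(-1)` the pair (-3, +2) and the inward swipe on -3, matching the derivations for 'd', 'a' and 'f' above.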
The examples above hint at three facts true of the method 611 of Figure 11 and the user interface 150 of Figure 2 that make the method and interface useful:
1) every menu position 242 is accessible by at least one combination of the variables 'button press values' 222 and 'button press types' 224,
2) every combination of the variables 'button press values' 222 and 'button press types' 224 leads to at least one menu position 242, and
3) every combination of the variables 'button press values' 222 and
'button press types' 224 leads to no more than one menu position 242.
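These three facts can be checked mechanically by enumerating every gesture combination in the forward direction. The sketch below uses the same assumed layout and neighbor assignments noted earlier (buttons -3, -2, +2, +3 left to right; these are illustrative assumptions, not taken from the figures).

```python
BUTTONS = (-3, -2, 2, 3)
INWARD  = {-3: -2, -2: 2, 2: -2, 3: 2}   # assumed neighbor toward the menu
OUTWARD = {-2: -3, 2: 3}                 # assumed neighbor away from it

totals = set()
for v in BUTTONS:
    totals.add(v)                        # short BPT
    totals.add(2 * v)                    # long BPT
    for w in BUTTONS:
        if w > v:
            totals.add(v + w)            # pair BPT
    if v in OUTWARD:
        totals.add(v + OUTWARD[v])       # swipe-out BPT
    totals.add(v - INWARD[v])            # swipe-in BPT

# Fact 1: every menu position -6 .. +6 is reachable; and because each
# combination's total is a single deterministic value inside that range,
# facts 2 and 3 hold as well.
assert totals == set(range(-6, 7))
```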
Figure 14 also shows a variable 'button press type sequence' 382 and a variable 'total number of button presses' 384. For the word 'fad', the button press type sequence 382 is 'swipe-in - long - short'. Based on the number of button press types 224 and the number of button press values 222 per button press type 224, the total number of button presses 384 for the word 'fad' is three if a swipe gesture is used to select 'f' (or four if a pair button press is used to select 'f').
For the example of Figure 15, the word 130 is 'leg'. Rows of a table show values for each of the variables 'character' 200, 'menu position' 242, 'button press values' 222 and 'button press type' 224. Individual values for each of the variables are derived just as explained for the example of Figure 14. For the word 'leg', the button press type sequence 382 is 'swipe-out - short - short'. Based on the number of button press types 224 and the number of button press values 222 per button press type 224, the total number of button presses 384 for the word 'leg' is three if a swipe gesture is used to select 'l' (or four if a pair button press is used to select 'l').
Figure 16 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1. The device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
The electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100. Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part of, or within proximity to, the menu 240 are the reference 258 and the offset scale 260. The display 104, and specifically the menu 240, the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1. The CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
In the embodiment of Figure 16, the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-4, -3, -2, +2, +3, +4'. In a further embodiment, the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
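That six buttons can cover 17 menu positions can be checked by enumerating the reachable totals. The sketch below assumes the 17 positions are indexed -8 to +8 around the reference; under that assumption the short, long and pair BPTs alone already reach every position, with swipe BPTs adding further ways to reach totals in the same range.

```python
from itertools import combinations

VALUES = (-4, -3, -2, 2, 3, 4)           # assigned button press values 222

totals = set(VALUES)                                   # short BPT: v
totals |= {2 * v for v in VALUES}                      # long BPT: 2v
totals |= {a + b for a, b in combinations(VALUES, 2)}  # pair BPT: a + b

assert totals == set(range(-8, 9))   # all 17 menu positions are covered
```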
Figure 17 shows an embodiment of the table 185 of value assignments for variables of the method 611 of Figure 11 for the embodiment of the user interface 150 of Figure 16.
Figure 18 shows graphical representations of another example of the swipe BPT 357 shown in Figure 3 and an example alternative swipe BPT 362. As in Figure 3, each example button press type 224 of Figure 18 has a first and second horizontal bar 326, 331. The first horizontal bar 326 represents the passage of time. The second horizontal bar 331 represents the distance (or length) of a positional translation.
For the first bar 326, the black region 327 within the bar indicates a time period when a button is pressed. The white region 328 indicates a time period when a button is not pressed. The solid vertical marker 329 indicates the beginning or end of an elapsed time period (ETP) metric 330. The elapsed time period 330 is of selectable length and commences with the onset of a button press. The length of the black region 327 indicates the duration 208 of the button press. To classify a button press, the duration of the button press is compared with the elapsed time period metric 330.
For the second bar 331, the black region 327 within the bar indicates a distance (or length) 372 of a positional translation of a button press along the touch sensitive screen 104. In a further embodiment, the positional translation is measured starting from the position of the button press at the onset of the press. The white region 328 within the bar indicates a distance from the starting position that the positional translation of the button press has not reached. The solid vertical marker 329 indicates the beginning or end of the swipe distance metric 373. The swipe distance metric 373 is of selectable length and commences at the position of the button press on the screen 104 at the button press' onset. The distance (or length) 372 of a positional translation of a button press is compared with the swipe distance metric 373. A difference between the swipe BPT 357 and the alternative swipe BPT 362 is that for the swipe BPT 357 the button press duration 208 does not exceed the elapsed time period metric 330, whereas for the alternative swipe BPT 362 the duration does exceed the elapsed time period metric. For the alternative swipe BPT 362, the fact that both the duration and the swipe distance exceed their respective metrics leads to a potential ambiguity in the determination of the BPT 224, when the gesture is interpreted according to the method 611 of Figure 11.
In Figure 11, note the loop through the steps 620, 622 and 684 of the method 611. If the swipe distance 372 exceeds the swipe distance metric 373 before the duration 208 exceeds the elapsed time period metric 330, then a 'Yes' is concluded in step 684 and the button press is interpreted as the swipe BPT 357 (Path 4 or 5). On the other hand, if the button press duration 208 exceeds the elapsed time period metric 330 before the swipe distance 372 exceeds the swipe distance metric 373, then a 'Yes' is concluded in step 622 and the button press is interpreted as the long BPT 345 (Path 2). Which of these possibilities occurs first depends on the value selected for the elapsed time period metric 330, the value selected for the swipe distance metric 373, and the rate at which the user executes the positional translation. For the example alternative BPT 362 of Figure 18, the elapsed time period metric is 0.1 seconds and the swipe distance metric is 20 pixels. If the rate of the positional translation is less than 20 pixels per 0.1 seconds, then the button press is interpreted as the long BPT 345. If the rate of the positional translation is greater than 20 pixels per 0.1 seconds, then the button press is interpreted as the swipe BPT 357.
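The race between the two thresholds reduces to comparing the user's translation rate with the ratio of the two metrics. A sketch using the example values of Figure 18 (0.1 seconds and 20 pixels); the function name and signature are assumptions of this example:

```python
def classify_held_moving_press(rate_px_per_s, etp_s=0.1, swipe_px=20):
    """Decide which threshold a pressed, moving finger crosses first.

    If the swipe distance metric is covered before the elapsed time
    period expires, the press is a swipe BPT (Path 4 or 5); otherwise
    the elapsed time period expires first and it is a long BPT (Path 2).
    """
    seconds_to_cover_swipe = swipe_px / rate_px_per_s
    return 'swipe' if seconds_to_cover_swipe < etp_s else 'long'
```

With these example values the break-even rate is 200 pixels per second: slower positional translations read as long presses, faster ones as swipes.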
Figure 19 is a flowchart that shows an alternate method 613 to the method 611 of Figure 11 for the processor 108 of an electronic device to interpret button presses. Figure 20 is a flowchart that shows a method 700 to change an interpreted button press type 224 from the long BPT 345 to the swipe BPT 357 for the case where the swipe distance 372 does not exceed the swipe distance metric 373 until after the elapsed time period 330 expires. The method 700 of Figure 20 uses the alternate method 613 of Figure 19 to enable that correction.
One difference between the alternate method 613 of Figure 19 and the method 611 of Figure 11 is the point in the sequence at which the step 610, which initializes the BP value counter to zero, occurs. In the alternate method 613, the step 610 follows the step 614 that determines if a first selection button is pressed. Furthermore, the step 610 falls ahead of the step 616 that adds to the BP value counter the button press value of the pressed selection button. Another difference between the alternate method 613 of Figure 19 and the method 611 of Figure 11 is that the step 624, which identifies the character of the menu according to the input gesture received, leads directly to the step 612 that initializes the elapsed time counter to zero.
A step 699 is included in the alternate method 613 that is not included in the method 611 of Figure 11. The step 699 is a conglomeration of the steps 610, 616, 618, 620, 622, 624, 626, 640, 642, 684, 686, 688 and 690. The step 699 interprets as input the character of the menu whose position corresponds to the input gestures received.
Figure 19 shows a flowchart of an embodiment of a method 613 for the processor 108 of an electronic device to interpret button presses. In one step 612 of the method the CPU 108 initializes the elapsed time counter 140 to zero. In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button press occurs, in another step 610 of the method 613, the CPU 108 initializes the integer value counter 142 to zero. In another step 616, the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the first pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140.
In a trio of steps 620, 684, 622, the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with a user-chosen, selectable-length time period.
If the elapsed time counter 140 exceeds the duration of the elapsed time period (i.e., expires) before an additional selection button press occurs, in a subsequent step 640 the CPU 108 determines if the first button press is still pressed.
If the first button press is not still pressed when the elapsed time period expires, then in a subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, the first button press is still pressed when the elapsed time period expires, then in an alternate subsequent step 642 the processor multiplies the value of the integer value counter 142 by two before commencing the subsequent step 624, where the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, in steps 620 and 622 a second selection button press occurs before the elapsed time counter 140 expires, in another step 626 the CPU 108 adds to the integer value counter 142 a value equal to the assigned value 222 of the second pressed selection button. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If, however, in steps 684 and 622 a swipe gesture occurs before the elapsed time counter 140 expires, in a step 686, the swipe gesture interpreter 144 determines the direction 374 of the swipe gesture.
If in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
If in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the integer value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 624 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the value of the integer value counter 142.
According to one embodiment of the method 613, the CPU 108 re-initializes the elapsed time counter 140 to zero and repeats the method. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104. According to another embodiment, the steps 610, 616, 618, 620, 622, 624, 626, 640, 642, 684, 686, 688 and 690 together make up the step 699, which interprets as input the character of the menu whose position corresponds to the input gestures received.
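The reordering matters because it changes which state survives between character selection cycles. A minimal sketch of one cycle of method 613 follows; the function arguments are stand-ins for the device's event sources and are assumptions of this sketch.

```python
def method_613_cycle(wait_for_press, run_step_699, value_counter):
    """One character-selection cycle of method 613 (illustrative).

    The BP value counter is reset only *after* a first press is detected
    (step 610 follows step 614), so its final value from the previous
    cycle remains available between cycles -- the property that the
    correction method 700 of Figure 20 relies on.
    """
    elapsed_time = 0                    # step 612: reset elapsed time counter
    press_value = wait_for_press()      # step 614: first selection button press
    value_counter = 0                   # step 610: only now reset the counter
    value_counter += press_value        # step 616: add the assigned value 222
    return run_step_699(value_counter)  # step 699: interpret the gesture
```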
In alternative embodiments, math operations other than addition, subtraction and multiplication-by-two are used in steps 626, 642, 688 and 690 to identify characters by their numerical position in a menu, array or table. Furthermore, in alternative embodiments the particular button neighboring the pressed selection button is in an alternative direction from the embodiment disclosed or is identified by an alternative input than the direction of the swipe gesture. Although the method 613 of Figure 19 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims.
Figure 20 is a flowchart that shows a method 700 that changes a button press type initially interpreted as the long BPT 345 to the swipe BPT 357 for the case where the swipe distance 372 exceeds the swipe distance metric 373 after the elapsed time period 330 expires. In one step 612 of the method the CPU 108 initializes the elapsed time counter 140 to zero. In another step 614, the CPU 108 monitors the selection buttons 110 for a first pressed selection button. A first pressed selection button is a button press with an onset after the step 612 in the currently executed cycle of steps 612, 614 and 699. In contrast, an ongoing button press is a button press with an onset before the step 612 of the current selection cycle. An ongoing button press is a button press initiated in the previous character selection cycle that is still underway.
If the CPU finds no first pressed selection button, in a next step 710, the CPU 108 determines whether the swipe distance 372 of an ongoing button press (if any) exceeds the swipe distance metric 373. Note that in the step 710 the button press could only have been an ongoing button press, because the determination in the step 614 was that a first selection button press was not found.
If the CPU finds no button press with swipe distance 372 that exceeds the swipe distance metric 373 (i.e., if there is an ongoing button press for which the swipe distance 372 does not exceed the distance metric 373, or there is no ongoing button press at all), then in the next step 614 the CPU rechecks whether a first selection button is pressed. Alternatively, if the CPU finds that a button press is underway (i.e., carried over from the previous character selection cycle) and the swipe distance 372 of the ongoing button press exceeds the distance metric, then in a next step 712 the CPU 108 divides the value of the BP value counter by 2.
In the next step 686, the swipe gesture interpreter 144 determines the direction 374 of the swipe gesture.
If in step 686 the interpreter 144 determines that the swipe is outward, in a step 688 the CPU 108 adds to the BP value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in a subsequent step 714 the CPU 108 replaces the last interpreted character with the character 200 of the menu 240 whose position 242 equals the value of the BP value counter 142.
If in step 686 the interpreter 144 determines that the swipe is inward, in a step 690 the CPU 108 subtracts from the BP value counter the assigned value of the button neighboring the pressed selection button on the side corresponding to the direction of the swipe gesture. Then, in the subsequent step 714 the CPU 108 replaces the last interpreted character with the character 200 of the menu 240 whose position 242 equals the value of the BP value counter 142. In the next step 614 the CPU re-checks whether a first selection button is pressed. Once a new selection button is received, then in the next step 699 the CPU 108 interprets as input the character of the menu whose position corresponds to the input gestures received, as described in the method 613 of Figure 19.
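The correction that method 700 applies can be sketched as a pure function on the BP value counter. This is an illustrative sketch; it assumes the counter holds an even value, since the counter was doubled in step 642 of the preceding cycle.

```python
def correct_long_to_swipe(value_counter, neighbor_value, direction):
    """Re-interpret a long BPT as a swipe BPT after the fact (method 700).

    Called when the swipe distance metric is exceeded only after the
    elapsed time period has expired, so step 642 already doubled the
    counter and a character was already interpreted from the doubled value.
    """
    value_counter //= 2                   # step 712: undo the doubling of 642
    if direction == 'outward':
        value_counter += neighbor_value   # step 688: add neighbor's value
    else:
        value_counter -= neighbor_value   # step 690: subtract neighbor's value
    return value_counter                  # step 714: position of the
                                          # replacement character
```

For example, a held press of the -3 button first reads as position -6 ('a' in the menu of Figure 2); if the finger then crosses the swipe distance metric moving inward toward the -2 button, the correction yields (-6 / 2) - (-2) = -1, and the displayed 'a' is replaced by 'f'.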
Figure 21 is a flowchart that shows another method 720 that interprets button press and swipe input gestures to select a character from a menu. In one step 612 of the method the CPU 108 initializes the elapsed time counter 140 to zero. In another step 614, the CPU 108 monitors the selection buttons 110 for a first pressed selection button. A 'first' pressed selection button is a button press with an onset after the step 612 in the currently executed cycle of steps 612, 614 and 699. In contrast, an 'ongoing' button press is a button press with an onset before the step 612 of the current selection cycle. In other words, an 'ongoing' button press is a button press initiated in the previous character selection cycle that is still underway.
If the CPU finds no first pressed selection button, in a next step 710, the CPU 108 determines whether the swipe distance 372 of an ongoing button press (if any) exceeds the swipe distance metric 373. Note that in the step 710 the button press could only have been an ongoing button press, because the determination in the step 614 was that a first selection button press did not occur.
If the CPU finds no button press with swipe distance 372 that exceeds the swipe distance metric 373 (i.e., the CPU finds an ongoing button press but the swipe distance 372 does not exceed the distance metric 373, or finds no ongoing button press at all), then in the next step 614 the CPU rechecks whether a first selection button is pressed. Alternatively, if the CPU finds an ongoing button press (i.e., a button press still underway from the previous character selection cycle) and the swipe distance 372 of the ongoing button press exceeds the distance metric, then in a next step 730 the CPU 108 replaces the last interpreted character with the character 200 of a menu 240 whose position 242 corresponds to the swipe gesture. In the embodiment of Figure 21, the correspondence is based on any spatial relationship between the selection button activated, the swipe gesture interpreted and the position of the character in the menu, not necessarily on a correspondence with a button press value of the pressed selection button.
In the next step 614 the CPU rechecks whether a first selection button is pressed. Once a first selection button press occurs, in another step 618, the CPU 108 starts the elapsed time counter 140.
In a trio of steps 620, 684, 622, the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press or a swipe gesture while comparing the elapsed time counter 140 with a user-chosen, selectable-length time period.
If the elapsed time counter 140 exceeds the duration of the elapsed time period (i.e., expires) before an additional selection button press occurs, in a subsequent step 640 the CPU 108 determines if the first button press is still pressed.
Depending on the outcome of the decision steps 620, 622, 640 and 684, in a step 735 the CPU 108 interprets as input the character 200 of a menu 240 whose position 242 corresponds to the input gesture received. In the embodiment of Figure 21, the correspondence is based on any spatial relationship between the selection button (or buttons) pressed, the gesture interpreted and the position of the character in the menu, not necessarily on a correspondence with a button press value of the pressed selection button.
According to one embodiment of the method 720, the CPU 108 re-initializes the elapsed time counter 140 to zero, which initiates a subsequent character selection cycle. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
Figure 22 shows a flowchart of an embodiment of a method 505 for a user to specify a character from among a plurality of characters. In one step 510 of the method 505, a user views the characters 200 displayed in the menu 240. In another step 512, the user selects a character from the menu 240 for input to the electronic device 100. In another step 514, the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258. The user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, referencing the position to a corresponding mark or position indicator of known offset, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
In another step 516, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
If so, in another step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires. The aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press.
In one embodiment, in a subsequent step 520, the user waits for the elapsed time counter 140 to expire before the character selection cycle progresses. In an alternative embodiment, the user bypasses step 520 because the character selection cycle ends once the user releases the selection button. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
However, if the value that identifies the selected character's position 242 in the menu 240 is not equal to the assigned value of any selection button, then in an alternate step 536, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals twice the assigned button press value 222 of any selection button 110.
If so, in another step 540 the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires. The aforementioned step 540 inputs the assigned value 222 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a LONG press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
However, if none of the values 222 assigned to the selection buttons 110 equals the selected character's position 242 or half the selected character's position, then in an alternate step 551 the user determines if the value that identifies the position of the selected character equals the sum of two consecutive selection button values. For example, a user could identify the fifth menu position as the sum of the values of the selection buttons '2' and '3', i.e., 2 + 3 = 5. In another example, a user could identify the first menu position as the sum of the values of the selection buttons '0' and '1', i.e., 0 + 1 = 1.
If so, in another step 553 the user presses the selection button 110 with the assigned value 222 that is the smaller of the two consecutive selection button values and then swipes outward before the elapsed time counter expires. The aforementioned step 553 inputs the sum of the assigned values 222 of the consecutive selection buttons to the CPU 108, on initiation of the swipe triggers the CPU 108 to start the elapsed time counter 140, upon recognition of the swipe gesture triggers the CPU to end the elapsed time counter, and indicates to the processor that the type of button press is a SWIPE press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
In an alternative step 555, the user determines if the value that identifies the position of the selected character equals twice the value of any selection button value, plus one. For example, a user could identify the fifth menu position as twice the value of selection button '2', plus one, i.e., (2 * 2) + 1 = 5. In another example, a user could identify the first menu position as twice the value of selection button '0' plus one, i.e., (2 * 0) + 1 = 1. The user may recognize the '2x + 1' relationship before any button press is made. Alternatively, they may recognize the relationship once a LONG press selection is pending. In other words, a cursor may tentatively identify the menu position that is twice the value of the pressed selection button; then the user may recognize that the cursor must advance one additional position along the menu to indicate the position of the character they identified in the step 514.
If a user makes the determination of the step 555, then in a step 557 the user presses the selection button 110 with the assigned value 222 that equals half the value that is one less than the selected character's position, maintains the button press until the elapsed time counter expires, and then swipes outward. The aforementioned step inputs to the CPU 108 a value equal to twice the assigned value 222 of the pressed selection button plus one, on initiation of the swipe triggers the CPU 108 to start the elapsed time counter 140, on recognition of the swipe gesture triggers the CPU to end the elapsed time counter, and indicates to the processor that the type of button press is a SWIPE press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
Note that the determination of the step 555 identifies the same position identified in the step 551. So the steps 551 and 555 identify a character in the same position of the menu, but offer two alternative ways for a user to recognize how to select it. Whichever of the steps 551 and 555 the user chooses to follow, the character that gets selected is the same.
In the event that none of the determinations 516, 536, 551, 555 leads to the character selected in the step 512, then the user steps through the determination steps again.
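The determination steps 516, 536, 551 and 555 above can be sketched as a short routine. This is only an illustrative sketch, not the claimed method; the function name, the list representation of the selection buttons, and the gesture labels are assumptions introduced here.

```python
def find_gesture(position, button_values):
    """Return (assigned button value, gesture) pairs that reach `position`."""
    gestures = []
    for v in button_values:
        if position == v:                      # step 516 -> SHORT press (538)
            gestures.append((v, "SHORT"))
        if position == 2 * v:                  # step 536 -> LONG press (540)
            gestures.append((v, "LONG"))
        if position == v + (v + 1):            # step 551 -> swipe before ETP (553)
            gestures.append((v, "SWIPE before ETP"))
        if position == 2 * v + 1:              # step 555 -> swipe after ETP (557)
            gestures.append((v, "SWIPE after ETP"))
    return gestures
```

Note that the conditions of the steps 551 and 555 always fire together, since v + (v + 1) = 2v + 1, mirroring the observation that both determinations identify the same menu position.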
According to another embodiment of the invention, the character specification method 505 described above is used iteratively to specify series of characters from the character menu 240. In one embodiment, words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
Figure 23 shows a flowchart of an embodiment of a method 740 for the processor 108 of an electronic device to interpret button presses and swipes. In one step 742 of the method 740, the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero. In another step 744 the CPU initializes a variable 'button press type' (BPT) to an empty string. In another step 612 the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. In another step 746 the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140.
In a pair of steps 622, 684 the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture while comparing the elapsed time (ET) with the selected duration of the elapsed time period (ETP).
If in the step 684 the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires, in a subsequent step 750, the CPU adds to the variable BPV a value equal to BPV + 1. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the elapsed time exceeds the duration of the elapsed time period (i.e., expires) before a swipe gesture occurs, in a subsequent step 640 the CPU 108 determines if the button is still pressed.
If the button is not still pressed, then in a subsequent step 752 the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, however, the button is still pressed when the elapsed time period expires, then in an alternate subsequent step 642 the processor multiplies the value of the variable BPV by two.
Then, in a pair of steps 640, 684 the CPU 108 monitors the selection buttons 110 to determine if the selection button remains pressed and for the occurrence of a swipe gesture. If the pressed selection button is released without a swipe gesture occurring, then in a subsequent step 754 the CPU updates the variable BPT to LONG and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
Alternatively, in the step 684, if the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded, then in a subsequent step 748 the CPU adds one to the variable BPV. Then in a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
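The branches of the method 740 can be condensed into a single routine that maps one press to an output pair (BPV, BPT). This is a hedged sketch only: the function name and the representation of event times as millisecond offsets from the onset of the press are assumptions, and real input handling would be event-driven rather than computed after the fact.

```python
def interpret_press(assigned_value, etp_ms, swipe_at_ms=None, released_at_ms=None):
    """Sketch of the method-740 logic: derive (BPV, BPT) from one press."""
    if swipe_at_ms is not None:
        if swipe_at_ms < etp_ms:
            # steps 684 -> 750: swipe completed before the ETP expires
            bpv = assigned_value + (assigned_value + 1)
        else:
            # steps 642 -> 748: ETP expired while pressed, then swipe
            bpv = assigned_value * 2 + 1
        return bpv, "SWIPE"
    if released_at_ms is not None and released_at_ms < etp_ms:
        return assigned_value, "SHORT"        # steps 640 -> 752
    return assigned_value * 2, "LONG"         # steps 642 -> 754
```

Either swipe branch yields the same BPV for a given assigned value, which is the time independence discussed below in connection with Figure 24.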
In one embodiment of the method 740, the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
According to a further embodiment of the invention, the CPU executes the method 740 iteratively, which selects one character from the menu for each iteration. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
Although the method 740 of Figure 23 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims.
Figure 24 shows a portion of the user interface 150 of Figure 2 and a table. The portion of the user interface includes selection buttons 110 that have assigned button press values 222 equal 0, 2 and 3. The table shows possible values for six variables of the method 740 of Figure 23. Three of the variables are input variables 810, which are selectable by a user. Three of the variables are output variables 815, which are determined by the device 100 according to the logic of Figure 23.
The input variables 810 selectable by a user are: the variable 'assigned value of pressed button' 222, a variable 'swipe threshold exceeded?' 804, a variable 'button lifted before or after time expires?' 788. The output variables 815 determined by the device are: the variable 'button press type (BPT)' 224, the calculation 790, and the 'calculated button press value (BPV)' 228.
Each row of the table discloses a unique combination of the three input variables 810. Assuming for a moment that the variable 'assigned value of pressed button' 222 is constant, then the remaining two input variables 'swipe threshold exceeded?' 804 and 'button lifted?' 788 have four possible unique combinations:
no/before, no/after, yes/before, yes/after. Each combination specifies a unique calculation 790. The specified calculation 790, together with the value of the pressed button 222, determines a value for the variable 'calculated BPV' 228.
A notable outcome of the logic of the method 740 is that for a given assigned button press value 222, whether the swipe gesture becomes completed before or after the ETP expires, the same calculated BPV 228 results. For example, for the assigned value equal 2, a swipe threshold exceeded before the ETP expires yields a calculated BPV equal five, i.e., 2 + (2 + 1) = 5. And a swipe threshold exceeded after the ETP expires also yields a calculated BPV equal five, i.e., (2 * 2) + 1 = 5. The effect is that for the method 740 of Figure 23, swipe gestures are time-independent.
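The time independence rests on the algebraic identity x + (x + 1) = 2x + 1, which a one-line check confirms for a range of assigned button values (positive and negative):

```python
# For any assigned button value x, swiping before the ETP expires
# (x + (x + 1)) and swiping after it expires (2x + 1) yield the same BPV.
for x in range(-5, 6):
    assert x + (x + 1) == 2 * x + 1
```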
Another notable outcome is that the time independence of swipe gestures does not change the time dependence of button presses. For button presses (but not swipes), the duration of the button press still determines whether the calculated BPV 228 equals the value of the pressed button 222 (SHORT BPT) or twice the value of the pressed button (LONG BPT).
The selection buttons 110, assigned button values 222 and values for the input and output variables 810, 815 are examples used to demonstrate the embodiments of Figures 2, 22 and 23. The scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
Figure 25 shows a portion of an alternative embodiment of the user interface 150 of Figure 2 and a corresponding table of variables 810, 815. The portion of the alternative embodiment includes the selection buttons 110 and assigned button press values 222 of the embodiment of Figure 24, but also includes an additional button with an assigned value equal 4. The table reinforces that for a given assigned button press value 222, whether the swipe gesture becomes completed before or after the ETP expires, the same calculated BPV 228 results. For example, for the assigned value equal 3, a swipe threshold exceeded before the ETP expires yields a calculated BPV equal seven, i.e., 3 + (3 + 1) = 7. And a swipe threshold exceeded after the ETP expires also yields a calculated BPV equal seven, i.e., (3 * 2) + 1 = 7. The effect is that for the method 740 of Figure 23, swipe gestures are time-independent.
Figure 26 shows another flowchart of an embodiment of a method 780 for the processor 108 of an electronic device to interpret button presses. In one step 742 of the method 780, the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero. In another step 744 the CPU initializes a variable 'button press type' (BPT) to an empty string. In another step 612 the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. In another step 746 the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user. In another step 766 the CPU initializes a variable 'cycle interrupted' to FALSE.
In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the first pressed selection button 110.
In a subsequent step 768, the CPU determines if the variable BPV equals zero. If the BPV is not equal zero, in a step 722 the CPU sets a local ETP variable equal to the variable 'duration of ETP' initialized in the step 746. Alternatively, if the BPV equals zero, then in an alternative step 724 the CPU sets the local ETP variable equal zero.
In another step 618, the CPU 108 starts the elapsed time counter 140.
In a trio of steps 622, 684, 620 the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture or a second button press, while also comparing the elapsed time (ET) with the local ETP variable to see if the ETP has expired.
If in the step 684 the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 750, the CPU adds to the variable BPV a value equal to BPV + 1. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the CPU interprets a second button press before a swipe occurs or the elapsed time period expires (or the first button press is lifted), then in a subsequent step 752 the CPU updates the variable BPT to SHORT, in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE, and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the elapsed time counter expires before a swipe gesture or a second button press occurs, then in a subsequent step 640 the CPU 108 determines if the first button press is still pressed.
If the first button press is not still pressed, then in a subsequent step 752 the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, however, the first button press is still pressed when the elapsed time period expires, then in an alternate subsequent step 642 the CPU multiplies the value of the variable BPV by two and then in another subsequent step 754 the CPU updates the variable BPT to LONG.
Then, in a trio of steps 640, 684, 620 the CPU 108 monitors the selection buttons 110 for the occurrence of a swipe gesture or a second button press, while also determining if the first button press is still pressed.
If the CPU determines the first button press is no longer pressed before either a swipe or second button press occurs, then in a subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
However, if the CPU determines that a second button press occurs before a swipe occurs or the first button is released, then in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
However, if in the step 684 the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 748 the CPU adds one to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
In a step 612 subsequent to the step 758 that outputs values for the variables BPV and BPT, the CPU resets the variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. Then, in a subsequent step 778, the CPU determines the value stored in the variable 'cycle interrupted'.
If the CPU determines that the variable 'cycle interrupted' is FALSE, then in a subsequent step 614 the CPU 108 monitors the selection buttons 110 for a next pressed selection button. Alternatively, if the CPU determines the variable 'cycle interrupted' is TRUE, in a subsequent step 782 the CPU sets the variable BPV stored by the button press value counter 142 to the button press value 222 of the second pressed selection button in the previous character selection cycle. Then, in a subsequent step, the CPU updates the variable 'cycle interrupted' to FALSE. Then, in the subsequent step 768, the CPU determines if the variable BPV stored by the button press value counter 142 equals zero.
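The decision outcomes of the method 780 can be summarized in a sketch that abstracts the timing of the ending event into a flag. The names and the three-way classification of the ending event ('release', 'swipe', 'second_press') are illustrative assumptions, not the claimed implementation.

```python
def interpret_780(value, end_kind, end_before_etp):
    """One character selection cycle of the method 780, condensed.
    end_kind: 'release', 'swipe', or 'second_press' (an interrupting press).
    end_before_etp: True if the ending event precedes ETP expiry.
    Returns (BPV, BPT, cycle_interrupted)."""
    if value == 0:
        # steps 768/724: a zero BPV gets a zero-length local ETP,
        # which therefore expires immediately
        end_before_etp = False
    if end_kind == "swipe":
        # steps 750 and 748 give the same result before or after expiry
        return value + value + 1, "SWIPE", False
    if end_before_etp:
        # released or interrupted before expiry -> SHORT (steps 752, 776)
        return value, "SHORT", end_kind == "second_press"
    # ETP expired while held: step 642 doubles the value -> LONG (step 754)
    return value * 2, "LONG", end_kind == "second_press"
```

When the returned 'cycle_interrupted' flag is TRUE, the next cycle would begin with the interrupting button's value, as in the step 782.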
In one embodiment, for each iteration of the method 780 of Figure 26 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
According to a further embodiment of the invention, the CPU executes the method 780 iteratively, which selects one character from the menu per iteration. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
Although the method 780 of Figure 26 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims.
Figure 27 shows a table 800 that lists the ways each menu position 242 can be identified using the logic of Figure 23 for the user interface 150 of Figure 2. Although the table 800 shows only six positions plus a '0' position, due to symmetry these six positions can be applied to either half of the 13-position menu of the user interface 150 of Figure 2. Although the table of Figure 27 applies to the embodiment of the user interface of Figure 2, the table could be extended to apply to any length menu 240.
The table 800 includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'assigned value of pressed button' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808.
The table 800 shows that for positions accessible by a swipe gesture (Positions 1 and 5 for the embodiment of Figure 2), there are always at least two ways for a user to reach that position. Furthermore, for those positions, the variable 'ETP expired' is FALSE for at least one way and TRUE for at least one of the others. That fact guarantees that even if a user fails to complete a swipe gesture when they expect to (i.e., expected to complete the swipe before but completed it after, or vice-versa), the same character gets selected anyway. That fact makes swipe gestures time-independent.
Each row of the table has one grey box 809 that marks one or the other of the variables 'swipe threshold exceeded' 804 and 'button released' 806. The grey box 809 indicates the action that signifies the end of the character selection cycle.
For button presses (i.e., shorts and longs, but not swipes), the character selection cycle terminates with a button release. In other words, if a button is released and no swipe gesture has occurred, then the selection cycle ends. (In an alternative embodiment, the selection cycle extends until the ETP expires for short presses, but that isn't necessary.)
On the other hand, for a swipe gesture (i.e., not a short or long press), the selection cycle may or may not immediately end. In one embodiment, swipes that occur before the ETP expires cause the selection cycle to immediately end, but swipes that occur after the ETP expires do not cause the selection cycle to end. For swipes completed after the ETP expires, the button release ends the selection cycle. This enables the user to "undo" a swipe gesture, if they want, by swiping back to the position where the swipe gesture originated. Ultimately there are multiple ways that the end of a character selection can be triggered that are consistent with gestures 802 of Figure 27 and the logic of the method 740 of Figure 23.
Figure 28 is a table 800 that lists the ways each position 242 of the menu 240 can be reached using the selection buttons 110 and the logic of Figure 26 for the user interface 150 of Figure 2. The table 800 of the embodiment of Figure 28 also includes assigned characters 200 for each position of the menu 240.
The table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200. The table of Figure 28 is just one possible embodiment of the user interface of Figure 2 and the methods of Figures 22, 23 and 26, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
Figures 29 and 30 show examples of how a word 130 is composed according to the method 780 of Figure 26 and the embodiment of the user interface 150 of Figure 2. For the example of Figure 29, the composed word 130 is 'flag'.
Each row of Figure 29 shows one or more ways in which a particular letter 200 of the word 130 could be composed from a 'button press value' 222 and a 'button press type' 224.
Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 according to the method 780 of Figure 26. The variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires and the same character becomes selected in either case.
The variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 780 of Figure 26. The variable 'calculated BPV' 228 is the result of the calculation 790 and the assigned BPV 222 selected by the user. The device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
For the example of Figure 29, a BPV = 0 and BPT = SWIPE identifies the character 'f'. A BPV = 2 and a BPT = SWIPE identifies the character 'l'. A BPV = -3 and a BPT = LONG identifies the character 'a'. A BPV = 0 and a BPT = SHORT or LONG identifies the character 'g'.
For the example of Figure 30, the composed word 130 is 'bike'. Each row of Figure 30 shows one or more ways in which a character 200 could be composed from a 'button press value' 222 and a 'button press type' 224.
Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge about how gestures identify calculations 790 according to the method 780 of Figure 26. The variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires and the same character becomes selected in either case.
The variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 780 of Figure 26. The variable 'calculated BPV' 228 is the result of the calculation 790 and the assigned BPV 222 selected by the user. The device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
For the example of Figure 30, a BPV = -2 and BPT = SWIPE identifies the character 'b'. A BPV = 2 and a BPT = SHORT identifies the character 'i'. A BPV = 2 and a BPT = LONG identifies the character 'k'. A BPV = -2 and a BPT = SHORT identifies the character 'e'.
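The rows of Figures 29 and 30 can be checked with a short script. The partial menu mapping below is reconstructed only from the worked examples; in particular, the handling of negative button values (the outward swipe adding one position away from the menu reference, in the direction of the button's sign) is an assumption made to match the listed results.

```python
# Partial menu (position -> character), reconstructed from Figures 29-30;
# other menu positions are omitted.
MENU = {1: 'f', 5: 'l', -6: 'a', 0: 'g', -5: 'b', 2: 'i', 4: 'k', -2: 'e'}

def calculated_bpv(value, bpt):
    if bpt == "SHORT":
        return value            # BPV equals the pressed button's value
    if bpt == "LONG":
        return 2 * value        # held past the ETP: value doubled
    # SWIPE: one more position outward, away from the menu reference
    # (direction assumed to follow the sign of the button's value)
    return 2 * value + (1 if value >= 0 else -1)

def compose(presses):
    return ''.join(MENU[calculated_bpv(v, t)] for v, t in presses)
```

With this sketch, compose([(0, "SWIPE"), (2, "SWIPE"), (-3, "LONG"), (0, "SHORT")]) yields 'flag' and compose([(-2, "SWIPE"), (2, "SHORT"), (2, "LONG"), (-2, "SHORT")]) yields 'bike', matching the two examples.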
Figure 31 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1. The device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
The electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100. Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is the reference 258 and the offset scale 260. The display 104, and specifically the menu 240, the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1. The CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1.
In the embodiment of Figure 31, the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-4, -3, -2, +2, +3, +4'. In a further embodiment, the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
Figure 32 shows a table 800 that lists the ways each position 242 of the menu 240 can be reached using the logic of the method 780 of Figure 26 for the embodiment of the user interface 150 of Figure 31. The table 800 includes assigned characters 200 for each position of the menu 240.
The table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200. The table of Figure 32 is just one possible embodiment of the user interface of Figure 31 and the methods of Figures 22, 23 and 26, but in alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
Figure 33 shows a plot 820 of graphical representations of examples of possible button input gestures as a function of time and positional displacement. The plot 820 includes a positional displacement axis 822 with units of pixels. The plot also includes a time axis 824 with units of milliseconds. An origin 826 of the axes marks the point in time and position where the onset of a button input gesture occurs. At the origin 826, the elapsed time counter 140 equals 0, the positional displacement measured by the swipe gesture interpreter 144 equals 0, and the calculated button press value 228 measured by the button press value counter 142 equals zero. The onset of a button input gesture is the onset of a button press.
The elapsed time axis 824 is divided into two segments by an elapsed time threshold 830, which in this example equals 100 msec. The elapsed time threshold identifies the point in time when the elapsed time period expires.
The positional displacement axis 822 is divided into two segments by the swipe threshold 832, which in this example equals 25 pixels. The swipe threshold identifies the point that positional displacement of a button press becomes identified as a SWIPE button press type.
The plot 820 is divided into four quadrants 840 according to four possible combinations of the variables positional displacement 822 and time 824. In this example, the combinations are (1) <25 / <100, (2) <25 / >100, (3) >25 / <100, and (4) >25 / >100.
Each quadrant has an associated math operation 842. The math operation is the calculation that the processor 108 performs on the current value of the button press value counter 142, where x is the current value of the counter. At the onset of the button input gesture 826, which is also the onset of a character selection cycle, the value of x equals 0.
The plot 820 shows various paths 840 that a user input gesture could trace through the quadrants 840. The paths correspond to (1) the positional displacement of an input gesture as measured by the swipe gesture interpreter 144 and (2) elapsing time as measured by the elapsed time counter 140. Each path has a termination 842, which identifies when a button press is released and the character selection cycle ends.
The particular path 840 that a user input gesture follows through the plot determines the one or more math operations 842 that the processor 108 performs on the value of the button press value counter 142. Each possible path of Figure 33 corresponds with one of the four possible outcomes for a given pressed button value 222 of Figure 24.
Note that the calculated BPV 228 for paths that terminate in a quadrant where the positional displacement is less than the swipe threshold 832 depends on the time elapsed (x or 2x). Note that the calculated BPV 228 for paths that terminate in a quadrant where the positional displacement is greater than the swipe threshold 832 does not depend on the time elapsed (x + (x + 1) or 2x + 1) because the mathematical result is the same whichever path is followed (x + (x + 1) = 2x + 1).
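The quadrant logic described above can be condensed into a short sketch. This is an illustrative reconstruction, not the patent's actual implementation; it assumes the example thresholds of 100 msec and 25 pixels, and the function name is hypothetical:

```python
# Illustrative sketch of the Figure 33 quadrant logic. The threshold
# values are the example values given above.
ELAPSED_TIME_THRESHOLD_MS = 100  # elapsed time threshold 830
SWIPE_THRESHOLD_PX = 25          # swipe threshold 832

def calculated_bpv(x, elapsed_ms, displacement_px):
    """Apply the math operation of the quadrant in which a button input
    gesture terminates, where x is the pressed button's assigned value."""
    if displacement_px < SWIPE_THRESHOLD_PX:
        # Non-swipe quadrants: the result depends on elapsed time.
        return x if elapsed_ms < ELAPSED_TIME_THRESHOLD_MS else 2 * x
    # Swipe quadrants: x + (x + 1) and 2x + 1 give the same result,
    # so the calculated BPV is independent of elapsed time.
    return 2 * x + 1
```

For a button with assigned value 3, for example, a short press yields 3, a long press yields 6, and a swipe yields 7 regardless of when the swipe occurs.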
Figure 34 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1. The device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 of a character menu 240, a plurality of selection buttons 110 and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part or within proximity to the menu 240 is a reference 258 and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1. The CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the
embodiment of Figure 1.
In the embodiment of Figure 34, the positions 242 of the menu 240 are arranged in a one-dimensional array similar to the embodiment in Figure 8 of Patent No. 8,487,877, except that the menu 240 and corresponding selection buttons 110 are shown on the display 104 instead of as physical features of the user interface 150. The buttons 110 are communicatively coupled with the CPU 108.
The menu 240 and the offset scale 260 are positioned in respective one-dimensional arrays in the user interface region 150 of the device 100. In one embodiment the character menu 240 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another. In one
embodiment, the character menu 240 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100.
In one embodiment, positions 242 of the menu 240 are distributed in a one-dimensional array in evenly spaced increments. In a further embodiment, values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that match the increment of the menu 240, so that by referencing the offset scale 260 to the menu 240, characters 200 in the menu are effectively numbered.
The reference 258 is an indicator located near or on one of the positions 242 of the menu 240. The offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240. Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero. In an alternative embodiment, the increment of the offset scale 260 is 10 and positions 242 of the menu 240 are marked off in corresponding units of 10.
In one specific embodiment, the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240, and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of the menu 240 according to how many positions a given position 242 of the menu 240 is offset from the reference 258.
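As a concrete illustration of this labeling scheme, a 13-position menu with the zero value of the offset scale at its center could be indexed as below. The particular characters and the function name are assumptions for demonstration only:

```python
# Hypothetical 13-position menu; the reference 258 corresponds to the
# center position, which the offset scale labels as zero.
menu = list("abcdefghijklm")       # characters 200 in menu positions 242
reference_index = len(menu) // 2   # index aligned with offset-scale zero

def character_at_offset(offset):
    """Return the menu character labeled by an offset-scale value,
    which may be negative, zero, or positive."""
    return menu[reference_index + offset]
```

Here character_at_offset(0) returns the center character, while offsets of -6 and +6 reach the two ends of the menu.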
The plurality of selection buttons 110 lie on the display 104 of the user interface 150 of the device 100. In one embodiment, the buttons 110 are arranged in a row that corresponds to the physical alignment of the menu 240 on the user interface. Each button is communicatively coupled with the CPU 108 and is assigned a button press value 222. The assigned button press value 222 can be either positive or negative. Each button 110 has the function that when the button is pressed the value 222 assigned to the button is input to the CPU 108. Furthermore, each button 110 has the function that when the button is pressed for a minimum time duration the value 222 assigned to the button is doubled and then input to the CPU 108. Furthermore, each button 110 has the function that when the button is pressed and a swipe gesture follows the press, the value 222 assigned to the button is doubled, then increased by one, and then input to the CPU 108.
In one embodiment, the assigned button press value 222 of each selection button is unique. In another embodiment there are four selection buttons and the buttons' assigned values are -3, -2, +2, and +3. In another embodiment there are four selection buttons and the buttons' assigned values are -3, -1, +1, and +3. In yet another embodiment there are five selection buttons and the buttons' assigned values are -5, -2, 0, +2, and +5.
The spacebar 264 also lies in the user interface region 150 of the device 100, can be either a hard or soft key, and is communicatively coupled with the CPU 108.
In one embodiment of Figure 34, the menu 240 has 13 menu positions 242 and the plurality of selection buttons includes five buttons with the assigned button press values 222: '-5, -2, 0, +2, +5'. In a further embodiment, the menu positions 242 are populated by 13 of the 26 characters 200 of the English alphabet.
The selection buttons 110 of the electronic device 100 of Figure 34 are receptive to two input gestures: button presses and swipe gestures. A 'button press' is an activation of a button that extends for some duration of time greater than zero. A 'swipe gesture' is a positional displacement of a button press along the screen 104 that occurs during a button press. For example, this may be the distance along the screen 104 a user moves their finger while it is held on the screen 104 during the button press. A swipe that ends on a different key than the one on which it started still occurs during the 'button press'. Where a swipe gesture ends is irrelevant except relative to the 'swipe distance threshold', which is measured with respect to the position of the button press at its onset and nothing else.
'Final determination of a character' and 'ending the selection cycle' usually occur simultaneously, but need not. Described herein are alternatives for ending the selection cycle; various other alternatives are possible. In the context of Figure 35 (described later), a swipe gesture includes the possibility of a zero-length displacement. Based on these two definitions, any activation of one of the selection buttons 110 includes both a 'button press' and a 'swipe gesture'.
The duration of a button press is measured from the onset of the button press until its release. The duration is typically measured in milliseconds. The positional displacement (also called length or distance) of a swipe gesture is measured along the plane of the screen 104 from the point of the button press at its onset to the point of the button press at its release. The swipe distance is typically measured in pixels, but can also be measured in other length units such as mm or fractional inches.
Although duration and swipe distance are measured responses to separate input gestures (button press and swipe gesture, respectively), both input gestures are inherent in any button activation. In other words, for the gestures as they're defined above, any button activation includes both a button press and swipe gesture (even if the swipe distance equals 0). As such, the response of each input gesture can be acquired simultaneously for any button activation.
Figure 35 shows a plot 820 that represents possible examples of responses for duration and swipe distance for the respective input gestures 'button press' and 'swipe gesture'. A single curve 840 represents a possible combination of measurements for duration and swipe distance over the course of a character selection cycle (also referred to as button activation).
In the plot, button press duration is plotted on the x-axis 824 and swipe distance on the y-axis 822. Duration is measured in units of milliseconds and swipe distance is measured in units of pixels. Onset of a button press occurs at the plot's origin 826 and marks the point in time and distance where the onset of an input gesture occurs. The release of a button is represented by a terminus 842 at the end of each curve. The path that a curve 840 follows through the plot reflects the duration and swipe distance of a received button activation. The response of each input gesture is converted to a binary output using a threshold value. A threshold value is an arbitrary value that enables the analog output of each measured response to be recast as a binary output, i.e., a high or low value.
In the plot 820, the duration axis 824 is divided into two segments by an elapsed time threshold 830, which in this example equals 200 msec. The elapsed time threshold is the end of a selectable elapsed time period (ETP).
The swipe distance axis 822 is divided into two segments by a swipe distance threshold 832, which in this example equals 25 pixels. The swipe distance threshold identifies the minimum required positional displacement for a swipe gesture to be classified as a SWIPE BPT (rather than a LONG or SHORT BPT).
Applying the threshold values 830, 832 to the plot 820 divides the plot into four regions 838. Each region represents a unique combination of the binary output values from the input gestures. In other words, for the input gesture responses button press duration and swipe distance, each region represents one possible combination of high and low values: low/low, low/high, high/low and high/high. For the example of Figure 35, the measured responses would be segregated among the four regions as follows: <25 / <200, <25 / >200, >25 / <200 and >25 / >200.
Each region 838 of the plot is identified by a button press type (BPT) value. The BPT is merely a label for the combination of binary values that identify a given region. The current BPT value for a character selection cycle reflects the current measured responses for duration and swipe distance. Because the path that a curve 840 takes through the plot may intersect more than one region 838 of the plot during the course of a character selection cycle, the BPT may evolve during the selection cycle. The final BPT value of a character selection cycle is determined when the button press is lifted, which is identified by the terminus 842 of the curve. For the embodiment of Figure 35, the possible BPTs are SHORT, LONG and SWIPE.
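The recasting of the two measured responses into a final BPT label can be sketched as follows, using the example thresholds of Figure 35 (200 msec and 25 pixels); the function name is illustrative:

```python
ETP_MS = 200    # elapsed time threshold 830 (Figure 35 example)
SWIPE_PX = 25   # swipe distance threshold 832 (Figure 35 example)

def button_press_type(duration_ms, swipe_distance_px):
    """Recast the two analog responses as a BPT label, as evaluated at
    the terminus 842 where the button press is lifted."""
    if swipe_distance_px >= SWIPE_PX:
        return "SWIPE"
    return "LONG" if duration_ms >= ETP_MS else "SHORT"
```

Note that the SWIPE label dominates: once the swipe distance threshold is exceeded, the duration no longer affects the final BPT.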
Each region 838 of the plot also has an associated math operation 842. The math operation is a calculation that the processor 108 executes on the current value of the BPV 228 variable stored in the button value counter 142.
Because the BPT can evolve during a character selection cycle, the number of math operations that can occur during a selection cycle varies. Each time a curve 840 crosses a threshold 830, 832, a new math operation 842 associated with the newly entered region 838 is applied to the current value for BPV 228.
The particular path that a curve follows determines which, and how many, of the one or more math operations 842 the processor 108 applies to the BPV.
The number of math operations ranges from one (BPT = SHORT, so BPV = x) to three (BPT = SWIPE via LONG, so BPV = x + 1 + 1), where x is the assigned value of the pressed selection button.
Note that the calculated BPV 228 for curves that terminate in a region where the swipe distance is less than the swipe threshold 832 depends on the time elapsed (BPV = x or BPV = x + 1). Note that the calculated BPV 228 for curves that terminate in a region where the swipe distance is greater than the swipe threshold 832 does not depend on the time elapsed (BPV = x + 2 or BPV = x + 1 + 1); in either case the result is mathematically the same. This consequence is intentional, so that button activations that are not of the SWIPE BPT can be time-dependent, while button activations that are of the SWIPE BPT are time-independent.
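The equality that makes SWIPE activations time-independent can be checked directly. The sketch below assumes nothing beyond the operations named above; the function names are illustrative:

```python
# The two SWIPE paths of Figure 35 apply different operation sequences
# but produce the same calculated BPV for any assigned value x.
def swipe_before_etp(x):
    return x + 2          # swipe distance threshold crossed first

def swipe_after_etp(x):
    return (x + 1) + 1    # ETP expires first (+1), then the swipe (+1)

# The two paths agree for every assigned button value:
assert all(swipe_before_etp(x) == swipe_after_etp(x) for x in range(-5, 6))
```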
Figure 36 shows a flowchart of an embodiment of a method 560 for a user to specify a character from among a plurality of characters. In one step 510 of the method 560, a user views the characters 200 displayed in the menu 240. In another step 512, the user selects a character from the menu 240 for input to the electronic device 100. In another step 514, the user identifies the selected character by the position of the character with respect to the reference 258 of the menu 240, for example by a value equal to the number of positions the selected character is offset from the menu's reference 258. The user can identify the position (or index) of the selected character in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, referencing the position to a corresponding mark, color or position indicator of known offset or correspondence with a selection button, counting the number of positions that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
In another step 516, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals the assigned button press value 222 of any selection button 110.
If so, in another step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires. The aforementioned step 538 inputs the assigned value 222 of the pressed selection button to the button value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press.
In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed. However, if the value that identifies the selected character's position 242 in the menu 240 is not equal to the assigned value of any selection button, then in an alternate step 562, the user determines whether the value that identifies the selected character's position 242 in the menu 240 equals one more than the assigned button press value 222 of any selection button 110.
If so, in another step 564 the user presses the selection button 110 with the assigned value 222 that equals one less than the selected character's position and maintains the button press until the elapsed time counter expires. The aforementioned step 564 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press. Then, once the elapsed time counter expires the CPU adds one to the button press value counter and indicates to the processor that the type of button press is a LONG press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
However, if none of the values 222 assigned to the selection buttons 110 equals the selected character's position 242 or one less than the selected character's position, then in an alternate step 566 the user determines if the value that identifies the position of the selected character equals two more than the assigned button press value 222 of any selection button 110.
If so, in another step 568 the user presses the selection button 110 with the assigned value 222 that equals two less than the selected character's position and maintains the button press until the elapsed time counter expires. The aforementioned step 568 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press. Then, once the elapsed time counter expires the CPU adds one to the button press value counter and indicates to the processor that the type of button press is a LONG press.
Then, in a subsequent step 569 the user completes, in the same button activation as step 568, a swipe gesture that exceeds the swipe distance threshold. The aforementioned step 569 triggers the CPU to add one to the button press value counter 142 and indicates to the processor that the type of button press is a SWIPE gesture.
In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
Alternatively, in a step 570 that is an alternate to steps 568 and 569, the user presses the selection button 110 with the assigned value 222 that equals two less than the selected character's position and completes a swipe gesture that exceeds the swipe distance threshold before the elapsed time counter expires. The aforementioned step 570 inputs the assigned value 222 of the pressed selection button to the button press value counter 142, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a SHORT press. Then, once the swipe gesture interpreter 144 recognizes the BPT as a SWIPE BPT, the CPU adds two to the button press value counter and indicates to the processor that the type of button press is a SWIPE BPT.
Note that the determination of steps 568 and 569 identifies the same position identified in the step 570, so the steps 568 and 569 together identify a character in the same position of the menu as the step 570. Therefore, the method 560 offers two alternative ways for a user to select a character two positions greater than the assigned value of a selection button. Whichever of steps 568/569 or 570 the user chooses to follow, the character that gets selected is the same.
In the event that none of the determinations 516, 562, 566 leads to the character selected in the step 512, then the user steps through the determination steps again.
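The decision sequence of the determinations 516, 562 and 566 can be summarized in a brief sketch. The function name is hypothetical, and the ordering (SHORT, then LONG, then SWIPE) follows the order of the determinations above:

```python
def activation_for(position, assigned_values):
    """Given the value identifying a selected character's position and the
    set of assigned button press values 222, return which button to press
    and which activation to perform, per steps 538, 564 and 568/569 (or 570)."""
    if position in assigned_values:        # determination 516 -> step 538
        return position, "SHORT"
    if position - 1 in assigned_values:    # determination 562 -> step 564
        return position - 1, "LONG"
    if position - 2 in assigned_values:    # determination 566 -> 568/569 or 570
        return position - 2, "SWIPE"
    return None  # no single activation selects this position
```

With the assigned values -5, -2, 0, +2 and +5 of Figure 34, for example, position 4 is reached by a SWIPE on the +2 button and position -1 by a LONG press on the -2 button.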
According to another embodiment of the invention, the character specification method 560 described above is used iteratively to specify series of characters from the character menu 240. In one embodiment, words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
Figure 37 shows a flowchart of an embodiment of a method 741 for the processor 108 of an electronic device to interpret button presses and swipes. In one step 742 of the method 741, the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero. In another step 744 the CPU initializes a variable 'button press type' (BPT) to an empty string. In another step 612 the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. In another step 746 the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user.
In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a selection button press occurs, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the pressed selection button 110. In another step 618, the CPU 108 starts the elapsed time counter 140. In a pair of steps 622, 684 the swipe gesture interpreter 144 monitors the selection button pressed in the step 614 for the occurrence of a swipe gesture. At the same time, the elapsed time counter 140 compares the elapsed time (ET) with the selected duration of the elapsed time period (ETP). The step 622 corresponds with the comparison of the curve 840 with the threshold value 830 of Figure 35. The step 684 corresponds with the comparison of the curve 840 with the threshold value 832 of Figure 35.
If in the step 684 the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires, in a subsequent step 760, the CPU adds two to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the elapsed time exceeds the duration of the elapsed time period (i.e., expires) before a swipe gesture occurs, in a subsequent step 640 the CPU 108 determines if the button is still pressed.
If the button is not still pressed, then in a subsequent step 752 the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, however, the button is still pressed when the elapsed time period expires, then in an alternate subsequent step 748 the CPU adds one to the variable BPV.
Then, in a pair of steps 640, 684 the CPU 108 monitors the selection buttons 110 to determine if the pressed selection button remains pressed and for the occurrence of a SWIPE BPT.
If the pressed selection button is released without a SWIPE BPT occurring, then in a subsequent step 754 the CPU updates the variable BPT to LONG and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
Alternatively, in the step 684, if the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded, then in a subsequent step 748 the CPU adds one to the variable BPV. Then in a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
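The flowchart logic of method 741 can be condensed into the following sketch. This is a simplified, non-authoritative model: instead of live monitoring, it assumes the release time and the time (if any) at which the swipe distance threshold was exceeded are known in advance, and the function and parameter names are illustrative:

```python
def interpret_press(assigned_value, release_ms, swipe_ms=None, etp_ms=200):
    """Return (BPV, BPT) per method 741. swipe_ms is the time at which the
    swipe distance threshold was exceeded, or None if it never was."""
    bpv = assigned_value                 # step 616
    if swipe_ms is not None and swipe_ms < etp_ms:
        return bpv + 2, "SWIPE"          # steps 760, 756, 758
    if release_ms < etp_ms:
        return bpv, "SHORT"              # steps 640, 752, 758
    bpv += 1                             # step 748: ETP expired while pressed
    if swipe_ms is not None:
        return bpv + 1, "SWIPE"          # steps 748, 756, 758
    return bpv, "LONG"                   # steps 754, 758
```

For a button assigned the value 2: a quick tap gives (2, SHORT), holding past the ETP gives (3, LONG), and a qualifying swipe gives (4, SWIPE) whether it occurs before or after the ETP expires.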
In one embodiment of the method 741, the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758. According to a further embodiment of the invention, the CPU executes the method 741 iteratively, which selects one character from the menu per iteration. According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
Although the method 741 of Figure 37 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims.
Figure 38 shows a portion of the user interface 150 of Figure 34 and a table. The portion of the user interface includes selection buttons 110 that have assigned button press values 222 equal 0, 2 and 5. The table shows possible values for six variables of the method 741 of Figure 37. Three of the variables are input variables 810, which are selectable by a user. Three of the variables are output variables 815, which are determined by the device 100 according to the logic of Figure 37.
The input variables 810 selectable by a user are: the variable 'assigned value of pressed button' 222, a variable 'swipe threshold exceeded?' 804, and a variable 'button lifted before or after time expires?' 788. The output variables 815 determined by the device are: the variable 'button press type (BPT)' 224, the calculation 790, and the 'calculated button press value (BPV)' 228.
Each row of the table discloses a unique combination of the three input variables 810. Assuming for a moment that the variable 'assigned value of pressed button' 222 is constant, then the remaining two input variables 'swipe threshold exceeded?' 804 and 'button lifted?' 788 have four possible unique combinations:
no/before, no/after, yes/before, yes/after. Each combination specifies a unique calculation 790. The specified calculation 790, together with the value of the pressed button 222, determines a value for the variable 'calculated BPV' 228.
A notable outcome of the logic of the method 741 is that for a given assigned button press value 222, whether the swipe gesture exceeds the swipe threshold before or after the ETP expires, the same calculated BPV 228 results. For example, for the assigned value equal 2, a swipe that exceeds the swipe threshold before the ETP expires yields a calculated BPV equal four, i.e., 2 + 2 = 4. And a swipe that exceeds the swipe threshold after the ETP expires also yields a calculated BPV equal four, i.e., (2 + 1) + 1 = 4. The effect is that for the method 741 of Figure 37, button activations that are SWIPE BPT are time-independent.
Another notable outcome is the fact that although button activations that are SWIPE BPT are time independent, button activations that are not SWIPE BPT (i.e., SHORT and LONG BPTs) are not. For button activations that are not SWIPE BPT, the duration of the button press still determines whether the calculated BPV 228 equals the value of the pressed button 222 (SHORT BPT) or one more than the value of the pressed button (LONG BPT).
The selection buttons 110, assigned button values 222 and values for the input and output variables 810, 815 are examples used to demonstrate the embodiments of Figures 34, 35, 36 and 37. The scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
Figure 39 shows a portion of an alternative embodiment of the user interface 150 of Figure 34 and a corresponding table of variables 810, 815. The portion of the alternative embodiment includes the selection buttons 110 and assigned button press values 222 of the embodiment of Figure 38, but also includes an additional button with an assigned value equal 8. The table reinforces that for a given assigned button press value 222, whether the swipe gesture exceeds a threshold value before or after the ETP expires, the same calculated BPV 228 results. For example, for the assigned button press value equal 5, a swipe threshold value exceeded before the ETP expires yields a calculated BPV equal seven, i.e., 5 + 2 = 7. And a swipe threshold value exceeded after the ETP expires also yields a calculated BPV equal seven, i.e., (5 + 1) + 1 = 7. The effect is that for the method 741 of Figure 37, button activations that are SWIPE BPT are time-independent.
Figure 40 shows another flowchart of an embodiment of a method 781 for the processor 108 of an electronic device to interpret button presses. In one step 742 of the method 781, the CPU 108 initializes a variable 'button press value' (BPV) stored by the button press value counter 142 to zero. In another step 744 the CPU initializes a variable 'button press type' (BPT) to an empty string. In another step 612 the CPU 108 initializes a variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. In another step 746 the CPU initializes a variable 'duration of the ETP' to a non-zero value or alternatively receives a non-zero value selected by a user. In another step 766 the CPU initializes a variable 'cycle interrupted' to FALSE.
In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button. Once a first selection button is pressed, in another step 616, the CPU 108 sets the variable BPV to a value equal to the assigned value 222 of the first pressed selection button 110.
In a subsequent step 768, the CPU determines if the variable BPV equals zero. If the BPV does not equal zero, in a step 722 the CPU sets a local ETP variable equal to the variable 'duration of ETP' initialized in the step 746. Alternatively, if the BPV equals zero, then in an alternative step 724 the CPU sets the local ETP variable equal to zero.
In another step 618, the CPU 108 starts the elapsed time counter 140. In a trio of steps 622, 684, 620 the CPU 108 monitors the selection buttons 110 for a swipe gesture that exceeds the swipe distance threshold or for a second button press, while also comparing the elapsed time (ET) with the local ETP variable to see if the ETP has expired. The step 622 corresponds with the comparison of the curve 840 with the threshold value 830 of Figure 35. The step 684 corresponds with the comparison of the curve 840 with the threshold value 832 of Figure 35.
If in the step 684 the swipe gesture interpreter 144 recognizes that the swipe distance threshold is exceeded before (a) the elapsed time period expires or (b) a second button press occurs, then in a subsequent step 760, the CPU adds two to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the CPU interprets a second button press before (a) the swipe distance threshold is exceeded, (b) the elapsed time period expires, or (c) the first button press is lifted, then in a subsequent step 752 the CPU updates the variable BPT to SHORT, in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE, and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, on the other hand, the elapsed time counter expires before (a) the swipe distance threshold is exceeded or (b) a second button press occurs, then in a subsequent step 640 the CPU 108 determines if the first button press is still pressed.
If the first button press is not still pressed, then in a subsequent step 752 the CPU updates the variable BPT to SHORT and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
If, however, the first button press is still pressed when the elapsed time period expires, then in an alternate subsequent step 748 the CPU adds one to the variable BPV and then in another subsequent step 754 the CPU updates the variable BPT to LONG.
Then, in a trio of steps 640, 684, 620 the CPU 108 monitors the selection buttons 110 for (a) a swipe gesture that exceeds the swipe distance threshold or (b) a second button press, while also determining if the first button press is still pressed.
If the CPU determines the first button press is no longer pressed before either (a) a swipe gesture that exceeds the swipe distance threshold or (b) a second button press, then in a subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
However, if the CPU determines that a second button press occurs before the swipe distance threshold is exceeded or the first button is released, then in a subsequent step 776 the CPU changes the variable 'cycle interrupted' from FALSE to TRUE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
However, if in the step 684 the swipe gesture interpreter 144 recognizes that the swipe threshold is exceeded before the elapsed time period expires or a second button press occurs, then in a subsequent step 748 the CPU adds one to the variable BPV. In a subsequent step 756, the CPU updates the variable BPT to SWIPE and in another subsequent step 758 the CPU outputs the current values for the variables BPV and BPT.
In a step 612 subsequent to the step 758 that outputs values for the variables BPV and BPT, the CPU resets the variable 'elapsed time' (ET) stored by the elapsed time counter 140 to zero. Then, in a subsequent step 778, the CPU determines the value stored in the variable 'cycle interrupted'.
If the CPU determines that the variable 'cycle interrupted' is FALSE, then in a subsequent step 614 the CPU 108 monitors the selection buttons 110 for a next pressed selection button. Alternatively, if the CPU determines the variable 'cycle interrupted' is TRUE, in a subsequent step 782 the CPU sets the variable BPV stored by the button press value counter 128 to the button press value 222 of the second pressed selection button in the previous character selection cycle. Then, in a subsequent step, the CPU updates the variable 'cycle interrupted' to FALSE. Then, in the subsequent step 768, the CPU determines if the variable BPV stored by the button press value counter 142 equals zero.
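For illustration, the cycle logic of the preceding paragraphs can be sketched as a short event-driven routine. This is a simplified sketch rather than the method 781 itself: the function and event names are invented here, timing is reduced to an ordered list of events, and the handling of a swipe that exceeds the threshold after the ETP expires (no further increment; the cycle ends at the button release) is an assumption inferred from the time-independence of the SWIPE BPT discussed with respect to Figure 41.

```python
SHORT, LONG, SWIPE = "SHORT", "LONG", "SWIPE"

def run_cycle(bpv, events):
    """Sketch of one character-selection cycle (step numbers refer to
    Figure 40). bpv is the assigned value of the first pressed button;
    events is an ordered list drawn from: 'swipe' (swipe distance
    threshold exceeded), 'etp' (elapsed time period expires), 'release'
    (first button lifted), 'press2' (second button pressed).
    Returns (BPV, BPT, cycle_interrupted)."""
    bpt = SHORT
    etp_expired = False
    for ev in events:
        if ev == "press2":
            # Second press before the cycle ends: flag the cycle as
            # interrupted and output the current values (steps 776, 758).
            return bpv, bpt, True
        elif ev == "etp":
            # ETP expires with the button still held: add one to BPV and
            # mark the press as LONG (steps 748, 754).
            etp_expired = True
            bpv += 1
            bpt = LONG
        elif ev == "swipe":
            if not etp_expired:
                # Swipe threshold exceeded before the ETP expires: add one,
                # mark SWIPE, and end the cycle at once (steps 748, 756, 758).
                return bpv + 1, SWIPE, False
            # Swipe after the ETP: BPV was already incremented at expiry;
            # mark SWIPE and wait for the release (assumption, see lead-in).
            bpt = SWIPE
        elif ev == "release":
            # Button lifted without a second press: output (step 758).
            return bpv, bpt, False
    return bpv, bpt, False
```

Note how the two SWIPE paths converge: `run_cycle(5, ["swipe"])` and `run_cycle(5, ["etp", "swipe", "release"])` both yield `(6, SWIPE, False)`, which is the time-independence property the table 801 of Figure 41 illustrates.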
In one embodiment, for each iteration of the method 781 of Figure 40 the CPU 108 interprets as input the character 200 of the menu 240 whose position 242 equals the BPV output in the step 758.
According to a further embodiment of the invention, the CPU executes the method 741 iteratively, which selects one character from the menu per iteration.
According to another embodiment, in a further step the CPU 108 displays the identified character 200 on the screen 104.
Although the method 781 of Figure 40 is one embodiment of a method for specifying series of characters, the scope of the method is not limited by this particular embodiment, but rather by the scope of the claims. Figure 41 shows a table 801 that lists the ways each menu position 242 can be identified using the logic of Figure 37 for the user interface 150 of Figure 34. Although the table 801 shows only six positions plus a '0' position, due to symmetry about the value 0, these six positions can be applied to either half of the 13-position menu of the user interface 150 of Figure 34. Although the table of Figure 41 applies to the embodiment of the user interface of Figure 34, the table could be extended to apply to any length menu 240.
The table 801 includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'assigned value of pressed button' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, and variable 'ETP expired' 808.
The table 801 shows that for positions accessible using a SWIPE BPT (Positions 1 and 4 for the embodiment of Figure 34), there are always at least two ways for a user to reach that position. Furthermore, for those positions, the variable 'ETP expired' is FALSE for at least one way and TRUE for at least one of the others. That fact guarantees that even if a user fails to exceed the swipe distance threshold when they expect to (i.e., expected to exceed the swipe threshold before the ETP expired but completed it after, or vice versa), the same character gets selected anyway. That property makes the SWIPE BPT time-independent.
Each row of the table has one grey box 809 that marks one or the other of the variables 'swipe threshold exceeded' 804 and 'button released' 806. The grey box 809 indicates the action that signifies the end of the character selection cycle.
For button activations where a swipe gesture does not exceed the swipe threshold (i.e., SHORT and LONG BPTs), the character selection cycle terminates with a button release. In other words, if a button is released and a swipe threshold is not exceeded, then the selection cycle ends. (In an alternative embodiment, the selection cycle extends until the ETP expires for short presses, but that isn't necessary.)
On the other hand, for button activations where a swipe gesture does exceed the swipe distance threshold (i.e., a SWIPE BPT), the selection cycle may or may not immediately end. In one embodiment, swipes that exceed the swipe threshold before the ETP expires cause the selection cycle to immediately end, but swipes that exceed the swipe threshold after the ETP expires do not cause the selection cycle to end. For swipes that exceed the threshold after the ETP expires, the button release ends the selection cycle. This enables the user to "undo" a SWIPE BPT, if they want, by swiping back to the position where the swipe gesture originated. Ultimately there are multiple ways that the end of a character selection can be triggered that are consistent with gestures 802 of Figure 41 and the logic of the method 741 of Figure 37.
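The end-of-cycle rule just described can be condensed into a small predicate. This is a sketch of the one embodiment described above, with invented names; it does not model the 'undo' gesture itself, only which action terminates the cycle.

```python
def cycle_ended(bpt, swipe_before_etp, button_released):
    """True when the character selection cycle has ended.
    SHORT and LONG activations end on the button release; a swipe that
    exceeds the threshold before the ETP expires ends the cycle at once,
    while a swipe after the ETP waits for the release (leaving room to
    'undo' the swipe by returning to its origin)."""
    if bpt in ("SHORT", "LONG"):
        return button_released
    # bpt == "SWIPE"
    if swipe_before_etp:
        return True
    return button_released
```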
Figure 42 is a table 801 that lists the ways each position 242 of the menu 240 can be reached using the selection buttons 110 and the logic of Figure 40 for the user interface 150 of Figure 34. The table 801 of the embodiment of Figure 42 also includes assigned characters 200 for each position of the menu 240.
The table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200. The table of Figure 42 is just one possible embodiment of the user interface of Figure 34 and the methods of Figures 35, 36, 37 and 40, but alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
Figures 43 and 44 show examples of how a word 130 is composed according to the method 781 of Figure 40 and the embodiment of the user interface 150 of Figure 34. For the example of Figure 43, the composed word 130 is 'flag'.
Each row of Figure 43 shows one or more ways in which a particular letter 200 of the word 130 could be composed from a 'button press value' 222 and a 'button press type' 224.
Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge of how gestures identify calculations 790 according to the method 781 of Figure 40. The variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe gesture may be completed before or after the ETP expires and the same character is selected in either case.
The variable 'calculation' 790 (sometimes referred to as 'math operation') is specified based on the BPT 224 according to the logic of the method 781 of Figure 40. The variable 'calculated BPV' 228 (sometimes also referred to as 'total BPV') is the result of the calculation 790 and the assigned BPV 222 selected by the user. The device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
For the example of Figure 43, a button with assigned BPV = 0 activated with gestures that correspond to BPT = SWIPE identifies the character 'f'. A button with assigned BPV = 5 activated with gestures that correspond to BPT = SHORT identifies the character 'l'. A button with assigned BPV = -5 activated with gestures that correspond to BPT = LONG identifies the character 'a'. A button with assigned BPV = 0 activated with gestures that correspond to BPT = SHORT or LONG identifies the character 'g'.
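The actual menu layout and calculation 790 of Figure 34 are not reproduced in this text, so the following sketch uses a hypothetical assignment that happens to reproduce the four selections above: letters 'a' through 'm' occupy menu positions -6 through +6, a SHORT press selects the button's own value, a LONG press offsets one position away from zero, and a SWIPE offsets one position in the direction of the swipe. Any of these details may differ from the actual embodiment; the sketch only illustrates how a calculated BPV maps to a character.

```python
# Hypothetical 13-position menu: positions -6..+6 hold 'a'..'m'.
MENU = {pos: ch for pos, ch in zip(range(-6, 7), "abcdefghijklm")}

def select_char(bpv, bpt, swipe_dir=0):
    """Map an assigned button press value and button press type to a
    character. swipe_dir is -1 or +1 for a SWIPE and ignored otherwise.
    The offsets below are assumptions, not the patent's calculation."""
    if bpt == "SHORT":
        pos = bpv                                   # no offset
    elif bpt == "LONG":
        pos = bpv + (1 if bpv > 0 else -1 if bpv < 0 else 0)  # away from zero
    else:                                           # SWIPE
        pos = bpv + swipe_dir                       # in the swipe direction
    return MENU[pos]

# Composing 'flag' as in the example of Figure 43:
word = (select_char(0, "SWIPE", -1)   # 'f'
        + select_char(5, "SHORT")     # 'l'
        + select_char(-5, "LONG")     # 'a'
        + select_char(0, "SHORT"))    # 'g'
```

Note that under this assignment `select_char(0, "LONG")` also returns 'g', consistent with the text's statement that BPV = 0 with either SHORT or LONG identifies 'g'.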
For the example of Figure 44, the composed word 130 is 'bake'. Each row of Figure 44 shows one or more ways in which a character 200 could be composed from a 'button press value' 222 and a 'button press type' 224.
Values for the variables 'button press value' 222 and 'button press type' 224 are selected by a user based on the position of an intended character 200 in the menu 240 and knowledge of how gestures identify calculations 790 (sometimes referred to as math operations) according to the method 781 of Figure 40. The variable 'ETP expired' 808 shows that for the SWIPE BPT the swipe distance threshold may be exceeded before or after the ETP expires and the same character is selected in either case.
The variable 'calculation' 790 is specified based on the BPT 224 according to the logic of the method 781 of Figure 40. The variable 'calculated BPV' 228 is the result of the calculation 790 and the assigned BPV 222 selected by the user. The device identifies the user's intended character 200 based on the 'calculated BPV' and the assignment of the characters in the menu 240.
Figure 45 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of Figure 1. The device 100 has aspects previously disclosed in Figure 8 of U.S. Patent No. 8,487,877, which is hereby incorporated by reference in its entirety.
The electronic device 100 includes the display 104, the plurality of characters 200 that populate positions 242 of the character menu 240, the plurality of selection buttons 110 and the spacebar button 264, which together make up the user interface 150 of the device 100. Each of the plurality of selection buttons 110 has an assigned button press value 222. Included as part of, or within proximity to, the menu 240 are the reference 258 and the offset scale 260. The display 104, and specifically the menu 240, the plurality of selection buttons 110, and the spacebar button 264, are communicatively coupled with the CPU 108, as described in the embodiment of Figure 1. The CPU 108 includes the elapsed time counter 140, the integer value counter 142 and the swipe gesture interpreter 144, as described in the embodiment of Figure 1. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of Figure 1. In the embodiment of Figure 45, the menu 240 has 17 menu positions 242 and the plurality of selection buttons includes six buttons with the assigned button press values 222: '-5, -2, 0, +2, +5, +8'. In a further embodiment, the menu positions 242 are populated by 17 of the 33 characters 200 of the Russian alphabet.
Figure 46 shows a table 801 that lists the ways each position 242 of the menu 240 can be reached using the logic of the method 781 of Figure 40 for the embodiment of the user interface 150 of Figure 45. The table 801 includes assigned characters 200 for each position of the menu 240.
The table includes values for the following variables: variable 'menu position' 242, variable 'gesture to select character' 802, variable 'button pressed' 222, variable 'swipe threshold exceeded' 804, variable 'button released' 806, variable 'ETP expired' 808 and character 200. The table of Figure 46 is just one possible embodiment of the user interface of Figure 45 and the methods of Figures 35, 36, 37 and 40, but alternative embodiments could include alternative character assignments, alternative assigned values for the selection buttons 110, and alternative numbers of menu positions 242 and selection buttons 110, among other possible variations.
The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.

Claims

1. A computer processor-implemented method comprising:
receiving and storing, by at least one computer processor, a value in response to initiation of a button activation;
adding another value to the stored value based on a duration of the button activation;
if the button activation includes a positional displacement, adding another value to the stored value based on the positional displacement of the button activation; and
selecting a character, by at least one computer processor, based on the stored value.
2. The method of claim 1 wherein the at least one computer processor:
receives and stores a first value in response to the initiation of the button activation;
adds a second value to the stored first value based on the duration of the button activation; and
if the button activation includes a positional displacement, adds a third value to the stored first value, or a fourth value to the sum of the first and second values, based on the positional displacement of the button activation.
3. The method of claim 2 wherein the processor adds the second value to the first value if the duration of the button activation exceeds a pre-determined length of time.
4. The method of claim 2 wherein the processor adds the third value to the first value if the positional displacement exceeds a pre-determined distance before the duration of the button activation exceeds a pre-determined length of time.
5. The method of claim 2 wherein the processor adds the fourth value to the sum of the first and second values if the positional displacement exceeds a pre-determined distance after the duration of the button activation exceeds a predetermined length of time.
6. The method of claim 2 wherein the sum of the second and fourth values equals the third value.
7. The method of claim 6 wherein the third value equals 2.
8. The method of claim 6 wherein the second and fourth values each equal 1.
9. A system comprising:
at least one processor; and
at least one memory coupled to the at least one processor wherein the at least one memory has computer executable instructions stored thereon that, when executed, cause the at least one processor to perform:
receiving and storing a value in response to initiation of a button activation;
adding another value to the stored value based on a duration of the button activation;
if the button activation includes a positional displacement, adding another value to the stored value based on the positional displacement of the button activation; and
selecting a character based on the stored value.
10. The system of claim 9 wherein the processor:
receives and stores a first value in response to the initiation of the button activation;
adds a second value to the stored first value based on the duration of the button activation; and
if the button activation includes a positional displacement, adds a third value to the stored first value, or a fourth value to the sum of the first and second values, based on the positional displacement of the button activation.
11. The system of claim 10 wherein the processor adds the second value to the first value if the duration of the button activation exceeds a pre-determined length of time.
12. The system of claim 10 wherein the processor adds the third value to the first value if the positional displacement exceeds a pre-determined distance before the duration of the button activation exceeds a pre-determined length of time.
13. The system of claim 10 wherein the processor adds the fourth value to the sum of the first and second values if the positional displacement exceeds a pre-determined distance after the duration of the button activation exceeds a predetermined length of time.
14. The system of claim 10 wherein the sum of the second and fourth values equals the third value.
15. The system of claim 14 wherein the third value equals 2.
16. The system of claim 14 wherein the second and fourth values each equal 1.
17. A non-transitory computer-readable storage medium having computer-executable instructions stored thereon that, when executed, cause at least one computer processor to perform:
receiving and storing a value in response to initiation of a button activation;
adding another value to the stored value based on a duration of the button activation;
if the button activation includes a positional displacement, adding another value to the stored value based on the positional displacement of the button activation; and
selecting a character based on the stored value.
18. The non-transitory computer-readable storage medium of claim 17 wherein the processor:
receives and stores a first value in response to the initiation of the button activation;
adds a second value to the stored first value based on the duration of the button activation; and
if the button activation includes a positional displacement, adds a third value to the stored first value, or a fourth value to the sum of the first and second values, based on the positional displacement of the button activation.
19. The non-transitory computer-readable storage medium of claim 18 wherein the processor adds the second value to the first value if the duration of the button activation exceeds a pre-determined length of time.
20. The non-transitory computer-readable storage medium of claim 18 wherein the processor adds the third value to the first value if the positional displacement exceeds a pre-determined distance before the duration of the button activation exceeds a pre-determined length of time.
21. The non-transitory computer-readable storage medium of claim 18 wherein the processor adds the fourth value to the sum of the first and second values if the positional displacement exceeds a pre-determined distance after the duration of the button activation exceeds a pre-determined length of time.
22. The non-transitory computer-readable storage medium of claim 18 wherein the sum of the second and fourth values equals the third value.
23. The non-transitory computer-readable storage medium of claim 22 wherein the third value equals 2.
24. The non-transitory computer-readable storage medium of claim 22 wherein the second and fourth values each equal 1.
PCT/US2017/012605 2016-01-08 2017-01-06 Method of character identification that uses time dependent button presses and time independent swipe gestures WO2017120522A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201662276729P 2016-01-08 2016-01-08
US62/276,729 2016-01-08
US201662318125P 2016-04-04 2016-04-04
US62/318,125 2016-04-04
US201662334702P 2016-05-11 2016-05-11
US62/334,702 2016-05-11

Publications (1)

Publication Number Publication Date
WO2017120522A1 true WO2017120522A1 (en) 2017-07-13

Family

ID=59274451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/012605 WO2017120522A1 (en) 2016-01-08 2017-01-06 Method of character identification that uses time dependent button presses and time independent swipe gestures

Country Status (2)

Country Link
US (1) US20170199661A1 (en)
WO (1) WO2017120522A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583307A (en) * 2018-10-31 2019-04-05 东华大学 A kind of Cashmere and Woolens fiber recognition method based on local feature Yu word packet model

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10452264B2 (en) 2015-04-30 2019-10-22 Michael William Murphy Systems and methods for word identification that use button press type error analysis
WO2018213805A1 (en) 2017-05-19 2018-11-22 Murphy Michael William An interleaved character selection interface
US11212847B2 (en) 2018-07-31 2021-12-28 Roku, Inc. More secure device pairing
US11922007B2 (en) 2018-11-29 2024-03-05 Michael William Murphy Apparatus, method and system for inputting characters to an electronic device
US11416138B2 (en) * 2020-12-11 2022-08-16 Huawei Technologies Co., Ltd. Devices and methods for fast navigation in a multi-attributed search space of electronic devices

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090174667A1 (en) * 2008-01-09 2009-07-09 Kenneth Kocienda Method, Device, and Graphical User Interface Providing Word Recommendations for Text Input
US20120174043A1 (en) * 2011-01-04 2012-07-05 Google Inc. Gesture-based selection
US20150007089A1 (en) * 2013-06-26 2015-01-01 Samsung Electronics Co., Ltd. Method and apparatus for processing inputting of character
US20150121285A1 (en) * 2013-10-24 2015-04-30 Fleksy, Inc. User interface for text input and virtual keyboard manipulation
US20150234592A1 (en) * 2014-02-20 2015-08-20 Michael William Murphy Systems, methods and devices for input of characters with optional time-based button taps

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8487877B2 (en) * 2010-06-10 2013-07-16 Michael William Murphy Character specification system and method that uses a limited number of selection keys
US20140173522A1 (en) * 2012-12-17 2014-06-19 Michael William Murphy Novel Character Specification System and Method that Uses Remote Selection Menu and Touch Screen Movements



Also Published As

Publication number Publication date
US20170199661A1 (en) 2017-07-13

Similar Documents

Publication Publication Date Title
WO2017120522A1 (en) Method of character identification that uses time dependent button presses and time independent swipe gestures
US8957868B2 (en) Multi-touch text input
TWI507968B (en) Method and non-transitory computer-readable medium for controlling electronic device with touch screen and electronic device thereof
US20120293434A1 (en) Touch alphabet and communication system
US20190138208A1 (en) Method and system of multi-variable character input
US20140218315A1 (en) Gesture input distinguishing method and apparatus in touch input device
US20160210452A1 (en) Multi-gesture security code entry
US11853545B2 (en) Interleaved character selection interface
KR20100088647A (en) A method and apparatus for character input and command input with touch screen
CN104598786A (en) Password inputting method and device
US20150234592A1 (en) Systems, methods and devices for input of characters with optional time-based button taps
US20160124535A1 (en) Method of character identification that uses button press types
KR101458295B1 (en) Keyboard input system and the method using eye tracking
US11922007B2 (en) Apparatus, method and system for inputting characters to an electronic device
KR101348763B1 (en) Apparatus and method for controlling interface using hand gesture and computer-readable recording medium with program therefor
KR101561783B1 (en) Method for inputing characters on touch screen of terminal
JP5712232B2 (en) Input device
CN109558007B (en) Gesture control device and method thereof
US20240126430A1 (en) Interleaved character selection interface
JP5519546B2 (en) Handwritten character input device
JP5642862B2 (en) Input device and input method
JP5495406B2 (en) Input device and input method
JP6102241B2 (en) Character input program, character input device, and character input method
JP2003058301A (en) Keyboard input device with pointing device
JP2015122114A (en) Input device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17736468

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17736468

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.11.2018)