WO2015089451A1 - Method for detecting user gestures from alternative touchpads of a handheld computerized device - Google Patents

Method for detecting user gestures from alternative touchpads of a handheld computerized device

Info

Publication number
WO2015089451A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
user
touchpad
hand
computer
Prior art date
Application number
PCT/US2014/070112
Other languages
French (fr)
Inventor
Tong LUO
Chuan Lin
Raymond C. COMBS
Original Assignee
Handscape Inc.
Priority date
Filing date
Publication date
Application filed by Handscape Inc. filed Critical Handscape Inc.
Priority claimed from US14/568,492 external-priority patent/US9678662B2/en
Publication of WO2015089451A1 publication Critical patent/WO2015089451A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0382 Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC

Definitions

  • the present disclosure generally relates to a computerized device including a touchpad installed on the back panel or other portion of the body other than the display screen. More particularly, the present disclosure relates to a method and graphical user interface that enables the user to see the user's finger position and motion from the back or other portion of the device, superimposed on a keyboard layout on the display screen. This makes it easier for a user to input keystrokes and mouse actions from a touchpad that is installed on the back panel or other portion of a handheld device. In an embodiment, the user can also control and manipulate a virtual keyboard shown in the display screen.
  • Although embodiments of the invention are described with reference to a handheld computerized device by way of example, it is understood that the invention is not limited by the type of computerized device or system.
  • Handheld computerized devices, i.e., devices including microprocessors and sophisticated displays, include personal digital assistants (PDAs), game devices, tablet PCs such as the iPad, wearable computerized devices, and the like.
  • keyboard keys are made smaller and smaller, miniaturizing the keys. Additionally, the keyboard keys may be given multiple functions, i.e. overloaded, and more complex function keys may be introduced as well.
  • a method for controlling a control region on a display screen of a computerized system includes obtaining first data from a touchpad.
  • the first data is associated with a position of a portion of the hand of a user when the user operates the touchpad.
  • the first data is not associated with an image of a finger of the user from an image sensor.
  • the method then includes transmitting the first data from the touchpad to the computerized system.
  • the touchpad is located in a location that is different from the location of the display screen.
  • the method further includes analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model.
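  • As an illustration only (the patent does not give source code), a minimal Python sketch of this pipeline might assign raw touchpad contacts to fingers of a simple hand model by nearest expected position; the names, coordinates, and the nearest-finger rule below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    x: float      # raw touchpad coordinates (normalized 0..1, assumed)
    y: float
    area: float   # contact area reported by the sensor

# Hypothetical hand model: expected rest positions for each finger on the
# back touchpad, e.g. after calibration; values are placeholders.
HAND_MODEL = {
    "index":  (0.30, 0.60),
    "middle": (0.45, 0.65),
    "ring":   (0.60, 0.60),
    "little": (0.75, 0.50),
}

def assign_to_fingers(contacts):
    """Assign each raw touchpad contact to the closest finger of the hand model."""
    assignments = {}
    for c in contacts:
        finger = min(
            HAND_MODEL,
            key=lambda f: (HAND_MODEL[f][0] - c.x) ** 2 + (HAND_MODEL[f][1] - c.y) ** 2,
        )
        assignments[finger] = c   # later contacts overwrite earlier ones; acceptable for a sketch
    return assignments

def process_touchpad_frame(contacts):
    """Obtain first data from the touchpad, analyze it against the hand model,
    and return a per-finger mapping for the display side to render."""
    return assign_to_fingers(contacts)
```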
  • the method may include detecting, by the computerized system, an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen. The method may then include causing, by the computerized system, at least one property of the object to be controlled in accordance with the interaction.
  • the method may include detecting a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen.
  • the object may correspond to a multipage application displayed on the display screen and the property of the object may correspond to a page number in the multipage application.
  • the method may include changing the page number of the multipage application in response to the finger swipe.
  • the method may include detecting a finger swipe of a plurality of fingers of the user on the first touchpad when the user interacts with the object on the display screen.
  • the method may include changing the page number by a pre-determined number of pages in response to the finger swipe of the plurality of fingers.
  • the method may include detecting a velocity of the finger swipe and changing the page number by the pre-determined number of pages in response to the velocity.
  • the method may include determining a first direction of the finger swipe and incrementing the page number by a pre-determined number of pages in response to the first direction.
  • the method may include determining a second direction of the finger swipe and decrementing the page number by a pre-determined number of pages in response to the second direction.
  • the first direction may be different from the second direction.
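  • A hedged sketch of the swipe-to-page-turn behavior described in the preceding paragraphs; the thresholds, the doubling rule for fast swipes, and the function names are illustrative assumptions rather than values taken from the patent.

```python
PAGES_PER_SWIPE = 1
MULTI_FINGER_PAGES = 5          # assumed pre-determined jump for a multi-finger swipe
FAST_SWIPE_VELOCITY = 2.0       # assumed threshold, in touchpad widths per second

def pages_to_turn(num_fingers, velocity, direction):
    """Return a signed page delta for a swipe detected on the back touchpad."""
    step = MULTI_FINGER_PAGES if num_fingers > 1 else PAGES_PER_SWIPE
    if velocity > FAST_SWIPE_VELOCITY:
        step *= 2               # a faster swipe changes the page number by more pages
    # A swipe in the first direction increments the page number,
    # a swipe in the second (opposite) direction decrements it.
    return step if direction == "forward" else -step

def on_swipe(document, num_fingers, velocity, direction):
    document.page = max(1, document.page + pages_to_turn(num_fingers, velocity, direction))
```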
  • the method may include detecting an interaction of at least a second finger of the user with the first finger.
  • the second finger may be located on a second touchpad.
  • the second touchpad may be located in a location different from the first touchpad.
  • the second finger may also be located on the display screen.
  • the method may include identifying a first position of the first finger on the first touchpad, detecting a selection of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the first position of the first finger relative to the second position of the second finger.
  • the method may include detecting the selection of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some embodiments, the method may include rotating the object based on the detected movement, altering an axis of rotation of the object based on the detected movement, altering the size of the object based on the detected movement, altering a display characteristic of the object such as color, size, font and the like based on the detected movement, or moving the object based on the detected movement.
  • the method may include identifying a first position of the first finger on the display screen, detecting a selection of a point of rotation of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the second position of the second finger relative to the first position of the first finger.
  • the method may include detecting the selection of the point of rotation of the object by the first finger on the first touchpad or the second finger on the second touchpad.
  • the method may include moving the object around the point of rotation of the object based on the detected movement.
  • the object may correspond to a virtual joystick and the method may include pushing the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
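  • For illustration, a sketch of how the movement of a second tracked finger relative to a first (anchor) finger might be converted into a rotation of the selected object or into a virtual-joystick push; the coordinate convention, attribute names, and function names are assumptions.

```python
import math

def rotate_object(obj, anchor_pos, finger_prev, finger_now):
    """Rotate the selected object about the anchor position by the angle the
    second finger sweeps around it between two touchpad frames."""
    a0 = math.atan2(finger_prev[1] - anchor_pos[1], finger_prev[0] - anchor_pos[0])
    a1 = math.atan2(finger_now[1] - anchor_pos[1], finger_now[0] - anchor_pos[0])
    obj.angle += a1 - a0          # hypothetical attribute on the displayed object

def joystick_command(anchor_pos, finger_now):
    """Treat the offset of the second finger from the anchor as a joystick push:
    return the direction and magnitude of the handle deflection."""
    dx = finger_now[0] - anchor_pos[0]
    dy = finger_now[1] - anchor_pos[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)
```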
  • the method may include detecting a selection of the object, identifying a first contact position of the first finger, detecting a change in a characteristic of the first contact position and causing the at least one property of the object to be controlled based on the change in the characteristic.
  • the method may include detecting a movement of the first finger away from the first contact position, detecting a change in the angle of the first finger in the first contact position, detecting an increase in a touch area of the first contact position, and the like.
  • the characteristic may comprise at least one of the area, the size or the shape of the first contact position.
  • the method may include applying a corresponding pressure or a load to the displayed object based on the change in the characteristic of the first contact position.
  • the method may include detecting a movement of the first finger from a first position to a second position on the first touchpad, repositioning the object in the display screen in accordance with a direction of the movement of the first finger or an amount of movement of the first finger from the first position to the second position, and enabling, for the user, an interaction with the re-positioned object.
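  • As a sketch only, the change in a contact characteristic (here the growth of the contact area as the fingertip flattens) might be mapped to a pressure-like load applied to the displayed object; the baseline calibration, the gain, and the object method are assumptions.

```python
def contact_pressure(baseline_area, current_area, gain=1.0):
    """Estimate a pressure-like load from growth of the contact patch:
    a flattening fingertip (larger area) is treated as a harder press."""
    return max(0.0, gain * (current_area - baseline_area) / baseline_area)

def apply_press(obj, baseline_area, current_area):
    load = contact_pressure(baseline_area, current_area)
    obj.apply_load(load)   # hypothetical method; the patent only says a
                           # "corresponding pressure or load" is applied
```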
  • the object may correspond to a set of virtual keys in a virtual control region in the display screen.
  • a computer-readable storage medium comprises instructions to obtain first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor, transmit the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen, analyze the first data in accordance with a model of a human hand and assign the first data to at least one of a plurality of fingers of the model, detect an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen, and cause at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
  • a system for controlling a control region on a display screen of a computerized system obtains first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor, transmits the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen, analyzes the first data in accordance with a model of a human hand and assigns the first data to at least one of a plurality of fingers of the model, detects an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen, and causes at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
  • Figure 1 depicts a simplified exemplary front panel view of the handheld computerized device displaying the position and motion of the user's fingers holding the back panel, in accordance with one embodiment of the present invention.
  • Figure 2 depicts a simplified exemplary back panel view of the handheld computerized device depicted in Figure 1 , in accordance with one embodiment of the present invention.
  • Figure 3 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying a multitude of groups of keys, in accordance with one embodiment of the present invention.
  • Figure 4 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying the position and motion of the fingers holding the back panel and the multitude of groups of keys depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
  • Figure 5 depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of at least one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • Figure 6A depicts a simplified exemplary front panel view of the smaller handheld computerized device depicted in Figure 5 displaying the position and motion of at least one user's finger in contact with the touchpad of the back panel at the touchpad touch points and a multitude of groups of virtual keyboard keys similarly depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
  • Figure 6B depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • the position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown.
  • this virtual keyboard was previously software aligned to correspond to the direction of the user's fingers and hand.
  • Figure 6C depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • the position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown.
  • the keys were previously software aligned to correspond to the direction of the user's fingers and hand, and the spacing between the keys has also been user adjusted by software.
  • Figure 6D depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
  • Figure 7 depicts a simplified exemplary front panel view of the handheld computerized device displaying another embodiment of the layout of virtual keys as the standard virtual keyboard, in accordance with one embodiment of the present invention.
  • Figure 8 depicts a simplified exemplary block diagram of a computerized system capable of executing various embodiments of the invention, in accordance with one embodiment of the present invention.
  • Figure 9 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be calibrated and adapted to help turn the raw touchpad data into an accurate model of the user's hand and finger positions, in accordance with one embodiment of the present invention.
  • Figure 10 depicts a simplified exemplary flowchart of how predictive typing methods may be used to improve the accuracy of the appearance of the virtual hand and fingers while typing, in accordance with one embodiment of the present invention.
  • Figure 11 depicts a simplified exemplary flowchart of how dynamic changes in touchpad sensitivity may, for finger proximity touchpads, assist in highlighting the virtual keys about to be struck by a user while typing on the virtual keyboard, in accordance with one embodiment of the present invention.
  • Figure 12 depicts a simplified exemplary flowchart for generating images of the virtual hand and fingers on the device's graphics display screen, in accordance with one embodiment of the present invention.
  • Figure 13 depicts a simplified exemplary biomechanical and/or anatomical model of the human hand, showing the internal skeletal structure with a skin overlay, in accordance with one embodiment of the present invention.
  • Figure 14 depicts how the simplified exemplary user's hand or hands may be photographed by the device's camera or other camera, and this image information may be used to refine the default parameters of the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
  • Figure 15 depicts how an exemplary device camera may be used to obtain a partial image of the user's hand while using the device's touchpad, and this information also used to update and refine the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
  • Figures 16A - 16B depict how a simplified exemplary palm angle rotation transformation may help the system relate raw touchpad data to a standard biomechanical and/or anatomical model of the human hand, in accordance with one embodiment of the present invention.
  • Figure 17 depicts more exemplary details of the relationship between the finger roots and the hand's overall palm angle, in accordance with one embodiment of the present invention.
  • Figure 18 depicts more exemplary details of the relationship between the hand's palm direction or palm angle and the tips of the user's fingers, in accordance with one embodiment of the present invention.
  • Figure 19 depicts how simplified exemplary biomechanical and/or anatomical model data pertaining to the width of the fingers may be used to help interpret raw touchpad data, in accordance with one embodiment of the present invention.
  • Figure 20 depicts how, in a more accurate exemplary model, the location of the various finger roots will be displaced to some extent from the palm line (which forms the palm angle) by various amounts Δ, in accordance with one embodiment of the present invention.
  • Figure 21 depicts how the simplified exemplary system may attempt to correlate detected fingertip data from some fingers with finger root data from other fingers, determine that some fingertip data is missing, and thus deduce that these fingers are elevated above the touchpad, in accordance with one embodiment of the present invention.
  • Figure 22 depicts how the simplified exemplary system may further assign raw touchpad data to two different hands of the same user, based on the assumption that the range of possible hand angles for the same user is limited by the user's anatomy, in accordance with one embodiment of the present invention.
  • Figure 23 depicts a first simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
  • Figure 24 depicts a second simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
  • Figure 25 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
  • Figure 26 depicts a simplified exemplary flowchart of a "lift and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figures 27A - 27F depict a series of simplified exemplary display screen shots of the "lift and tap" technique of key entry depicted in Figure 26 being used to type the first two letters of a "Hello World" message on the computerized system, in accordance with embodiments of the present invention.
  • Figure 28 depicts a simplified exemplary flowchart of a "lift and drag" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 29 depicts a simplified exemplary flowchart of a "lift and tap" technique of key entry modified to use force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 30 depicts a simplified exemplary flowchart of a modified "lift and tap" technique of key entry modified to use a third force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figures 31A - 31B respectively depict simplified exemplary side and top views of a portion of the touchpad using the contact area resulting from a first force, in accordance with one embodiment of the present invention.
  • Figures 32A - 32B respectively depict simplified exemplary side and top views of a portion of the touchpad using the contact area resulting from a second force, in accordance with one embodiment of the present invention.
  • Figure 33 depicts a simplified exemplary flowchart of a "push and lift" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 34 depicts a simplified exemplary time-line of the "push and lift" technique depicted in Figure 33, in accordance with one embodiment of the present invention.
  • Figure 35 depicts a simplified exemplary flowchart of a "hover and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figures 36A - 36F depict a series of simplified exemplary display screen shots of the "hover and tap" technique of key entry depicted in Figure 35 being used to type and enter the numeral "0" on the computerized system, in accordance with embodiments of the present invention.
  • Figure 37 depicts a simplified exemplary flowchart of a method 3700 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
  • Figures 38A-38F depict a series of simplified exemplary illustrations, 3800, 3802, 3804, 3806, 3808 and 3810, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with embodiments of the present invention.
  • Figures 39A-39F depict a series of simplified exemplary illustrations, 3900, 3902, 3904, 3906, 3908 and 3910, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with another embodiment of the present invention.
  • Figure 40 depicts simplified exemplary illustrations, 4000 and 4002, that indicate the manner in which a computerized system (e.g., the handheld computerized device 100) may interpret a single finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with an embodiment of the present invention.
  • Figure 41 depicts simplified exemplary illustrations, 4100 and 4102, that indicate the manner in which a handheld computerized device may interpret a multiple finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with another embodiment of the present invention.
  • Figure 42 depicts exemplary illustrations, 4200, 4202 and 4204, of the manner in which a computerized system (e.g., the handheld computerized system 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • Figure 43 depicts exemplary illustrations, 4300 and 4302, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • Figure 44 depicts exemplary illustrations, 4400, 4402 and 4404, of the manner in which a computerized system (e.g., the handheld computerized system 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • Figure 45 depicts exemplary illustrations, 4500 and 4502, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures of a user from multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • Figure 46 depicts a simplified exemplary flowchart of a method 4600 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
  • the embodiments of the present invention relate to a handheld computerized device including a bit mapped display screen on the front panel, and a touchpad installed on the back panel, side panel, or other area other than that of the display screen. More particularly, the embodiments of the present invention relate to a method and graphical user interface that enable the user to see the user's finger position and motion from behind the device or other portion of the device superimposed upon a virtual keyboard layout on the front panel.
  • the embodiments of the present invention present an effective solution for these above problems.
  • the embodiments of the present invention free the original keyboard space on the front panel for applications by utilizing the previously mostly unused back panel space for user input.
  • the embodiments of the present invention are able to handle both keyboard input and mouse input.
  • the embodiments of the present invention present a stunning graphical user interface on the front panel screen where a user may see the real-time position and motion of his/her fingers holding the back panel, on top of the display of the keyboard layout, which is also referred to as a virtual keyboard.
  • the embodiments of the present invention are more precise than current touch screen keyboards by removing the display layer that presently exists between the fingers and the touch pad.
  • the embodiments of the present invention also move the user's fingers away from the front panel, so that the user's fingers will not block the view of the soft key or area that the finger is presently operating on.
  • the hand that holds the device may now also do input, hence freeing the other hand for other activities.
  • an object of the embodiments of the present invention is to provide a method for a more efficient and user-friendly user input for a handheld computerized device.
  • Another object of the embodiments of the present invention is to free up the space currently occupied by the keyboard on the front panel of small electronic devices, and utilize the mostly unused space on the back panel of the handheld devices for user input purposes.
  • Another object of the embodiments of the present invention is to present a visually compelling user-interface design that enables the real time position and motion of the fingers that hold the back panel to be displayed on the front panel.
  • the user's finger positions and keyboard layout may be displayed either as a background image, or as a transparent layer on top of some or all of the applications currently running on the handheld device.
  • These semi-transparent representations of the user's finger positions and virtual keyboard allow the user to easily enter data while, at the same time, continuing to allow the user unimpeded access to the various applications running on the handheld device.
  • applications originally written for a computer device that had a physical keyboard may be easily run, without code modification, on a tablet computer device that lacks a physical keyboard.
  • these virtual semi-transparent keyboards and methods that also give information of finger motion of the user may be highly useful.
  • a device and method include a display screen on the front panel, which may be a bit-mapped display screen, a touchpad embedded on the back panel capable of sensing the user's finger positions and motion, and a graphical user interface.
  • This graphical user interface will normally include both software and optional graphics acceleration hardware to enable complex graphics to be rapidly displayed on the display screen.
  • the device also has an optional virtual keyboard processor that displays the keyboard layout, as well as computes and displays the user's finger positions on a real-time basis.
  • the user's finger position and motion on the touchpad of the back panel may thus be computed and displayed on the front display screen as a layer, which may be a semi-transparent layer, on top of all of the other applications.
  • the virtual keyboard processor may also interpret the finger motions, i.e. strokes, and invoke corresponding operations based on the known location of the finger position on the keyboard.
  • the user's fingers do not need to be constrained to fit onto particular regions of the touchpad, but rather may be disposed in any arbitrary location.
  • Although embodiments of the invention may be aided to some extent by real-time video that provides information pertaining to at least some portion of the user's hand, visualization of the user's fingers, in particular the tips of the user's fingers, is not necessary. This makes it feasible for handheld device video cameras designed for general photographic purposes to be used to help in visualizing the user's hand, without requiring that much of the user's hand in fact be photographed. There is no requirement at all that the user's fingertips be photographed while operating the device.
  • Figure 1 depicts a simplified exemplary front panel view of a handheld computerized device (100) displaying the position and motion of the user's fingers (108) holding the back panel, in accordance with one embodiment of the present invention.
  • the user is holding the handheld electronic device (100), similar to an Apple iPadTM or equivalent pad device.
  • the front panel of the device is occupied by a large graphics display screen (102), which may be a bitmapped graphics display screen. In some embodiments, the whole front panel screen or front panel may be occupied by this graphics display screen (102).
  • the user is holding the handheld computerized device (100) using his or her hands (104), where a portion of the user's thumb (106) is in front of the device over a portion of the front panel, and the user's fingers (108) are behind the device.
  • Although device (100) is not transparent, the graphics display screen (102) nonetheless shows a graphical representation of the user's fingers (108), as well as regions where the user's fingers are apparently touching a surface that is obscured from view, or "invisible," at touchpad touch points (110) at the back panel of the device.
  • Each of the touchpad touch points (110) may correspond to a real time finger print image of the tip of the user's finger.
  • Figure 2 depicts a simplified exemplary back panel view of the handheld computerized device (100) depicted in Figure 1, in accordance with one embodiment of the present invention.
  • the back panel of the handheld computerized device as depicted in Figure 2 does not include a large graphics display screen, but instead includes a large touchpad (200).
  • the user's fingers (208) may now be seen positioned above the touchpad with the tips of the user's fingers (210) touching the touchpad. It is understood that the expression “above the touchpad” refers to the relative position of the fingers with respect to the touchpad when the touchpad is facing upward.
  • this back touchpad may be provided as a retrofit or add-on to a handheld computerized device that otherwise lacks such a back touchpad.
  • Such methods and systems, such as "clip on" back touchpads, are described at more length in parent application 13/223,836, the contents of which are incorporated herein by reference in its entirety.
  • Figure 3 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying a multitude of groups of keys (300, 302, 304), in accordance with one embodiment of the present invention.
  • Figure 3 depicts one possible optional multitude of groups of keys, i.e. a "virtual keyboard,” being displayed on graphics display screen (102) of device (100).
  • the "virtual keyboard” includes a symbol keypad (300), a numeric keypad (302), and a QUERTY keypad (304).
  • the keys may be drawn in outline or semi-transparent form so as not to obscure any other graphical applications running on graphics display screen (102).
  • The scheme depicted in Figure 3 allows the user to optionally use a touchpad keypad on the back of the device to input keystrokes and mouse actions, and these inputs will be reflected on the display screen on the front of the handheld computerized device as "virtual fingers" or equivalent.
  • this virtual keyboard layout displayed on graphics display screen (102) at the front panel may be a standard or modified QWERTY keyboard or keypad, a numeric keyboard or keypad, or alternatively some less standard keyboard or keypad such as a musical keyboard, or a Qwerty, Azerty, Dvorak, Colemak, Neo, Vietnamese, Arabic, Armenian, Greek, Hebrew, Russian, Moldovan, Ukrainian, Bulgarian, Devanagari, Thai, Khmer, Chinese, Hangul (Korean), Japanese, or other type of keyboard.
  • this keypad will be a semi-transparent keypad in order to allow the user to continue to view various application programs that are running on display screen (102) below the virtual keyboard.
  • Figure 4 depicts a simplified exemplary front panel view of the handheld computerized device (100) depicted in Figure 1 displaying the position and motion of the user's fingers (108) holding the back panel and the multitude of groups of keys (300, 302, 304) depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
  • Figure 4 depicts an example of how a user, typing on a touchpad mounted on the back of the electronic device, may see a graphical representation of his or her fingers (108) displayed on graphics screen (102) of device (100), as well as a display of virtual keyboard layout (300, 302, 304).
  • the user's ability to enter input data to the handheld computerized device (100) is thus enhanced because the user may visually judge the distances between his or her fingers (108) and the keypad keys of interest (300, 302, 304) and move his or her fingers appropriately so as to hit the desired key.
  • the user may also click on hyperlinks, such as link1, link2, and the like, or other clickable objects or command icons.
  • the virtual display of the user's fingers may be a valuable feature for some of the newer tablet computers, such as the Microsoft SurfaceTM series, Windows 8, and the like, which may alternate operating modes between a first tablet operating mode designed for traditional touch input, and a second desktop operating mode, derived from legacy desktop operating systems, that is optimized for more precise mouse input.
  • By enabling such tighter control, it becomes more feasible for a user to operate such "Surface"-like devices in legacy desktop mode without the need to use a mouse or other hand operated pointing instrument.
  • the embodiments of the present invention free up the space on the device that might otherwise have been used for original mechanical keyboard space on the front panel, and create room for additional larger displays and applications.
  • the embodiments of the present invention make use of the presently mostly unused back panel space, thus, enabling the front display to show substantially larger virtual keys, or virtual keys including more space between them that are easier for the user to use.
  • the embodiments of the present invention may create compelling visual effects, as well as useful visual effects, because the user may see his or her fingers (108), which are holding the back panel and thus normally blocked from view, being virtually displayed on the front panel along with a virtual, i.e. computer generated, keyboard layout display (300, 302, 304).
  • the layout of a multitude of groups of virtual keyboard keys (300, 302, 304), including numbers, letters, and symbols, may be displayed on an area separated from concurrently running other software applications that are being displayed simultaneously on the screen of the front panel (much like the traditional separately displayed area often used for soft keys near the bottom of the display screen).
  • the virtual keyboard keys (300, 302, 304) may be advantageously displayed in different size or in locations that are not the same locations that are determined by the other software applications and/or programs because the virtual keyboard keys (300, 302, 304) may be displayed translucently so as to display both the virtual keyboard keys (300, 302, 304) and the underlying concurrently running application or program display content.
  • Figure 5 depicts a simplified exemplary front panel view of a smaller handheld computerized device (500) displaying the position and motion of at least one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • Smaller handheld computerized device (500) may include a cellular phone sized device (e.g. an Apple iPhoneTM sized device) including a smaller graphics display screen (502) virtually displaying the position and motion of a multitude of fingers (108) in contact with the touchpad touch points (110) at the back panel of smaller handheld computerized device (500).
  • Figure 6A depicts a simplified exemplary front panel view of the smaller handheld computerized device (500) depicted in Figure 5 displaying the position and motion of at least one user's finger (108) in contact with the touchpad of the back panel at touchpad touch points (110), and a multitude of groups of virtual keyboard keys (300, 302, 304) similarly depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
  • Figure 6 may include similar features as Figure 4 with the exception of using smaller handheld computerized device (500) being held in just one user hand (104), the other hand of the user being free to do other tasks.
  • the virtual keyboard (or keypad) is software generated, it need not always be presented in the same position. However, in embodiments of the invention, at least temporary persistence of the various keys of the virtual keyboard is desirable, so that the user always knows the relative location of the key that they are going to strike, and so that the system can accurately compute the relative distance between the user's various fingers (as predicted by the anatomical and biomechanical model of the user's hand) and the various keys.
  • it may be useful to allow the user to adjust the position, orientation, and spacing between the virtual keys of the virtual keyboard either prior to beginning a typing session, or even during a typing session.
  • a user may indicate to the system by keypress, voice command, virtual key selection, other touchpad input, etc. that virtual keyboard repositioning is desired. The system may then use various options to reposition the virtual keyboard.
  • the virtual keyboard will be essentially allowed to float on the screen, and the user can then rotate the virtual keyboard, stretch and shrink it, change key spacing, etc., by multi-touch type commands either on a front display panel touchpad, a back panel touchpad, or other touchpad device.
  • the user may control the position, orientation, and spacing between the virtual keys of the virtual keyboard by verbal commands such as "rotate right 30 degrees", or "go up one inch", and so on.
  • the position and orientation of the virtual keyboard can be set to track the position and orientation of the user's hand(s).
  • design tradeoffs are taken into consideration. If the position and orientation of the virtual keyboard tracks the position and orientation of the user's hand(s) too closely, then the ability of the software to determine which virtual key the biomechanical and anatomical model of the user's hand is striking may be reduced. Thus, in some embodiments, it may be useful to set the virtual keyboard generation software to track a time averaged position and orientation of the user's hand (usually over periods of at least several seconds, or even minutes).
  • the virtual keyboard generation software can be set to initialize the position and orientation of the virtual keyboard based on the position and orientation of the user's hand during a specified time. This can be as simple as having the user place his or her hand (or hands) on the back touchpad and giving an "initialize keyboard orientation and position" command, either verbally, by pressing a real or virtual key, or by another activation system. Once so initialized, the virtual keyboard can then maintain its position and orientation until the user decides to reset it.
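  • A rough Python sketch of this time-averaged tracking and one-shot initialization; the smoothing constant, class name, and attribute names are assumptions made for illustration.

```python
# ALPHA controls how slowly the virtual keyboard follows the hand; a small
# value approximates averaging over several seconds of touchpad frames.
ALPHA = 0.02

class KeyboardPlacement:
    def __init__(self, position, angle):
        self.position = position   # (x, y) of the keyboard on the display
        self.angle = angle         # orientation in degrees

    def initialize_from_hand(self, hand_position, hand_angle):
        """Handle the 'initialize keyboard orientation and position' command:
        snap once to the current hand pose, then hold until the user resets it."""
        self.position, self.angle = hand_position, hand_angle

    def track_time_averaged(self, hand_position, hand_angle):
        """Follow a slowly moving average of the hand pose rather than the
        instantaneous pose, so key hit-testing stays stable while typing."""
        px, py = self.position
        hx, hy = hand_position
        self.position = (px + ALPHA * (hx - px), py + ALPHA * (hy - py))
        self.angle += ALPHA * (hand_angle - self.angle)
```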
  • Figure 6B depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • the position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown.
  • this virtual keyboard was previously software aligned to correspond to the direction of the user's fingers and hand.
  • Figure 6C depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
  • the position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown.
  • the keys were previously software aligned to correspond to the direction of the user's fingers and hand, and the spacing between the keys has also been user adjusted by software.
  • Figure 6D depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to control a virtual keyboard on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
  • The method depicted in Figure 6D includes obtaining data from a touchpad, the data being associated with the location and movement of a finger and/or hand of a user when the user operates the computerized system using the touchpad, the data not being associated with an image of the finger of the user from an image sensor (620).
  • the method further includes communicating the data from the touchpad to the computerized device, the touchpad being located in a location that is different from the location of the display screen (622).
  • the method further includes analyzing the data in accordance with a model of a human hand and assigning the data to at least one of a plurality of fingers of the model (624), computing a graphical representation of at least one finger of the user in accordance with the model (626), generating a virtual keyboard on the display screen (628), and repositioning the virtual keyboard according to either a verbal command from the user or a user input from the touchpad (630).
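  • Purely as an illustration of the repositioning step (630), the spoken commands mentioned earlier (e.g. "rotate right 30 degrees", "go up one inch") might be dispatched as below; the grammar, the word-number table, and the pixels-per-inch constant are assumptions, and the keyboard object is assumed to expose the position and angle attributes used in the previous sketch.

```python
import re

PIXELS_PER_INCH = 160                       # assumed display density
WORD_NUMBERS = {"one": 1.0, "two": 2.0, "three": 3.0, "half": 0.5}

def _amount(token):
    return WORD_NUMBERS[token] if token in WORD_NUMBERS else float(token)

def apply_verbal_command(keyboard, command):
    """Parse a simple spoken command and reposition the virtual keyboard."""
    command = command.lower().strip()
    m = re.match(r"rotate (left|right) (\w+) degrees", command)
    if m:
        sign = -1 if m.group(1) == "left" else 1
        keyboard.angle += sign * _amount(m.group(2))
        return
    m = re.match(r"go (up|down|left|right) ([\w.]+) inch(es)?", command)
    if m:
        d = _amount(m.group(2)) * PIXELS_PER_INCH
        dx, dy = {"up": (0, -d), "down": (0, d), "left": (-d, 0), "right": (d, 0)}[m.group(1)]
        x, y = keyboard.position
        keyboard.position = (x + dx, y + dy)
```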
  • Figure 7 depicts a simplified exemplary front panel view of handheld computerized device (100) depicted in Figure 1 displaying another embodiment of the layout of virtual keys (700) as the standard virtual keyboard, in accordance with one embodiment of the present invention.
  • Figure 7 depicts another embodiment of the layout of virtual keys (700) may include a modified QWERTY keyboard or keypad that includes splitting the keyboard in half and displaying each half of the keyboard at an angle adapted for better ergonomic typing than the keyboards depicted previously.
  • a computer-implemented method includes a handheld computerized device including a screen on the front of the device capable of displaying a graphical user interface, a touch sensitive back panel or side panel or other area other than the display screen, and a user interface, such as a two dimensional touch sensor.
  • This touch sensitive panel, which need not necessarily be flat and need not necessarily be mounted on the back side of the device, is hereinafter also referred to as a "touchpad," "touch sensor," or "touch sensitive back panel," but this use is not intended to be limiting.
  • the touch sensor will determine the motion of the fingers in real time, and the computerized system's or device's software and processor(s) will use the touch sensor data to compute the real time position and motion of the user's fingers that are touching the touch sensor on the back panel.
  • These "virtual fingers” will then be displayed on the device's graphical user interface on top of a static background where optionally a multitude of groups of keys, including numbers, letters, and symbols (e.g. a virtual keyboard) or hyperlinks may be displayed.
  • the user may easily operate the device, and optionally determine precisely where to strike a finger in order to hit an intended virtual key.
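  • One simple way to realize this mapping (an assumption for illustration, not specified by the patent) is to mirror the back-touchpad coordinates left-to-right when projecting them onto the front display, so each virtual finger appears where the real finger sits behind the screen:

```python
def back_touch_to_screen(tx, ty, screen_width, screen_height):
    """Map a normalized back-touchpad contact (tx, ty in 0..1) to display
    pixels, mirroring horizontally because the back panel faces away from
    the user. The normalization and mirroring convention are assumptions."""
    sx = (1.0 - tx) * screen_width
    sy = ty * screen_height
    return sx, sy
```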
  • the back panel user interface (UI) may be outlined in a distinctive yet non-obstructive color and displayed as a transparent layer over the current applications; hence all the details of the current application and the back panel UI are shown to the user at the same time.
  • the real time position and motion of the fingers holding the back panel may be displayed on the screen of the front panel.
  • the layout of a multitude of groups of keys, including numbers, letters, and symbols, may be displayed on the screen of the front panel as a background to the real time position and motion of the fingers holding the back panel.
  • the real time position and motion of the fingers holding the back panel may be displayed on the static background of a multitude of groups of keys, including numbers, letters, and symbols, enabling the user to precisely strike a finger on an intended key.
  • the display of the virtual hand may be creative and artistic.
  • the display may instead show a skeleton, an animal claw, a furry hand, a tattooed hand, and the like to achieve more compelling or amusing effects.
  • the touchpad is able to sense the touch point positions, movement, and stroke motion data of a multitude of fingers.
  • the data information of the finger motion of one or a multitude of fingers is passed to a virtual keyboard processor, such as a computer processor.
  • the virtual keyboard processor may analyze the finger motion, compare the finger positions with the registered position of the keys, hereinafter referred to as virtual keys, as well as the hyperlinks and other touch buttons of the application program, e.g., generically the "user entry area", and then will decide which item in the user entry area was stroked or actuated. The virtual keyboard processor may then invoke the corresponding operation. The virtual keyboard processor may also update the real time image of the fingers, or finger pads or touch points, or indeed the user hand(s) on the front screen after each finger motion.
  • the touchpad may be installed on the back panel of the handheld computerized device, and may be able to sense the touch, movement, and stroke motion of a multitude of user fingers.
  • the information pertaining to the finger motion of a multitude of user fingers may be passed to a virtual keyboard processor.
  • the motion type, e.g., touch, movement, stroke action, and the like, as well as the motion position, may be passed to a virtual keyboard processor.
  • the virtual keyboard processor may analyze the finger motion, compare the finger position with the registered position of the keys, determine which key was stroked, and invoke the corresponding operation, as sketched below.
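  • A minimal sketch of that hit test, assuming keys are registered as axis-aligned rectangles in screen coordinates; the key geometry and the callback are illustrative placeholders.

```python
REGISTERED_KEYS = {
    # key label: (x, y, width, height) in screen coordinates (placeholder values)
    "H": (100, 400, 40, 40),
    "E": (140, 400, 40, 40),
}

def key_at(x, y):
    """Return the registered key containing the stroke position, if any."""
    for label, (kx, ky, kw, kh) in REGISTERED_KEYS.items():
        if kx <= x < kx + kw and ky <= y < ky + kh:
            return label
    return None

def on_finger_stroke(x, y, invoke):
    """Compare the stroke position with registered key positions and invoke
    the corresponding operation (e.g. insert the character)."""
    key = key_at(x, y)
    if key is not None:
        invoke(key)
```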
  • One embodiment of the present invention includes a graphical user interface (GUI) for a handheld computerized device. The interface may include a display of a multitude of groups of keys, including numbers, letters, and symbols.
  • the keys may be displayed on a graphical user interface on the front panel display screen, and indeed this display area may occupy the whole screen. Thereby, the content of the graphic user interface is not blocked by applications, and is shown together with the applications.
  • One embodiment of the present invention includes a graphical user interface for a handheld computerized device.
  • This interface includes a display of the real time position and motion of the fingers holding the back panel.
  • the display is on the front panel screen, and in fact may occupy the whole screen. Due to the advantages of this approach, the content of the user's finger position and motion is not blocked by applications, or by the display of groups of keys, including numbers, letters, and symbols.
  • One embodiment of the present invention includes a method of assisting user data entry into a handheld computerized device.
  • This handheld computerized device includes at least one touchpad, in one embodiment located on a side of the handheld computerized device that is behind the side of the device that holds the graphics display screen, at least one graphics display screen, at least one processor, memory, and software. Often, however, the handheld computerized device will lack a mechanically actuated and/or permanently dedicated physical QWERTY keypad or keyboard, and may also lack a mechanically actuated and/or permanently dedicated physical numeric keypad or keyboard as well.
  • the method will usually include displaying at least one data entry location on the at least one graphics display screen of the device. Often this at least one data entry location will be a graphical display of a keyboard or keypad that may comprise a multitude of data entry locations.
  • the system will use the touchpad to obtain data on the location and movement of the user's fingers and/or hand.
  • the system may analyze the data on the location and movement of the user's fingers and/or hand according to a biomechanical and/or anatomical model of a human hand, and will assign data on the location and movement of the user's fingers and/or hand to specific fingers on this biomechanical and/or anatomical model of a human hand (usually the user's hand).
  • the system may then use this biomechanical and/or anatomical model of the human hand to compute a graphical representation of at least the user's fingers, and frequently both the user fingers and the user hand(s).
  • the system will then display the graphical representation of at least the user's fingers (and again frequently both the user's finger and hand), on the device's graphics display screen.
  • the distance between the graphical representation of the user's virtual fingers on the graphics display screen and the virtual data entry location (such as the virtual keyboard) will give information that will help the user properly position his or her real fingers and/or hand on the touchpad, which in turn will facilitate data entry, as sketched below.
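  • As a sketch of how such guidance might be computed (key names, coordinates, and the nearest-key rule are assumptions), each model finger can be paired with its closest virtual key and the remaining distance:

```python
import math

KEY_CENTERS = {"Q": (20, 300), "W": (60, 300), "E": (100, 300)}   # placeholder layout

def nearest_key(finger_xy):
    key = min(KEY_CENTERS, key=lambda k: math.dist(finger_xy, KEY_CENTERS[k]))
    return key, math.dist(finger_xy, KEY_CENTERS[key])

def guidance(finger_positions):
    """Map each finger of the hand model to (nearest virtual key, distance in
    pixels), which the display can use to hint which key a finger is over."""
    return {finger: nearest_key(xy) for finger, xy in finger_positions.items()}
```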
  • Figure 8 depicts a simplified exemplary block diagram of a computerized system 800S capable of executing various embodiments of the invention, in accordance with one embodiment of the present invention.
  • Computerized system 800S includes software and hardware that may be used to implement one embodiment of the invention, such as a front panel screen (804), a back panel touch pad (800), a virtual keyboard processor (802), an application process (806), and a device memory (808).
  • Finger position and motion data are first collected from back panel touch pad (800), and then passed to virtual keyboard processor (802).
  • the virtual keyboard processor (which will often be implemented by a combination of software and hardware such as a microprocessor, graphics processor, touchpad controller, and memory) displays the virtual finger position and motion together with the keyboard layout on front panel screen (804).
  • the virtual keyboard processor also analyzes the finger position and motion information data, compares the data with the registered position of the keys (or hyperlinks) and invokes proper operation in application process (806).
  • the keyboard position information may be programmed in a virtual keyboard process, or alternatively may be saved in system memory (808).
  • the key-press or hyper-link information that the user intends to relay to the applications may be passed to the virtual keyboard controller either through memory, or alternatively through inter-process communications.
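  • A hedged, duck-typed sketch of this data flow (class and method names are assumptions; the numbered blocks refer to Figure 8): the virtual keyboard processor takes per-frame finger data from the back panel touchpad, redraws the front panel screen, and passes key events to the application process.

```python
class VirtualKeyboardProcessor:
    def __init__(self, screen, application, memory, hit_test):
        self.screen = screen            # front panel screen (804)
        self.application = application  # application process (806)
        self.memory = memory            # device memory (808): keyboard layout, key positions
        self.hit_test = hit_test        # maps a screen position to a registered key or None

    def on_touchpad_frame(self, finger_positions, strokes):
        # Redraw the semi-transparent keyboard layout and the virtual fingers.
        self.screen.draw(self.memory.keyboard_layout, finger_positions)
        # For each completed stroke, decide which key (or hyperlink) was hit
        # and invoke the corresponding operation in the application process.
        for x, y in strokes:
            key = self.hit_test(x, y)
            if key is not None:
                self.application.handle_key(key)
```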
  • the display screen may be located at some distance from the touchpad. Indeed, the display screen and the touch pad may not even be physically connected at all. Rather, the touchpad may transmit data pertaining to the user's hand position to a processor, which in turn may then generate the virtual image of the user's hand and display the virtual hand on the display screen, and neither the touchpad, the processor, nor the display screen need be physically connected (although they may be). For example, data pertaining to the user's hand and finger position relative to the touchpad may be transmitted by a wired, wireless, or optical (e.g. infrared) method to the processor.
  • the processor in turn may transmit the virtual image of the user's fingers and hand to the display screen by a wired, wireless, or optical (e.g. infrared) technique.
  • the display screen may thus be in nearly any location, such as on a regular monitor, TV screen, projector screen, or on a virtual heads-up eyeglass display worn by the user (e.g. a device similar to Google Glass).
  • touch pads are often flat and roughly rectangular devices, there is no constraint that the touch pads using embodiments of the present invention be either flat or rectangular. Indeed in some embodiments, there is advantage to employing touch pads that include variably shaped and curved surfaces. Such curved and/or variably shaped touch pads could be then placed on various non-traditional locations, such as on the surface of a ball or cylinder, on the surface of various common devices such as glasses frame stems for virtual heads-up displays such as windshields, eyeglasses, and the like, other wearable computerized devices such as smart watch bands, steering wheels - either for a vehicle or a game interface, joysticks, and the like, and/or, dashboards, instrument panels, and the like.
  • Many different types of touchpad technology may be used for this device, including capacitive sensing, conductance sensing, resistive sensing, surface acoustic wave sensing, surface capacitance sensing, projected capacitance sensing, strain gauges, optical imaging, dispersive signal technology, acoustic pulse recognition, pressure sensing and bidirectional screen sensing.
  • Touchpad sensing technology that is capable of sensing multiple finger positions at the same time may be used. Such an ability to sense multiple finger positions or gestures at the same time is hereinafter also referred to as "multi touch" or "multi-touch" sensing technology.
  • Touchpads are thus distinguished from previous mechanical keyboards or keypads because touchpads are not mechanically actuated; that is, since the surface of a touchpad is substantially rigid and responds to touch rather than to mechanical deflection, the touchpad gives the user substantially no indication that the immediate surface of the touchpad moves where touched (except perhaps for the entire rigid touchpad moving as a result), even with pressure sensitive touchpad technology.
  • Touchpads are further distinguished from previous mechanical keyboards or keypads because the shape and/or location of input keys or buttons on a touchpad are not fixed; the keys and/or buttons are instead displayed on an electronically controlled screen with the flexibility of software control and not limited by fixed mechanical elements located on the device.
  • One example of a multi-touch touchpad embodying the present invention may use a touch sensing device commercially available from Cypress Semiconductor Corporation, San Jose, California, commonly known as the Cypress TrueTouch™ family of products.
  • This family of touchpad products works by projective capacitive technology, and is suited for multi- touch applications.
  • the technology functions by detecting the presence or proximity of a finger to capacitive sensors. Because this touchpad system senses finger proximity, rather than finger pressure, it is well suited to multi-touch applications because, depending upon the tuning of the capacitance detection circuit, various degrees of finger pressure, from light to intense, may be analyzed.
  • the projective capacitive technology method may function with a broad range of substrates.
  • A skeletal linked model of the human hand, based on software that creates a biology-based (biomechanical and/or anatomical) model of joint motion and an associated set of constraints, has been proposed.
  • the skeletal linked model approach also is based on a software model of the skin that may stretch and bulge in order to accommodate this internal skeleton.
  • the software models a natural joint axis for four different types of joints in the human hand, as well as takes into account the relative lengths of the underlying hand bone structure, and also accounts for the space occupied by the hand's muscles and skin.
  • Figure 25 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
  • The flowchart includes obtaining data from a touchpad, the data being associated with the location and movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad (2510).
  • the flowchart further includes communicating the data from the touchpad to the computerized device, the touchpad being located in a location that is different from the location of the display screen (2520).
  • the flowchart further includes analyzing the data in accordance with a model of a human hand, and assigning the data to at least one of a multitude of fingers of the model (2530), computing a graphical representation of at least one finger of the user in accordance with the model (2540), and displaying the graphical representation on the display screen (2550).
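  • For illustration only, the flow of steps (2510) - (2550) might be sketched in Python roughly as follows; the touchpad, hand_model, and display objects and all of their method names are hypothetical placeholders, not part of the original disclosure:

      # Illustrative sketch of the Figure 25 flow; all object and method names are hypothetical.
      def update_virtual_hand(touchpad, hand_model, display):
          # (2510) Obtain location/movement data from the touchpad
          # (no image sensor data is involved).
          touch_points = touchpad.read_touch_points()
          # (2520) The touchpad driver communicates the data to the computerized
          # device, here simply by handing the samples to this routine.
          # (2530) Analyze the data against the biomechanical/anatomical hand model
          # and assign each touch point to one of the model's fingers.
          finger_assignments = hand_model.assign_touches(touch_points)
          # (2540) Compute a graphical representation of at least one finger.
          hand_graphic = hand_model.render_fingers(finger_assignments)
          # (2550) Display the representation, typically overlaid on the virtual keyboard layout.
          display.draw(hand_graphic)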
  • FIG. 9 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be calibrated and adapted to help turn the raw touchpad data into an accurate model of the user's hand and finger positions, in accordance with one embodiment of the present invention.
  • the system may work with adequate accuracy using standardized models of hand and finger relationships.
  • the system may perform adequately by an initial calibration step where the system invites the user to place his or her hand on the display screen, the system displays various sized hands, and the user is invited to enter which standardized hand size best fits his or her own hands. The system may then use this data for its various calculations. Even more simply, the system may default to an average hand size for initial use, allowing some degree of functionality to be achieved with no preliminary calibration.
  • the system may invite the user to do an active calibration step, or alternatively the user will voluntarily start an active calibration step, in step (900).
  • the model includes calibration information in accordance with pressing a portion of the user's hand on the touchpad in a specified manner.
  • the system may optionally display one or more targets on the screen, which may be keyboard targets, or alternatively may be specially designed calibration targets specifically designed for the active calibration step.
  • Optional photographic calibration steps are described for Figure 14.
  • the system may optionally request that the user calibrate one hand at a time, and indeed may request that the user operate the fingers on his or her hand in a manner different from normal typing so as to gather additional data.
  • a user may be requested to first extend a specific finger to a maximum length and press, then to a minimum length and press, then to the extreme left and press, then to the extreme right and press and so on, potentially through all fingers and the thumb on a one at a time basis. It should be apparent that such a data set may then naturally be translated into a reasonably detailed model of that particular user's hand and its capabilities to maintain a number of different configurations.
  • the system will accumulate touch data by invoking touchpad sensing hardware and calibration software (902).
  • the system will also make predictions as to the location of the user's hand and fingers by bootstrapping from various hand position estimates (904). Often the system will track the positions of the hand and fingers across successive time intervals to do the predicting, and compute probable finger paths (906).
  • the system will often use its internal model of the user's hand biomechanical features and anatomical features to do the computing, and to help associate the various projected paths with the user's fingertips and thumb position, which at least during the active calibration process will be known (908).
  • a path is understood to be the line or linkage between at least one finger root and the associated fingertip or touchpad touch point for the associated finger.
  • the system will then refine its models of the user's hand biomechanical and/or anatomical features by comparing the predicted results with real data, and determine if its user hand model is working with sufficient accuracy in step (910). If it is, then this user hand model will be adopted and used for subsequent user virtual keyboard data entry purposes (914). If the user hand model is not working with sufficient accuracy, then the system will attempt to adjust the hand model by varying one or more hand-model parameters (912), and often will then continue the calibration process until acceptable performance is obtained.
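  • A minimal Python sketch of the Figure 9 calibration loop (900) - (914) is given below for illustration; the error metric, tolerance, iteration limit, and all object and method names are assumptions rather than the actual calibration software:

      # Hypothetical sketch of the Figure 9 calibration loop (900-914).
      def calibrate_hand_model(touchpad, hand_model, max_iterations=50, tolerance=2.0):
          hand_model.start_active_calibration()                          # (900)
          for _ in range(max_iterations):
              touches = touchpad.collect_calibration_touches()           # (902)
              estimate = hand_model.predict_hand_position(touches)       # (904)
              paths = hand_model.compute_finger_paths(estimate)          # (906)
              hand_model.associate_paths_with_fingertips(paths)          # (908)
              error = hand_model.compare_prediction_with_data(touches)   # (910)
              if error < tolerance:
                  return hand_model        # (914) adopt the model for data entry
              hand_model.adjust_parameters(error)                        # (912)
          return hand_model  # fall back to the best model found so far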
  • the calibration software enables the biomechanical and/or anatomical model of the human hand to be calibrated more accurately, so as to match the biomechanical and/or anatomical characteristics of a particular user's fingers and/or hand.
  • the realism of the simulated virtual fingers on the screen may optionally be facilitated by the use of predictive typing models.
  • the predictive typing model approach will be particularly useful when the user is typing text on a virtual keyboard, because the system may scan the previous text that has been entered, and utilize a dictionary and other means, such as the statistical distribution of letters in the particular language, to make educated guesses as to what letter is going to be typed next. This educated guess may then be used to supplement the touchpad data as to last fingertip position and movement to tend to direct the appearance of the simulated finger towards the logical next key. Because this system will occasionally tend to guess wrong, however, the user may find it useful to adjust this predictive typing "hint" to various settings depending upon the user and the situation.
  • Figure 10 depicts a simplified exemplary flowchart of how predictive typing methods may be used to improve the accuracy of the appearance of the virtual hand and fingers while typing, in accordance with one embodiment of the present invention.
  • the software will first access both the biomechanical and/or anatomical model data for the user's hands (1000), and the latest fingertip and thumb position data from the touchpad sensors (1002). The system will then use this information to display the user's virtual hands and fingers on the device's display screen (1004). If a predictive typing mode is on (1006), then the system will attempt to deduce (based upon typing speed, as well as the user's consistency in typing speed, and context) what is the most probable letter or letters that the user is likely to type next. The system will also attempt to predict the most probable finger or fingers that the user will use to type this most probable letter (1008).
  • the system may use this factor in its analysis of the somewhat noisy finger position data from the touch sensor to increase the probability that the user's left index finger (often used to type "e” on a keyboard, and which in-fact may not be registering on the touch pad because the user has lifted the left index finger to move to strike the "e” key), is moving towards the "e” key.
  • such predictive typing algorithms may help increase the illusion that the user is looking through the display and onto his or her hands below the display, even though the user's actual hands are hidden behind the device.
  • the efficiency of the predictive typing may be further enhanced by incorporating the user's history of finger use for each particular key. For example, one user may have a strong tendency to use the right index finger to type the keys "H" and "J", and as another example the same user may have a tendency to use his or her left pinky finger to type the letters "A" and "Z".
  • the system may observe the individual user's typing patterns over time, either as part of an initial calibration step, or later (and in one embodiment even continually) while monitoring the user's typing patterns, and use the user's individualized finger-to-letter correlation habits as part of the predictive typing algorithm.
  • the predictive typing software enables the computerized device to compute the graphical representation of at least the user's fingers, and often the user's fingers and hands, with better precision by additionally using keystroke predictions, in addition to the data on the location and movement of the user's fingers and/or hand obtained using the touchpad.
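  • One possible way to blend the touchpad evidence with a keystroke prediction "hint" is sketched below in Python; the scoring weights, the hint_strength setting, and the example probabilities are illustrative assumptions, not the patented predictive typing algorithm:

      # Illustrative predictive typing "hint"; weights and probabilities are assumptions.
      def rank_candidate_keys(finger_distance_px, letter_probability,
                              finger_history_weight, hint_strength=0.5):
          # finger_distance_px: estimated fingertip distance to each candidate key.
          # letter_probability: probability of each letter given the preceding text.
          # finger_history_weight: how often this user types the letter with this finger.
          # hint_strength: user-adjustable setting controlling how strong the hint is.
          scores = {}
          for key in letter_probability:
              touch_score = 1.0 / (1.0 + finger_distance_px.get(key, 1e6))
              prediction_score = letter_probability[key] * finger_history_weight.get(key, 1.0)
              scores[key] = (1.0 - hint_strength) * touch_score + hint_strength * prediction_score
          return sorted(scores, key=scores.get, reverse=True)

      # Example: "e" is slightly farther from the estimated fingertip than "w",
      # but is far more probable as the next letter, so it ranks first.
      print(rank_candidate_keys({"e": 40.0, "w": 25.0},
                                {"e": 0.12, "w": 0.01},
                                {"e": 1.2, "w": 0.8}))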
  • finger hover means highlighting or otherwise graphically altering the appearance of a virtual key on a virtual keyboard whenever the system believes that the user's finger is either hovering above that virtual key, or about to strike that virtual key.
  • touchpads that may sense relative finger proximity to the touchpad surface, such as projective capacitive technology touchpads, may be particularly useful.
  • the sensors and algorithms that detect relative finger-height above a surface may be tuned to various degrees of sensitivity, and indeed this sensitivity level represents an important engineering tradeoff. If the touchpad is tuned to too high a sensitivity, then it will tend to generate spurious (false) signals, and also lack precision as to precisely where on the touchpad a finger is about to land. If the touchpad is tuned to a lower sensitivity, then the touchpad will tend to detect only fingertips that are exerting a considerable amount of pressure on the touchpad surface.
  • a touchpad might first operate at a normal level of sensitivity until it detects that a fingertip within strategic striking distance of a particular key has left the surface of the touchpad.
  • the touchpad circuitry might temporarily reset its sensitivity to a higher level, designed to more precisely detect when the user's finger is hovering above the key.
  • If the higher level of touchpad sensitivity detects the fingertip proximity, the key may be highlighted. If the higher level of touchpad sensitivity does not detect the hovering fingertip, then the key will not be highlighted. After a short period of time, on the order of a tenth of a second, the touchpad may then be reset to the normal level of sensitivity to more precisely determine whether the finger has actually touched the touchpad or not.
  • Figure 11 depicts a simplified exemplary flowchart of how dynamic changes in touchpad sensitivity may, for finger proximity touchpads, assist in highlighting the virtual keys about to be struck by a user while typing on the virtual keyboard, in accordance with one embodiment of the present invention.
  • Figure 11 depicts an example of an algorithm to detect and indicate "finger hover".
  • the system displays the virtual keyboard (1100), as well as an overlay of the user's virtual fingers on or near this virtual keyboard (1102).
  • When the system detects that a finger, suspected of being about to press a key due to the finger's proximity to the key and/or predictive typing considerations, leaves the touchpad (most likely because the user has raised the finger above the touchpad in preparation for striking the virtual key) (1104), the system will momentarily turn the touchpad finger proximity detector to a higher level of sensitivity (1106), and the software will look to see if finger hover over the suspected key or keys may be detected (1108). If the system does not detect that a finger is suspected of leaving the touchpad, the system returns to step 1102. If a finger hover signal is detected over the suspected key, then this key will be highlighted to help guide the user (1110).
  • The system will then once again lower the sensitivity of the finger proximity detector down to the normal level (1112), in order to precisely detect if the finger is about to strike the key (1114). If the touchpad, now operating at normal sensitivity, detects that the virtual key has been struck by the user, the system will appropriately indicate the keystroke on the virtual keyboard by further graphical changes to the key (1116) and optionally may issue an audible keypress or key-click sound as well to give further feedback to the user. Then the system may record the key strike. If the appropriate finger press was not detected at (1114), then the system repeats the flow at step (1102).
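  • The finger hover loop of Figure 11 might be sketched in Python roughly as follows; the sensitivity values, the tenth-of-a-second delay, and the touchpad, display, and hand_model interfaces are hypothetical assumptions:

      # Hypothetical sketch of the Figure 11 finger hover loop.
      import time

      NORMAL_SENSITIVITY, HIGH_SENSITIVITY = 1.0, 2.5

      def finger_hover_loop(touchpad, display, hand_model):
          display.show_virtual_keyboard()                      # (1100)
          while True:
              display.overlay_virtual_fingers(hand_model)      # (1102)
              lifted = hand_model.finger_lifted_near_key()     # (1104)
              if lifted is None:
                  continue                                     # back to (1102)
              finger, key = lifted
              touchpad.set_sensitivity(HIGH_SENSITIVITY)       # (1106)
              if touchpad.detect_hover(finger, key):           # (1108)
                  display.highlight_key(key)                   # (1110)
              time.sleep(0.1)   # roughly a tenth of a second at high sensitivity
              touchpad.set_sensitivity(NORMAL_SENSITIVITY)     # (1112)
              if touchpad.detect_press(finger, key):           # (1114)
                  display.indicate_keystroke(key)              # (1116), optional key-click
                  hand_model.record_keystroke(key)             # record the key strike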
  • the finger hover algorithm approach allows at least one data entry location (key) to be highlighted on the device's graphics display screen whenever the computerized device determines that at least one finger on the user's hand has left the touchpad, and the position and motion history of the finger is consistent with an ability of that finger to strike a position on the touchpad that is consistent with the location of the data entry location (key) on the graphics display screen.
  • the system may utilize this biomechanical and/or anatomical model of the user's hand or hands to compute a graphical representation of at least the user's fingers, and often the user's hand and fingers, suitable for display on the device's graphics display screen.
  • A life-like graphical representation of the user's hand and fingers is not necessary. Often, a more shadow-gram like or cartoon-like two-dimensional model (or representation) of the user's hand and fingers will be all that is necessary. Often these two-dimensional representations of the user's hand and fingers need not include much, if any, internal detail. Rather, these representations may, for example, look much like a translucent gray or other colored shadow projection of the user's hands and fingers on a surface.
  • the representation of the user's hands and fingers may have reduced sharpness, contrast and detail, so long as it has enough distinguishing contrast from other areas of the display screen to enable the user to accurately place his or her hands and fingers on the appropriate virtual buttons or virtual keyboard shown in the graphical display. More fanciful or artistically inspired hand representations are also discussed later in this specification.
  • Figure 12 depicts a simplified exemplary flowchart for generating images of the virtual hand and fingers on the device's graphics display screen, in accordance with one embodiment of the present invention.
  • Many ways to graphically represent the user's hands and fingers, or at least the user's fingers, are possible.
  • a three-dimensional virtual model may be constructed in the device's memory that depicts the user's hand(s) and fingers (1202).
  • a two-dimensional projection of the general outlines of the user's hand and fingers may be made upon a mathematical surface that corresponds to the surface of the touchpad (1204). This projection may be in the form of a hand and/or finger outline, or alternatively a virtual hand and finger shadow may be produced. This projection may then be combined with any other data that is being sent to a memory buffer or graphics display buffer for the display screen of the device, and then displayed to the user (1206).
  • the graphical representation of at least the user's fingers, and often both the user's hand and fingers, on the graphics display screen may be done by using the previous assignment of the data on the location and movement of the user's fingers and/or hand(s) to specific fingers on the biomechanical and/or anatomical model of the human hand(s) to create a three dimensional model of the user's hand(s) and fingers in the computerized device's memory.
  • a two-dimensional projection of this three dimensional model of the user's hand(s) and fingers in memory may be made.
  • the two-dimensional projection may be on an imaginary plane that corresponds in both distance and orientation from the model of the user's fingers to the touchpad.
  • for example, if the user's finger is positioned ¼" above the touchpad, then the distance between the three dimensional model of the user's finger and the imaginary plane that corresponds in distance and orientation to the touchpad will also be ¼".
  • This two-dimensional projection on the imaginary "touchpad” plane may be used to generate the graphical representation of at least the user's fingers on the graphics display screen, and often the user's fingers and hand(s) as well.
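  • A self-contained Python sketch of this projection idea (steps 1202 - 1206 of Figure 12) is shown below for illustration; the plain orthographic projection and the plane parameters are simplifying assumptions:

      # Minimal sketch of projecting 3D hand-model points onto the imaginary
      # "touchpad" plane; the plane parameters are assumptions.
      def project_onto_touchpad_plane(points_3d, plane_origin, plane_x_axis, plane_y_axis):
          def dot(a, b):
              return sum(ai * bi for ai, bi in zip(a, b))
          projected = []
          for p in points_3d:
              rel = tuple(pi - oi for pi, oi in zip(p, plane_origin))
              # Keep only the in-plane components; the out-of-plane distance is the
              # finger height above the touchpad and is discarded by the projection.
              projected.append((dot(rel, plane_x_axis), dot(rel, plane_y_axis)))
          return projected

      # Example: a fingertip hovering 0.25 units above the touchpad projects straight down.
      print(project_onto_touchpad_plane([(1.0, 2.0, 0.25)], (0, 0, 0), (1, 0, 0), (0, 1, 0)))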
  • a two dimensional model of the user's hands and fingers may be manipulated to best fit the previously discussed hand and finger position and motion data, and this two dimensional model then used for the graphical representation.
  • This two dimensional model may be further user selected according to the user's hand size, and indeed may be calibrated by asking the user to place his or her hand on the touchpad, thus allowing the system to sense the dimensions of the user's hand directly.
  • Figure 13 depicts a simplified exemplary biomechanical and/or anatomical model of the human hand, showing the internal skeletal structure with a skin overlay, in accordance with one embodiment of the present invention.
  • This illustration shows the major bones of the hand, with the bones of the index finger and thumb separated in order to allow the joints to be better visualized.
  • the internal skeletal structure of the hand (1300) is depicted, along with an outline of the skin on the left side of the hand (1302).
  • the bones of the fingers include the distal phalanges (1304), the intermediate phalanges (1306), the proximal phalanges (1308) and the metacarpals (1310).
  • the thumb lacks the intermediate phalange.
  • the various finger joints include the distal inter-phalangeal joint (dip) (1312), the proximal inter-phalangeal joint (pip) (1314), and the metacarpophalangeal joint (mcp) (1316).
  • the thumb lacks the distal inter-phalangeal joint (dip), and instead includes the interphalangeal joint (ip) (1318) as well as the carpometacarpal (cmc) joint (1320).
  • the closer the various default parameters of the biomechanical and/or anatomical model of the human hand are to the actual user hand parameters, the better.
  • even the range of joint motion may also be experimentally determined, and used to replace one or more joint motion range default parameters.
  • the biomechanical and/or anatomical model of the human hand used in the embodiments of the present invention for finger identifying algorithms may be based on the following observations.
  • the various metacarpophalangeal joints (mcp) (1316) may hereinafter also be referred to as the "finger roots". Finger roots will be represented by the variable "r". Alternatively, finger roots may be referred to as the junction between the finger and the palm. Due to the relatively invariant shape of the palm, the orientation of the user's palm and its angle with respect to other hand structures, such as the relative orientation of the fingers (e.g. middle finger (1330)), is relatively constant. In particular, the orientation or position of the various "finger roots" (1316) may define a palm line direction (1332) that will in turn, when the angle of the palm line with respect to the coordinates of the touchpad is known, help to define the location of the various fingers and fingertips.
  • the touch pad data will include various touchpad touch points, identified in (x, y) coordinates in later figures, which will often but not always correspond to the area underneath the uppermost bone of the user's finger and thumb (1304), hereinafter also referred to as the "finger tips".
  • the touchpad observed location of any given finger or thumb tip will often be referred to as (xi, yi), where xi and yi are the observed touchpad data, and "i" refers to or is associated with the finger that ultimately produced the touchpad touch data.
  • the raw touchpad data does not include such (xi, yi) labels. Instead, the system embodiments may have to take various incoming touchpad data, attempt to make sense of the data using the underlying biomechanical and/or anatomical model of the human hand, and then generate a virtual hand model that is consistent with both the touchpad data and the underlying biomechanical and/or anatomical hand model.
  • A simplified exemplary palm angle (θ) rotation transformation may help the system relate raw touchpad data to a standard biomechanical and/or anatomical model of the human hand, in accordance with one embodiment of the present invention. If the user touches both the tips of all fingers and thumb and the base or finger root of all fingers and thumb onto the touchpad, then the raw touchpad data would include a series of (xi, yi) values for the finger tips, and a series of (xri, yri) values for the finger roots.
  • If the system determines how much the user's hand is rotated relative to the coordinate system of the touchpad, using palm angle θ, then the process of mapping the raw data into the biomechanical and/or anatomical model of the human hand may be simplified.
  • Thus, knowing the palm angle θ is useful.
  • Figure 18 depicts more exemplary details of the relationship between the hand's palm direction or palm angle and the tips of the user's fingers, in accordance with one embodiment of the present invention.
  • Touchpad users will often touch the pad with the fleshy area of the palm underneath their finger roots (1822). If the finger root touch area information is detected by the touchpad, the system may detect the direction of the palm line of the hand (1322) from the finger root touch area.
  • The system may use a relaxed finger position direction, depicted as dash-dotted line (1825), from touchpad touch point (1810) on F1 to touchpad touch point (1820) on F3, or a relaxed finger position direction from touchpad touch point (1810) on F1 to touchpad touch point (1830) on F4, to approximate the palm line direction (1322) and the adjustment angle between the relaxed finger position direction and the palm line direction, e.g., between line (1825) and palm line (1322). The system may then determine the angle θ between the palm line and the touchpad coordinates, such as the touchpad x-axis.
  • Figures 16A - 16B depict how a simplified exemplary palm angle rotation transformation may be applied, in accordance with one embodiment of the present invention. The process of rotation transforming the raw touchpad data (xi, yi) into palm angle corrected touchpad data (x'i, y'i) may be done using the following coordinate rotating formula (1):
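  • The image of formula (1) is not reproduced in this text; a standard two-dimensional coordinate rotation by palm angle θ of roughly the following form is presumably intended (the sign convention shown here is an assumption):

      x'i = xi·cos θ + yi·sin θ
      y'i = -xi·sin θ + yi·cos θ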
  • the system may find the palm angle using the finger root and/or finger touch points as shown in Figures 16A and 18.
  • Figure 16A depicts the raw touchpad data before the rotation transformation (correction), and Figure 16B depicts the results after the rotation transformation or correction, with the palm line direction being substantially parallel to the touchpad's x-axis.
  • the word substantially herein refers to an accuracy sufficient to achieve proper guidance of the virtual finger(s) displayed on the display screen to the extent that the user may be able to guide the hand to properly strike a virtual key or other control object displayed on the screen and not intending to imply any more accuracy than so required.
  • more accuracy is required for smaller virtual keys or control objects than for larger virtual keys or control objects but exact anatomical matching to the hand is not required.
  • the system may also determine the finger root (xri, yri) location coordinate for one or more of the fingers F1, F2, F3, F4. Then the system may perform the analysis, often based on the assumption that the F1 root coordinate (xr1, yr1) is the most available (i.e. most frequently found in the raw touchpad data), which is often true because the F1 finger root commonly touches the touchpad surface. Alternatively, because the palm does not bend much, the finger F1 root coordinate may be calculated from the other palm touch points, i.e. the other finger roots (xri, yri).
  • Figure 17 depicts more exemplary details of the relationship between the finger roots (xri, yri), i.e., roughly finger joint region (1316), and the hand's overall palm angle, in accordance with one embodiment of the present invention.
  • If the root coordinates for finger F1 are available (xr1, yr1), then based on hand anatomy considerations, the position of the finger F2 root is likely to be, or may be calculated to be: xr2 = xr1 + ½(w1 + w2)·cos θ and yr2 = yr1 + ½(w1 + w2)·sin θ, where w1 and w2 are the widths of fingers F1 and F2.
  • Figure 19 depicts how simplified exemplary biomechanical and/or anatomical model data pertaining to the width of the fingers, such as L12, may be used to help interpret raw touchpad data, in accordance with one embodiment of the present invention.
  • a palm line vertical direction (1930) may be defined running substantially through touchpad touch point (1820) on finger F2 and substantially perpendicular to palm lines (1322).
  • the intersection of palm lines (1322) and palm line vertical direction (1930) passing through the longitudinal axis of F2 may pass through the finger root for F2 at (xr2, yr2), which may be used for the origin of the coordinate rotation axes X, Y.
  • the system may also calculate the likely finger root coordinates for fingers F3 and F4 (in this simplified approximation, the model may assume that the finger roots are substantially on the same palm line (1322), as per Figures 16A, 16B, 19, 13, and elsewhere).
  • the system may also calculate the new coordinates for any given finger "i" root, assuming that the hand is rotated at palm angle θ, by also using rotation formula (1).
  • the rotation transformed finger root locations may be expressed as:
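  • The corresponding formula image is likewise not reproduced here; presumably the same rotation formula (1) is simply applied to the root coordinates, giving approximately:

      x'ri = xri·cos θ + yri·sin θ
      y'ri = -xri·sin θ + yri·cos θ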
  • Figure 20 depicts how, in a more accurate exemplary model, the location of the various finger roots will be displaced to some extent from the palm line (which forms the palm angle) by various amounts δri, in accordance with one embodiment of the present invention.
  • Any of finger roots F2, F3 and F4 might be displaced somewhat from the palm line (1322) by a small amount, represented by δri, as depicted in Figure 20.
  • δri may be either positive or negative.
  • the system may also perform the inverse transformation using formula (1) to calculate the raw touchpad data root position coordinates (xri, yri) in the original touchpad coordinate system.
  • This latter technique is often especially useful for determining if any of the raw touchpad data might represent the thumb root location (xr0, yr0).
  • the raw thumb root touchpad data is often difficult to obtain because sometimes the thumb root does not touch the surface of the touchpad.
  • the system may make sense of the raw touchpad data by sorting the set of rotation transformed fingertip positions {(x'i, y'i)} and finger root positions {(x'ri, y'ri)} according to ascending (or descending) x value, and then attempting to pair the rotation transformed possible fingertip data (x'i, y'i) with the rotation transformed possible finger root data (x'ri, y'ri).
  • a unique touchID may be assigned for each continuous touch.
  • If a finger "i" previously touched the touchpad and was lifted later, one may use the touchpad history data obtained by the system at earlier time points (usually a fraction of a second earlier, i.e. time (t-1)) to determine the missing finger.
  • time data may also be used in another alternative approach, to be discussed shortly. For example, at time (t-1) (i.e. the previous history of stored touchpad data in a time indexed stack of such touchpad data), with fingers F0-F4 identified, one has a stored set of fingertip coordinates for each identified finger.
  • Suppose at the current time the system has a raw set of data for just three touch points from three fingers, such as fingers F0, F1, F2 (although this example uses and numbers fingers F0, F1, F2, other fingers Fi could alternatively be used).
  • In this case the raw data would consist of three unlabeled (x, y) touch points.
  • the history data may not be available.
  • one should determine the missing (e.g. elevated) fingers by one or more various alternate methods, such as the methods described below.
  • Figure 21 depicts how the simplified exemplary system may attempt to correlate detected fingertip data from some fingers with finger root data from other fingers, determine that some fingertip data is missing, and thus deduce that these fingers are elevated above the touchpad, in accordance with one embodiment of the present invention.
  • In Figure 21, the "missing fingers" are fingers F1 and F2, which are depicted with shading. A missing finger is a finger that might have been elevated too far above the touchpad to register a touch point imprint on the touchpad.
  • The system may operate as follows. First, from the touchpad data set, the system may calculate the palm angle θ corrected coordinates for the various fingertips and finger roots. Second, for each fingertip, the system may check whether the position of the fingertip is inside the range of likely finger root j positions using a width-based range test, where wj is the width of finger j.
  • If so, the system will attempt to match the fingertip "i" with root j.
  • the system may for example, even attempt to match potential fingertip locations from one finger with the potential finger root data from another finger.
  • the system may attempt to match the fingertip data (x'i, y'i) with the finger root position (x'r3, y'r3).
  • the range may be calculated as follows:
  • the system is also incorporating finger length (i.e. the length between the fingertip (x'i, y'i) and the finger root (x'ri, y'ri)) into its biomechanical and/or anatomical model of the human hand.
  • If no such match can be made, the system may mark that finger as missing (i.e. likely raised above the surface of the touchpad, rather than touching the touchpad); see, for example, fingers F1 and F2 in Figure 21.
  • The shading of fingers F1 and F2 shows that the system has recognized that the fingertips are missing, probably because the fingertips are elevated a sufficient distance above the touchpad.
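  • For illustration, the width-range and length-range matching described above might be sketched in Python as follows; the specific tests (half a finger width in x, and a minimum/maximum finger length) and all parameter names are assumptions rather than the exact patented formulas:

      # Illustrative matching of rotated fingertips to finger roots.
      def match_fingertips_to_roots(tips, roots, finger_widths, min_len, max_len):
          # tips / roots: dicts {finger_id: (x', y')} in palm-angle-corrected coordinates.
          # finger_widths, min_len, max_len: per-finger width and length bounds (assumed).
          matches, missing = {}, []
          for j, (xr, yr) in roots.items():
              best = None
              for i, (xt, yt) in tips.items():
                  if i in matches.values():
                      continue              # this fingertip is already paired
                  if abs(xt - xr) > 0.5 * finger_widths[j]:
                      continue              # width test: tip not roughly above root j
                  length = ((xt - xr) ** 2 + (yt - yr) ** 2) ** 0.5
                  if min_len[j] <= length <= max_len[j]:
                      best = i              # length test: plausible finger length
                      break
              if best is None:
                  missing.append(j)         # e.g. fingers F1, F2 lifted in Figure 21
              else:
                  matches[j] = best
          return matches, missing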
  • When, as will frequently be the case, the finger position data for a missing finger "i" is insufficient, but the missing finger "i" may be identified by using the previous algorithm or other methods, one may approximate the missing finger's position by assuming that, as per a normal biomechanical and/or anatomical hand, the change in x and y position of the missing finger's neighboring fingers (e.g. neighboring Δx and Δy) will also pertain to any change in location of the missing finger, as described in the following examples.
  • the current position for the missing finger "i" may be calculated as follows.
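  • The calculation itself is not reproduced in this text; presumably it takes roughly the following form: if a neighboring finger has moved by Δx and Δy since the last time the missing finger "i" was observed at (xi(t_last), yi(t_last)), then the missing finger's current position may be approximated as xi(t) ≈ xi(t_last) + Δx and yi(t) ≈ yi(t_last) + Δy.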
  • the system may use various algorithms to help with this decision.
  • the system may use the range information on the coordinates after rotating the data by palm angle θ, as is shown in Figure 21.
  • all touch points within the following range may be assumed to belong to (i.e. mapped into) one hand.
  • the criteria here may be:
  • the system may also use the touch angle information for touch points and palm line angles to help assign the raw touchpad data to one hand or the other.
  • the system may assume that both hands belong to the same individual, and essentially extend the biomechanical and/or anatomical model of the human hand to also put in some simplified human anatomical constraints regarding the relationships between the angles of one hand and the angles of the other hand.
  • Figure 22 depicts how the simplified exemplary system may further assign raw touchpad data to two different hands (left hand (2202) including FOL through F4L, and right hand (2204) including FOR through F4R) of the same user, based on the assumption that the range of possible hand angles for the same user is limited by the user's anatomy, in accordance with one embodiment of the present invention.
  • the touch angle of a touch point may also be determined along the long touch side, defined as follows. That is, usually a finger will touch in a roughly oval pattern with the long axis of the oval, i.e. the long touch side, corresponding to the touch angle α of a touch point.
  • the angle α between the touch point directions D4L, D2R and the associated respective palm line vertical directions (2220, 2230) will generally be in the range of [0, 90] degrees.
  • the palm line vertical direction (2220, 2230) is substantially perpendicular to associated palm lines left and right (1322, 2222) respectively.
  • palm line vertical direction (2220) may be associated with finger F4L through touch point (2224), and palm line vertical direction (2230) may be associated with finger F2R through touch point (2234).
  • the system may also partition the touchpad area (for example, split the touchpad area into a left half and a right half) and assign some or all of the touch pad data from the left half to the user's left hand, and assign some of all of the touch pad data from the right side of the touchpad to the user's right hand.
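  • A minimal Python sketch of this simple left/right partitioning approach is given below for illustration; the midline threshold is an assumption, and the range- and angle-based criteria described above could be layered on top of it:

      # Hypothetical left/right partitioning of touch points; the midline is an assumption.
      def assign_touches_to_hands(touch_points, touchpad_width):
          left_hand, right_hand = [], []
          midline = touchpad_width / 2.0
          for (x, y) in touch_points:
              (left_hand if x < midline else right_hand).append((x, y))
          return left_hand, right_hand

      # Example on a touchpad 20 units wide.
      print(assign_touches_to_hands([(3.0, 2.0), (15.5, 4.0)], 20.0))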
  • FIG. 23 depicts a first simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
  • Angle based methods may be used to match raw touchpad data with specific user fingers.
  • These alternative, angle-based, finger matching algorithms may be implemented as follows. First, perform a best fit between the touchpad data and the biomechanical and/or anatomical model of the user's hand, and second, use this best fit biomechanical and/or anatomical model of the user's hand to find a point substantially along a mid-line (2310) of middle finger F2.
  • The mid-line of middle finger F2 may pass through the center of the palm, e.g. palm center point (xc, yc) or other point on the palm (any palm center point (xcenter, ycenter) may be used so long as it is inside a region bounded by the five metacarpophalangeal joints (xri, yri)).
  • the coordinates of the palm center point may be calculated based on the finger model, and known finger positions.
  • The angle calculated by atan2 has a range within about -π to +π.
  • Figure 24 depicts a second simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
  • sort the angles αi in the same order as the associated touch points.
  • match the corresponding finger to the associated angle as per Figure 24.
  • the advantage of this approach is that one does not need to perform coordinate rotation to match the fingers. Instead, the atan2 calculations may be done by computationally faster methods, even by table lookup methods, as needed.
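  • The angle-based matching of Figures 23 - 24 might be sketched in Python as follows; the palm-center estimate and the assumption that the sorted angles correspond to the fingers in order are simplifications, not the exact patented procedure:

      # Illustrative angle-based finger matching; simplifications noted above.
      import math

      def match_fingers_by_angle(touch_points, palm_center,
                                 finger_ids=("F0", "F1", "F2", "F3", "F4")):
          # touch_points: list of raw (x, y) touches; palm_center: any point inside
          # the region bounded by the five metacarpophalangeal joints.
          xc, yc = palm_center
          # atan2 returns an angle in the range (-pi, +pi] for each touch point.
          angles = [(math.atan2(y - yc, x - xc), (x, y)) for (x, y) in touch_points]
          angles.sort(key=lambda item: item[0])
          # Pair the sorted angles with fingers in order; no coordinate rotation is needed.
          return {fid: point for fid, (_, point) in zip(finger_ids, angles)}

      # Example with three touches around an assumed palm center.
      print(match_fingers_by_angle([(2.0, 5.0), (0.0, 6.0), (4.0, 4.0)], (2.0, 1.0)))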
  • the hand and finger analysis software discussed above may operate by an iterative process.
  • the software may make tentative assignments between the raw touchpad data and one possible set of fingertip, finger root, or palm touch points on the previously discussed biomechanical and/or anatomical model of the human hand (here the user's hand), and score the results according to how close the raw touchpad data, either before or after various possible transformations, may fit with a known hand configuration.
  • the software may then explore other possible hand configurations and transformations, and then select or choose the hand configuration and/or transformation (e.g. rotations, translocations, missing fingers, and the like) that produces the highest overall score.
  • the software will then use the highest scoring hand configuration and orientation model for virtual hand display purposes.
  • the system may then use this information to make corresponding changes in its biomechanical and/ or anatomical model of the human hand.
  • the model may include calibration information associated with an image of at least a portion of the hand of the user.
  • Figure 14 depicts how the simplified exemplary user's hand or hands may be photographed by the device's camera or other camera, and this image information may be used to refine the default parameters of the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
  • A standardized background, such as a series of distance markings, grid, graph paper, and the like (1400), may be used in order to better calibrate the image of the hand and correct for image distortions.
  • This standardized background may additionally include various color, shades of gray, and resolution test targets as well.
  • the background may be conveniently provided, for example, by electronically supplying one or more background image sheets (e.g. a jpeg, png, pdf or other image file) for printing on the user's printer.
  • the user may put each hand on background (1400), and take a photo of the hand(s) (1402) with either the computerized device's camera or another camera. This image may then be analyzed, preferably by an image analysis program. The background image will help correct for any image distortions caused by different camera angles, and the like.
  • the user hand image analysis may be done onboard the user's handheld computerized device, but it need not be.
  • the user may upload one or more images of the hand taken by any imaging device to an external image analyzer, such as a remote internet server.
  • the image analyzer will analyze the user's skin or hand outline appearance (1404), deduce the most probable lengths of one or more bones of the user's hand, such as the user's various finger and thumb bones, and send this data or other data to correct the default biomechanical and/or anatomical model of the user's hand(s) back to the user's computerized device, such as for example during calibration step 906 referenced in Figure 9 above.
  • the user may calibrate the touchpad by firmly pressing a portion or all of the user's hand on the touchpad, and allowing a highly capable touchpad to in turn precisely render the resulting handprint.
  • A computer program may then analyze the touchpad-derived handprint, extract parameters such as finger joint positions, probable finger and hand bone lengths, and the like, and derive the same information as previously discussed for the photographic calibration step above.
  • the model includes calibration information in accordance with pressing a portion of the hand of the user on the touchpad.
  • information on the user's finger placement may be obtained using optical methods.
  • the touchpad sensor may be an optical method such as one or more cameras. These camera(s) may keep track of the user's hand and finger positions, and this data may then be fed into the biomechanical and/or anatomical model of the human hand(s) to compute a graphical representation of at least the user's fingers as described previously.
  • image information may also be used to refine the biomechanical and/or anatomical model of the user's hands in real time while the user is using the touchpad.
  • FIG. 15 depicts how an exemplary device camera (1500) may be used to obtain a partial image of the user's hand (1506) while using the device's touchpad (1508), and this information may also be used to update and refine the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
  • The rear mounted device camera (1500), which often will have a very limited field of view at close range (1502), may nonetheless be used to obtain a real time video image of a portion or part (1504) of the user's hand (1506) while the user is using a rear mounted touchpad (1508) mounted on the back of the computerized device (1510).
  • The touchpad data gives the position of the user's index finger (1512) as a strong touchpad signal, and the position of the user's middle finger (1514) as a weaker touchpad signal.
  • Although the portion of the hand (1504) that may be directly visualized by video camera (1500) does not include any image information at all pertaining to the position of the user's fingers, the image information (1504) does provide a useful series of further constraints upon the biomechanical and/or anatomical model of the user's hands.
  • the partial hand image information in conjunction with the touch pad data (1512), (1514), and optionally with a refined biomechanical and/or anatomical model of this user's hand (if available) obtained in Figure 14, above, may improve the accuracy of the depiction of the user's hand and fingers.
  • the user may not wish to have a fully accurate anatomical model of the user's virtual hand displayed on the screen, but may instead prefer a variant, such as a realistic depiction of a "monster hand" with fingers replaced by claws, fur, or pads, and the like, or of a skeleton hand that shows the underlying biomechanical and/or anatomical estimation of the user's hand bones as per Figure 13.
  • the system software may also be configured to render the user's fingers and hands as various hand variants when displayed.
  • these hand variants will still provide realistic information pertaining to the user's hand and finger placement, but will also provide this information as various user artistic options that often may be customized according to user preference.
  • Previous touchpad controls for a computerized system have focused on two dimensional finger gesture controls requiring finger contact on the locally two-dimensional touchpad surface, even if that surface as a whole may be curved or otherwise project into the third dimension to some extent.
  • the embodiments of the present invention, which may operate using a biomechanical and anatomical model of the human hand, may include a three dimensional gesture component that enables various types of three dimensional multi-touch gesture controls described below.
  • Three dimensional multi-touch gesture controls may be advantageous in applications where the user needs to touch a portion of the touchpad continually, such as, for example, when the user holds a handheld computerized device including a touchpad on the backside of the device.
  • the three dimensional multi-touch gesture controls may help the computerized system differentiate touches on touchpad control regions intended as control inputs from touchpad touches used to merely hold the device.
  • the three dimensional sensing aspects of the present invention may be used to control virtual keyboard data entry to a computerized system by various "lift and tap", or "lift and drag", or "lift and other gesture" type modes for data input. More complex variants can also implement other commands, such as "lift and tap and rotate, e.g. with two fingers", "lift and tap and enlarge, e.g. with two fingers", and so on.
  • the biomechanical and anatomical model of the user's hand may inform the system when one or more user fingers are positioned on the touchpad so as to be above a corresponding control region of the touchpad, such as above a key of a virtual keypad, virtual keyboard, or above a hyperlink, and/or the like, but not yet touching the corresponding control region. Because the model of the hand accurately determines the location of the one or more user fingers even when the user's finger is not touching the surface of the touchpad, the "off-touchpad" finger location may be used for three-dimensional gesture control.
  • the control region of the computerized system may be on a touchpad including an integrated display screen located in substantially the same location.
  • integrated touchpads may include both display screen and touchpad built in layers and accessible from the same surface or side of the computerized device and thus located substantially in the same location even though the layers may be separated by small dimensions relative to the touchpad surface length or width.
  • the control region of the computerized system may be on a separate and/or additional touchpad being located in a location that is different from the location of the display screen as previously described in reference to Figure 8.
  • the user moves a finger onto a control region of the touchpad, and this finger is in contact with the touchpad.
  • the computerized system may determine if the user wants to activate that control region, e.g. press the virtual key or control region to generate an input to the computerized system using, for example, a "lift and tap” type control scheme described as follows.
  • Figure 26 depicts a simplified exemplary flowchart of a "lift and tap” technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 26 depicts the "lift and tap” technique includes obtaining (2510) data from a touchpad, the data being associated with the location and movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad.
  • the "lift and tap” technique further includes communicating (2620) the data from the touchpad to the computerized device and analyzing (2630) the data in accordance with a model of a human hand, such as referenced in Figure 9 - Figure 25.
  • Figures 27A - 27F depict a series of simplified exemplary display screen shots of the "lift and tap” technique of key entry depicted in Figure 26 being used to type the first two letters of a "Hello World” message on the computerized system, in accordance with embodiments of the present invention.
  • Figures 27A - 27F were obtained as screen shots or grabs, in that respective order, from a few seconds of video showing the display screen of a prototype handheld computerized device while the user proceeded to type at touch-typing speed using the "lift and tap" technique referenced in Figure 26. The system has already assigned the touch data from the touchpad to at least one of the multitude of fingers of the model and computed a graphical representation of those fingers.
  • Hand (2701), including fingers (F1, F2, F3), is displayed clearly as a virtual, i.e. computer-generated, hand because the palm includes square edges, the fingers include straight sides, and the joints between the fingers and the palm are not continuous.
  • the system determines (2640), in accordance with the touchpad data and the model of the human hand, that at least one user finger (F1, F2, F3) is initially touching, or is in contact with, the region of the touchpad corresponding to a virtual key (2702, 2703, 2705, 2725) or other control region.
  • When the system first detects that a particular user finger initially touches or is in contact with a virtual key or other control region, the system may optionally generate a graphical representation associated with the control region being touched on display screen (2700) of the computerized system.
  • Figures 27A - 27F further depict that the system is generating and displaying on display screen (2700) a graphical representation of a virtual keyboard including a multitude of virtual keys, including virtual keys (2702, 2703, 2705, 2725), corresponding to control regions on the touchpad.
  • the system may then change the appearance of the graphical representation of the touched virtual key or other control region to show or indicate that the control region is being initially touched, thus providing confirmative feedback to the user.
  • the change of the display image of the touched virtual key may be shown as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like.
  • Figure 27A depicts virtual keys (2702, 2703, 2705), which are being initially touched by respective user fingers (F1, F2, F3), temporarily displayed with a slightly larger size and a slightly bolder upper border than the remaining untouched keys.
  • The user may next lift the at least one user finger, e.g. graphically represented by finger F2 in Figure 27B and finger F3 in Figure 27E, and the system determines if the now lifted finger (no longer in touch contact) is most likely positioned above the same previously touched virtual key or other control region on the touchpad.
  • the system will use the biomechanical and anatomical model of the user's hand to make the above determination.
  • the system determines (2650), using the model, that at least one finger of the user, e.g. graphically represented by finger F2 in Figure 27B and finger F3 in Figure 27E, is positioned above but not touching the control region of the touchpad, e.g. the region corresponding to virtual key "H" in Figure 27B and virtual key "E" in Figure 27E.
  • the model may be used to determine, for example, that although the touchpad may no longer directly sense that the user's finger is in contact with that particular virtual key or other control region, nonetheless the finger is positioned directly above the control region, in accordance with touch data from other regions of the user's hand and/or fingers and the constraints of the model of the human hand.
  • step 2640 described earlier may be an optional step in some embodiments, because the model may determine the fingertip locations even with the at least one finger not initially in contact with the touchpad but hovering over the control region, using the constraints of the model of the human hand.
  • the system may temporarily, for example for a first time interval between 0.05 and 5 or even 10 seconds, change the appearance of the graphical representation of the virtual key or other control region that the at least one finger is hovering over.
  • the system may display the graphical representation of the control region with a different appearance than the prior appearance of the control region.
  • the difference in the graphical representation may be an enlargement of the represented virtual key or other control region, as shown at (2720) and (2730), or the difference may be another visual change such as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like.
  • the system may instead produce or generate a sound signal audible to the user instead of, or in addition to, the changed appearance of the graphical representation of the control region when the at least one finger of the user is positioned above but not touching the control region.
  • the graphical representation of virtual keys (2702, 2705) remains displayed as in Figure 27A because user fingers (F1, F3) continue to touch the touchpad in Figure 27B.
  • The user may then lower the at least one finger (e.g. graphically represented by F2 in Figure 27C, and F3 in Figure 27F) back onto the touchpad in that region of the touchpad that corresponds to the control region (e.g. graphically represented by virtual key "H" in Figure 27C, and virtual key "E" in Figure 27F).
  • the user's finger lowering may optionally be required to happen within a certain time interval or "first period of time" (usually on the order of between 0.01 seconds and 5 or even 10 seconds) after the system optionally changes the appearance of that particular key or other control region at the start of step 2650.
  • the system may then verify, using the biomechanical and anatomical model of the user's hand, that the user's at least one finger has now "struck" or "tapped" the particular virtual key or other control region. In other words, the system determines (2660) that the at least one finger is subsequently touching the control region in accordance with the data and the model. In one embodiment, the system may record, or register, that the appropriate virtual key has been pressed by storing a record of that action in memory. In Figures 27A - 27C the user is inputting a command to the computerized system to type the letter "H".
  • the system recognizes by the lift and tap action of user finger F2 that the user is commanding the system to type the letter "H”, and the system generates and displays a corresponding letter "H” (2722) on display screen (2700) to confirm the execution of the command.
  • the user is inputting a command to the computerized system to type the letter "E”.
  • the system recognizes by the lift and tap action of user finger F3 that the user intends to command the system to type the letter "E”, and the system generates and displays a corresponding letter "E” (2732) to confirm the execution of the command.
  • the system may optionally change the displayed appearance of the graphical representation of the struck or tapped virtual key or other control region, often back to either its original appearance (as depicted in Figure 27C and Figure 27F) or an optional different altered appearance (e.g. a "key struck" appearance) to visually confirm the control region is touched or struck.
  • the altered appearance may include visual change such as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like.
  • the optional different altered appearance may be displayed for a short second period of time or time interval, often in the 0.05 to 5 second range, but may extend longer, such as up to 10 seconds.
  • the system may also generate an auditory signal that the user's actions have resulted in the pressing of a particular virtual key or other control region.
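The following is a minimal, hypothetical sketch of the "lift and tap" sequence described above, assuming a hand model that reports an estimated position for a lifted finger and a key-layout lookup; the class name, callbacks, and the particular "first period of time" are illustrative assumptions, not part of the disclosed system.

```python
import time

# Hypothetical sketch only: key_at_position and on_key_press are assumed helpers.
FIRST_PERIOD_OF_TIME = 5.0   # seconds; the text allows roughly 0.01 s up to 10 s

class LiftAndTapRecognizer:
    def __init__(self, key_at_position, on_key_press):
        self.key_at_position = key_at_position   # (x, y) -> key name or None
        self.on_key_press = on_key_press         # called when a tap is registered
        self.lifted = {}                         # finger_id -> (key, lift_time)

    def finger_lifted(self, finger_id, estimated_xy):
        """Called when the hand model reports a finger hovering above the pad."""
        key = self.key_at_position(estimated_xy)
        if key is not None:
            # The display would highlight/enlarge this key here (step 2650).
            self.lifted[finger_id] = (key, time.monotonic())

    def finger_touched(self, finger_id, touch_xy):
        """Called when touchpad data shows the finger contacting the pad again."""
        pending = self.lifted.pop(finger_id, None)
        if pending is None:
            return
        key, lift_time = pending
        within_window = time.monotonic() - lift_time <= FIRST_PERIOD_OF_TIME
        if within_window and self.key_at_position(touch_xy) == key:
            self.on_key_press(key)   # store the keystroke and display the letter
```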
  • Figure 28 depicts a simplified exemplary flowchart of a "lift and drag" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • the above embodiments may be extended to other input techniques, such as "lift and drag".
  • the control region may include an elongated control region with a length substantially greater than the longitudinal length of the surface region of the user's at least one finger when contacting the touchpad.
  • Such an elongated control region may include a virtual slider (e.g. a virtual linear control region), a virtual rectangular or circular rotation control region (e.g. a virtual control knob), a virtual expansion-contraction control region, and the like.
  • the control region may include a multitude of file names, images, icons, and the like, so that the user may drag these file names, images, icons and the like via the drag technique to, for example, execute a move and/or copy command on the corresponding files or virtual objects.
  • the user may move or slide the at least one finger on the elongated control region of the touchpad.
  • the system may then verify, using the biomechanical and anatomical model of the user's hand, that the user is moving or sliding the at least one finger over the elongated virtual key or other control region. In other words, the system determines (2810) that the at least one finger is subsequently touching the first control region in accordance with the data and the model.
  • the system may then store (2820) in memory a record of the moving or sliding of the at least one finger (e.g. register that, for example, a slider has been moved, and the like), and then optionally change the appearance of the elongated virtual key or other control region, such as file names, images, icons and the like, to visually confirm the command action was executed (e.g. move a knob on a slider control).
  • the system may also give an auditory signal that the user's actions have resulted in the actuating of the drag command and associated result.
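As a rough illustration of the "lift and drag" handling of an elongated control region, the sketch below maps a finger's position along an assumed slider region to a normalized value; the region bounds and the callback are hypothetical names, not part of the disclosure.

```python
# Hypothetical sketch: x_start, x_end, and on_value_change are assumptions.
class VirtualSlider:
    def __init__(self, x_start, x_end, on_value_change):
        self.x_start, self.x_end = x_start, x_end
        self.on_value_change = on_value_change

    def finger_moved(self, touch_x):
        """Map the finger's position along the elongated region to a 0.0-1.0 value."""
        value = (touch_x - self.x_start) / (self.x_end - self.x_start)
        value = max(0.0, min(1.0, value))   # clamp to the control's range
        self.on_value_change(value)         # store the move and redraw the knob
```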
  • the user may lift two or more fingers.
  • the system may determine, using the model, that a multitude of fingers of the user are positioned above but not touching the control region of the touchpad. The user may then lower the two or more fingers to the touchpad. In other words, the system may determine that the multitude of fingers are subsequently touching the control region in accordance with the data and the model.
  • the system may determine a motion of a first finger in relation to a motion of a second finger different than the first finger and assign a command to control the computerized system in accordance with the determined motion.
  • the user may either move the two fingers further apart on the touchpad to change the displayed image, e.g. magnify or zoom-in on a displayed image, move the two fingers closer together to zoom-out, or rotate the fingers around a rotation point intermediate between the two fingers to rotate an image.
  • the system may be configured to do corresponding display screen operations under touchpad control where the image on the display screen expands or contracts, or rotates according to the rotation direction of the fingers when the relative motions of the two fingers are assigned to the respective commands.
  • the system may not require the use of a virtual key or other control regions. Instead, the system may operate as if the entire screen is a control region that may be subject to zoom-in, zoom-out, and/or rotation controlled as described above.
  • Most existing two-dimensional multi-touch gestures may be similarly extended or modified into corresponding three-dimensional counterparts that incorporate the finger lift gesture component described above.
  • Examples of existing multi-touch gestures that may be modified for additional finger lift functionality include various Apple OS X gestures, such as, but not limited to: swipe behind full-screen apps, two-finger scroll, tap to zoom, pinch to zoom, swipe to navigate, open launch pad, show desktop, look up, app expose, rotate, three-finger drag, tap to click, secondary click, notification center, and show web browser tabs.
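The two-finger zoom and rotate behavior described a few bullets above can be derived from the relative motion of two touch points; the sketch below shows one conventional way to compute a scale factor and rotation angle and is not taken from the disclosure itself.

```python
import math

def two_finger_gesture(p1_old, p2_old, p1_new, p2_new):
    """Return (scale_factor, rotation_radians) from two successive touch-point pairs."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = dist(p1_new, p2_new) / max(dist(p1_old, p2_old), 1e-6)
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation

# Example: fingers moving apart while turning slightly counter-clockwise.
scale, rot = two_finger_gesture((10, 10), (20, 10), (8, 10), (24, 12))
# scale > 1.0 -> zoom in; rot > 0 -> rotate the displayed image counter-clockwise
```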
  • the embodiments of the present invention may be extended or modified for use with touchpads that are capable of sensing the force exerted by a finger in contact with or touching the touchpad.
  • a force sensing touchpad embodying the present invention may use a touch sensing device commercially available from Synaptics Inc., San Jose, California and commonly known as the ForcePad™ family of products.
  • With a force-sensing touchpad, the touchpad not only may determine a finger touch location on the surface of the touchpad, but also may sense and determine how much force per finger is being applied to the surface of the touchpad.
  • the dimension of force per finger in the touchpad data may be used by the system instead of or in addition to sensing when a finger of the user is lifted off the surface of the touchpad.
  • Figure 29 depicts a simplified exemplary flow chart of a "lift and tap" technique of key entry modified to use force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 29 includes the same features as Figure 26 with the following exceptions.
  • the system obtains (2910) data from the touchpad; the data may be additionally associated with the force of a finger and/or hand of the user upon the touchpad.
  • the system may display a graphical representation of at least one touch point associated with the force per finger (2743, 2750, 2753, 2765, 2770, 2775) applied upon the surface of the touchpad by the user at the corresponding location on the touchpad where the fingers touch the surface of the touchpad.
  • the graphical representation of the touch point may be depicted as a solid circle including a diameter associated with the amount of force per finger applied upon the surface of the touchpad.
  • shapes other than a solid circle may be used and/or other display attributes than size may be associated with the force per finger.
  • the force per finger may be associated with at least one of a size, a color, a position, a shape, or a display type depicted on the display screen.
  • the amount of force per finger may be associated with a flashing type display where the rate of flashing may be associated with the amount of force.
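One simple way to realize the force-dependent touch-point graphics described above is a linear mapping from sensed force to circle diameter; the full-scale force and pixel sizes below are assumptions, not values from the disclosure.

```python
MAX_FORCE_N = 2.0        # assumed full-scale force reported by the touchpad
MIN_DIAMETER_PX = 6      # assumed size of the lightest visible touch point
MAX_DIAMETER_PX = 40     # assumed size near the pitch between adjacent virtual keys

def touch_point_diameter(force_newtons):
    """Larger force per finger -> larger solid circle drawn at the touch point."""
    fraction = max(0.0, min(1.0, force_newtons / MAX_FORCE_N))
    return MIN_DIAMETER_PX + fraction * (MAX_DIAMETER_PX - MIN_DIAMETER_PX)
```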
  • the system may optionally determine (2940) that at least one finger is initially touching the control region using force within a range "A" in accordance with the data and the model.
  • force range A may include a range of force per finger above about 0.5 newton or about 50 gram weight equivalent units (on the surface of the Earth), corresponding to when the user touches the surface of the touchpad by pressing firmly.
  • the system may optionally display the graphical representation of the at least one touch point associated with the force per finger (2705, 2765) corresponding to force range A depicted as a solid circle having a relatively large diameter close to the pitch between adjacent virtual control surfaces, i.e. virtual keyboard keys, and/or close to the width of user fingers (F2, F3), respectively.
  • an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is initially touching the first control region using force range A.
  • a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is initially touching the first control region using force range A.
  • the computerized system may generate a haptic feedback response from the computerized system to the user when the at least one finger of the user is touching the first control region using the first force.
  • the mechanical response may include a haptic response such as mechanically shaking or vibrating a portion of the computerized system using a mechanical actuator such that the touchpad is shaken by being mechanically coupled to the shaken portion of the computerized system.
  • the shaken touchpad may provide haptic feedback to the user's fingers, in any combination with the visual (i.e. graphical representation) and/or audible feedback to the user, indicating the user's action has been registered by the computerized system.
  • a different portion of the computerized system than the touchpad may be mechanically shaken or vibrated, such as a portion of the computerized system in mechanical contact with the user, e.g. a wearable device, or a device supporting the user such as a chair, seat, backrest, elbow rest, and/or the like.
  • Haptic feedback may be useful when audible feedback is undesirable or ineffective, such as in an audibly noisy environment.
  • the system may determine (2950), using the model, that at least one finger of the user is touching a control region of the touchpad using a force range "B" different than force range A.
  • a force per finger from force range A may be greater than a force per finger from force range B.
  • force range B may include a range of force per finger between zero newton and about 0.5 newton.
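A minimal sketch of the force-range classification used in this embodiment follows; the 0.5 newton boundary comes from the text, while treating exactly zero force as a fully lifted finger is an added assumption.

```python
FORCE_THRESHOLD_N = 0.5   # boundary between ranges A and B (~50 gram weight)

def classify_force(force_newtons):
    """Assign a per-finger force reading to range A (firm), range B (light), or lifted."""
    if force_newtons <= 0.0:
        return "lifted"
    if force_newtons < FORCE_THRESHOLD_N:
        return "B"
    return "A"
```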
  • the system may change the display of the graphical representation of the at least one touch point associated with the force per finger (2750, 2770) to a different size, color, position, shape, or display type.
  • the touch point associated with the force per finger (2750, 2770) may be depicted on the display screen as a smaller solid circle corresponding to a lighter touch of the user's finger than when the finger was initially touching the control region in Figure 27A and Figure 27D.
  • an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is touching the control region using force range B.
  • a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is touching the control region using force range B.
  • the computerized system may generate a haptic feedback response from the computerized system to the user when the user's finger is touching the control region using force range B.
  • the finger is still in contact with the surface of the touchpad but with a lighter touch than the initial touch.
  • the user's lighter touch action may be interpreted by the system in similar fashion to the earlier embodiments which described the user's finger being lifted completely off the touchpad surface, i.e. force equal to zero newtons; the lift in the present embodiment includes lifting the finger merely to reduce the force exerted by the finger, without completely lifting the finger off the surface of the touchpad.
  • the lighter force lift technique may take less computational resources, provide faster system speed, and/or provide better reliability than when the finger is lifted completely off the touchpad surface, because the lighter force touch-point location of the user's finger is directly available without having to estimate the position of a finger lifted completely off the touchpad surface.
  • the system may determine that the at least one finger is subsequently touching the control region using force range A in accordance with the data and the model.
  • the system may change the display of the graphical representation of the at least one touch point associated with the force per finger (2753, 2775) to a different size, color, position, shape, or display type.
  • the at least one touch point associated with the force per finger (2753, 2775) may be depicted on the display screen as a larger solid circle corresponding once again to a more forceful touch of the user's finger in force range A similar to that when the finger was initially touching the control region in Figure 27A and Figure 27D.
  • Figure 30 depicts a simplified exemplary flowchart of a modified "lift and tap" technique of key entry modified to use a third force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 30 depicts the same features as Figure 29, except Figure 30 depicts the system may determine that the at least one finger is subsequently touching (3060) the control region using a force range "C" in accordance with the data and the model.
  • the third force applied per finger may be within a range of force per finger from force range C.
  • Force range C may include a range of force per finger that is different than both force range A and force range B.
  • force range C may include a range of force per finger that is greater than force range A, which in turn may be greater than the range of force per finger from force range B.
  • the computerized system may respond to a touchpad input sequence from a finger of a user that includes a medium force range A, followed by a small force range B, followed by a large force range C, similar to what the user may use when typing on mechanically actuated keys. It is understood that other combinations of the three force ranges may be used in sequence to provide an input to the computerized system.
  • an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is subsequently touching the first control region using the force range A.
  • a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is subsequently touching the first control region using the force range A.
  • the computerized system may generate a haptic feedback response from the computerized system to the user when the user's finger is subsequently touching the first control region using the force range A.
  • the sequence of steps for a user's finger actuating a command area on the touchpad in the embodiments above included the system responding to an optionally stronger force range, followed by a weaker force range, followed by a stronger force range, in that order.
  • the inverse sequence of force by the user's finger may be used where the system responds to a weak force range (optionally) applied by the user's finger, followed by a stronger force range, followed by a weaker force range, in that order.
  • the system may recognize and respond to any sequence of a first force range followed by a second force range that is different from the first force range applied by the user's finger to actuate a command area on the touchpad.
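The sketch below illustrates recognizing a key actuation from a per-finger sequence of force ranges (here a lighter range B followed by a firmer range A, mirroring the force-based "lift and tap"); the threshold, class name, and callback are assumptions.

```python
class ForceSequenceRecognizer:
    THRESHOLD_N = 0.5    # assumed boundary between lighter range B and firmer range A

    def __init__(self, on_key_press):
        self.on_key_press = on_key_press
        self.history = {}            # finger_id -> last observed range ("A" or "B")

    def update(self, finger_id, key, force_newtons):
        current = "A" if force_newtons >= self.THRESHOLD_N else "B"
        previous = self.history.get(finger_id)
        # Register the key when a firmer press follows a lighter touch (B -> A);
        # the inverse order, or any other pair of distinct ranges, could be
        # recognized in the same way.
        if previous == "B" and current == "A":
            self.on_key_press(key)
        self.history[finger_id] = current
```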
  • the embodiments of the present invention may be extended or modified for use with not only force-sensing touchpads that may directly determine the force exerted by a finger in contact with or touching the touchpad, but also with capacitive sensing touchpads.
  • a capacitive sensing touchpad may indirectly determine the force using a contact area included in the data from the touchpad.
  • force-sensing touchpads directly determine the force without using a contact area included in the data from the touchpad.
  • Figures 31A - 31B respectively depict simplified exemplary side and top views of a portion of the touchpad (3110) using the contact area (3120) resulting from a first force FA, in accordance with one embodiment of the present invention.
  • Figure 31A depicts a user's finger (3130) pressing on the touchpad with a force FA using force range A, in accordance with one embodiment of the present invention as described above.
  • Force range A deforms the soft tissue of user's finger (3130) between the user's bone and the touchpad surface, which is more rigid than the soft tissue, forming a contact area (3120) at the touch point on the touchpad.
  • Figures 32A - 32B respectively depict simplified exemplary side and top views of a portion of the touchpad (3110) using the contact area (3220) resulting from a second force FB, in accordance with one embodiment of the present invention.
  • Figure 32A depicts the user's finger (3130) pressing on the touchpad with a force FB using force range B, which is different than force FA, in accordance with one embodiment of the present invention as described above.
  • force range B may be less than force range A.
  • force FB deforms the soft tissue of the user's finger (3130) between the user's bone and the touchpad surface to a lesser extent than when force FA is applied, forming a contact area (3220) at the touch point on the touchpad that has a smaller area than contact area (3120), as depicted respectively in Figures 32B and 31B.
  • the system may use the contact area data from the touchpad to then indirectly calculate or determine the force range applied by the finger and/or hand against the touchpad.
  • using the contact area information requires the soft and/or resilient tissue of the hand to be in contact with the touchpad, since the touchpad does not supply the force data directly; this, for example, precludes the use of a rigid stylus to enter the touchpad data instead of a user's hand.
  • the system may then use the calculated force information in the same embodiments described in reference to Figures 27A - 30.
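For capacitive touchpads, the force range can be inferred from contact area as described for Figures 31 - 32; the linear model and calibration constants below are assumptions, and a real device would likely need per-user calibration since fingertip stiffness varies.

```python
AREA_AT_LIGHT_TOUCH_MM2 = 40.0   # assumed contact area near zero applied force
MM2_PER_NEWTON = 30.0            # assumed growth of contact area per newton of force

def estimate_force_from_area(contact_area_mm2):
    """Estimate force (newtons) from the finger's reported contact area."""
    extra_area = max(0.0, contact_area_mm2 - AREA_AT_LIGHT_TOUCH_MM2)
    return extra_area / MM2_PER_NEWTON
```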
  • the user may designate a portion of the touchpad surface area as being reserved for non-control purposes, e.g. "holding" purposes, hereinafter also referred to as a "non-control" region of the touchpad.
  • the system enables the user to designate or lock out a portion of the touchpad temporarily as a non-control region for holding the handheld computerized device without controlling an input when the user touches the non-control region.
  • the system enables the user to designate some or all of a touchpad as being at least temporarily a non-control region or off limits from a device control perspective by including an actuating button - either real or virtual.
  • certain user hand gestures such as a swipe border gesture followed by a swipe "x" gesture within the border, may be assigned and recognized by the system as temporarily turning off touch control within the portion of the touchpad covered by the border and the swiped "x". The user may then safely hold the handheld computerized device or other device by the non-control region of the touchpad.
  • the user may then, in one embodiment, actuate a corresponding "restore” (real or virtual) button, or implement an appropriate "restore control” gesture or set of gestures designated to execute the restore control command.
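A minimal sketch of the "non-control" (holding) region idea follows: touches inside a temporarily locked-out rectangle are ignored until a restore command is issued. The rectangle representation and method names are illustrative assumptions.

```python
class NonControlRegion:
    def __init__(self):
        self.locked_rects = []            # list of (x0, y0, x1, y1) in touchpad units

    def lock(self, x0, y0, x1, y1):
        """Reserve a rectangle of the touchpad for holding the device."""
        self.locked_rects.append((x0, y0, x1, y1))

    def restore(self):
        """The 'restore control' button or gesture clears all locked-out regions."""
        self.locked_rects.clear()

    def is_control_touch(self, x, y):
        """False if the touch lands in a region reserved for holding the device."""
        return not any(x0 <= x <= x1 and y0 <= y <= y1
                       for (x0, y0, x1, y1) in self.locked_rects)
```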
  • Figure 33 depicts a simplified exemplary flowchart of a "push and lift" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 33 has the same features as Figure 26 with the following exceptions.
  • Figure 33 depicts that the computerized system first obtains (3310) data from the touchpad.
  • the data may be associated with the location and timed movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad.
  • the computerized system determines that at least one finger is initially touching a first control region on the touchpad in accordance with the data and the model.
  • the user may initially slide, touch, or tap one or more fingers over a key area or control region on the touchpad.
  • the system may use the biomechanical and anatomical model of the human hand to recognize this sliding, touching, or tapping action by the user's finger on the touchpad.
  • the computerized system may respond by, in turn, changing the visual appearance of the key or control area on the display as described above. Then, the computerized system determines (3350), using the model, that at least one finger of the user is positioned above but not touching a first control region of the touchpad during a predetermined time window. In other words, the key or control region on the touchpad becomes activated or actuated when the system determines, again using the biomechanical and anatomical model of the human hand, that the user has then subsequently lifted their fingers or other portion of their hand from the key or control region of the touchpad.
  • Figure 34 depicts a simplified exemplary time-line of the "push and lift" technique depicted in Figure 33, in accordance with one embodiment of the present invention.
  • the computerized system may use a predefined delay time Td and a predefined time window Tw to provide the system a technique to distinguish between deliberate user control actions, and random signals that might be caused by the user simply holding the device by the input surface area of the touchpad without intending to send control signals.
  • the computerized system may be configured so that when the system determines (3340) the user initially slides, touches, or taps their fingers or other portions of the hand on a key or control region at time To, there is then only a limited predefined time window, Tw, during which the system may determine (3350) that a subsequent lifting of the user's fingers or other portion of their hand is considered to be a deliberate control signal, so as to store a record of the lifting of the user's fingers in memory as in the previously described embodiments.
  • time window Tw may initially commence or open at a first delay time Td after the system determines (3340) the initial sliding, touching, or tapping motion of the user's finger or hand over the key or control region on the touchpad at time To. Once open, time window Tw may then remain open until a time equal to To + Td + Tw, when the time window then closes. In other words, the predetermined time window closes at the sum of the first delay time and window duration time Tw.
  • once time window Tw closes, i.e. after time To + Td + Tw, if the system determines a finger of the user is positioned above but not touching the first control region of the touchpad, then the computerized system may not consider such a "finger lift" as a deliberate control signal and may not store a record of such a finger lift.
  • the time duration for delay time Td may be between about 0 and 1 seconds after sliding was first detected at To, or in other words, predetermined time window Tw commences after determining that the at least one finger is initially touching the first control region, which is at To. In one embodiment, predetermined time window Tw commences at delay time Td after determining that the at least one finger is initially touching the first control region. In one embodiment, delay time Td includes a range of time equal to or less than 1 second. In one embodiment, the time duration for window duration time Tw may include a range of time between about 0.25 second to 30 seconds.
  • other values of delay time Td and window duration time Tw may also be possible, so long as Td and Tw are chosen so as to enable the computerized system to differentiate between finger lifts intended as inputs versus finger lifts resulting in unintended inputs, such as, for example, during finger repositioning merely to better grip or support the touchpad.
  • determining (3350), using the model and the predetermined time window Tw, enables the computerized system to differentiate between an intended input versus an unintended input by the at least one finger, when the at least one finger is positioned above but not touching the first control region of the touchpad and when predetermined time window Tw is not open, such as before time To + Td and/or after time To + Td + Tw.
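The delay/window timing of Figures 33 - 34 reduces to a simple interval test; in the sketch below the particular Td and Tw values are assumptions chosen from within the ranges given in the text.

```python
T_D = 0.2   # assumed delay after the initial touch at T0 before the window opens (s)
T_W = 2.0   # assumed duration the window stays open (s)

def lift_is_deliberate(t0, lift_time, td=T_D, tw=T_W):
    """A finger lift counts as input only while the window [T0+Td, T0+Td+Tw] is open."""
    return (t0 + td) <= lift_time <= (t0 + td + tw)
```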
  • Figure 35 depicts a simplified exemplary flowchart of a "hover and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
  • Figure 35 depicts features similar to Figure 26 with the following exceptions.
  • the computerized system obtains (3510) data from the touchpad.
  • the data may be associated with the location and movement of a multitude of fingers and/or a hand of the user. But the data is not associated with an image of a first finger of the multitude of fingers from an image sensor, when the user operates the computerized system using the touchpad.
  • Figures 36A - 36F depict a series of simplified exemplary display screen shots of the "hover and tap" technique of key entry depicted in Figure 35 being used to type and enter the numeral "0" on a prototype computerized system, in accordance with embodiments of the present invention.
  • the computerized system may optionally determine (3540) that first finger Fl is initially touching a control region on the touchpad, in accordance with the data and the model.
  • the system displays a graphical representation of a virtual touch-point (3610) of finger Fl, touching between virtual keys "9" and "0" (3620, 3630), respectively.
  • the computerized system may generate a special graphical representation of the control region when first finger Fl is initially touching a control region on the touchpad.
  • the computerized system may determine (3550), using the model and the data from the touchpad, that finger Fl of the user is positioned above but not touching, hereinafter also referred to as "hovering", a control region such as virtual key "9" of the touchpad, which in one embodiment may in turn cause the system to display the control region or virtual key "9” in a highlighted fashion as described previously.
  • the biomechanical and anatomical model of the human hand may be used to determine when a user's finger is likely hovering above a key or other control area, which when detected may cause the computerized system to highlight the graphical representation of the displayed control area for virtual key "9" (3625).
  • the computerized system may predict (3545) the location of finger Fl in accordance with the analyzed data and the location of at least another finger of the multitude of fingers, such as for example finger F2 different than finger Fl and/or hand (2701).
  • the touch-point data from finger F2 and/or hand (2701), which is touching the touchpad, is used with the biomechanical and anatomical model of the human hand by the computerized system to predict the location of finger Fl even when direct real-time finger Fl touch-point location data is absent because finger Fl is hovering above the touchpad without making touch contact with the touchpad.
  • the prediction (3545) step may precede the determining when a finger hovers step (3550), while in alternative embodiments the prediction (3545) step may come after the determining when a finger hovers step (3550).
  • the changed positions of the touch-points of fingers F2 and F3 are used to predict the new real time position of still hovering finger Fl .
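The sketch below is a heavily simplified stand-in for the biomechanical and anatomical hand model used to predict a hovering finger's location: it assumes the lifted finger keeps a roughly constant offset from the centroid of the fingers still touching the pad, which is only a crude approximation of the model described in the text.

```python
def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

class HoverPredictor:
    def __init__(self):
        self.offset = None     # lifted finger's last position relative to the centroid

    def finger_lifted(self, last_touch_xy, touching_points):
        """Record the lifted finger's offset from the still-touching fingers."""
        cx, cy = centroid(touching_points)
        self.offset = (last_touch_xy[0] - cx, last_touch_xy[1] - cy)

    def predict(self, touching_points):
        """Estimate where the lifted finger hovers as the other touch points move."""
        cx, cy = centroid(touching_points)
        return cx + self.offset[0], cy + self.offset[1]
```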
  • the computerized system in transition briefly enlarges or highlights the displayed graphical representation of both the previous key "9” (3640) and key "0" (3645).
  • the computerized system determines (3560) that finger Fl is subsequently touching the control region key "0" on the touchpad in accordance with the data and the model.
  • the user is now actuating the desired key "0” by touching the corresponding "key 0" region of the touch pad.
  • the system is indicating a virtual key "strike" by altering the appearance of the graphical representation as an enlarged or highlighted region of virtual key "0" (3665), and additionally the system now registers or stores in memory and displays the number "0" (3670) on the top of the screen.
  • the user actuating a key or other control area by hovering and then pressing or tapping a key or control area is, from a human factors standpoint, easy for the user to learn and use because the user need merely tap his finger on a hover-highlighted key or control area in order to then actuate the corresponding virtual key or control area.
  • One advantage of this approach is that the user does not have to do a separate tap and lift motion to inform the system about his interest in actuating a particular key. Rather, when the user's finger hovers above a given virtual key, but does not actually touch it, the system will enlarge or otherwise highlight the key by predicting the location of the user's hovering finger without having to first touch the touchpad at the desired control region location.
  • the hovering finger can more easily "hover-slide" onto a particular virtual key of interest, as depicted by the user both hovering and sliding between the key "9" and the key "0" locations in Figures 36B - 36D.
  • the user may experience a more natural keyboard typing experience, although the motions and positions of the fingers are otherwise different since the touchpad may be obscured from direct user view.
  • the embodiments described in reference to Figures 35 - 36F may be combined in any combination with the embodiments referenced in Figures 1 - 34 described previously.
  • the position and orientation of the virtual keyboard can be set to track the position and orientation of a portion of the user's hand and/or finger as the user operates the touchpad.
  • the virtual keyboard generation software may optionally still perform a certain amount of time averaging of the position and orientation of the portion of the user's hand and fingers; this time averaging may be relatively short, on the order of less than a tenth of a second to a few seconds, or may not be performed at all, in some examples. Making the time averaging either very short or zero enables the system to essentially annotate the image of the moving virtual hand and fingers with corresponding virtual keys displayed in a display screen of the computerized system. In an embodiment, and as will be discussed in detail below, these keys may be assigned to the different fingers of the user in accordance with a standard QWERTY format, or other keyboard formats.
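One plausible way to implement the short (or zero) time averaging mentioned above is an exponential moving average of the tracked hand position and orientation; the smoothing constant is an assumption, and angle wrap-around is ignored in this sketch.

```python
class PositionSmoother:
    def __init__(self, alpha=0.6):
        self.alpha = alpha        # 0 < alpha <= 1; alpha = 1.0 disables averaging
        self.state = None         # last smoothed (x, y, orientation)

    def update(self, x, y, orientation):
        """Blend the new sample with the previous smoothed state."""
        if self.state is None:
            self.state = (x, y, orientation)
        else:
            a = self.alpha
            sx, sy, so = self.state
            self.state = (a * x + (1 - a) * sx,
                          a * y + (1 - a) * sy,
                          a * orientation + (1 - a) * so)
        return self.state
```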
  • a portion of the virtual hand and/or the virtual finger may be within immediate striking range of the various virtual keys, thus enabling the user to start typing immediately regardless of the orientation of the user's hand relative to the touchpad or the device display screen.
  • the computerized system may assign touch data pertaining to a position of a portion of the hand of the user when the user operates the touchpad to at least one of a multitude of fingers of the model of the human hand, and compute a graphical representation of the user's fingers in accordance with the model.
  • the computerized system may be configured to identify a set of virtual keys to be associated with each of the user's virtual (i.e., graphical representation of) fingers.
  • the assignment of the set of virtual keys to a user's virtual finger may be performed in various ways.
  • each virtual finger may be associated with a pre-defined set of keys.
  • each virtual finger may be associated with a group of keys that the user is familiar with, such as based on a QWERTY keyboard configuration.
  • the left hand index finger may be associated with the keys 'R', 'F', 'V', 'T', 'G', 'B', which may be the same set of keys that this finger would be expected to strike if the finger were operating a real QWERTY keyboard.
  • the right hand middle finger may be associated with the keys 'I', 'K', and ',' and other QWERTY keys that the right hand middle finger would be expected to strike if this finger were operating a real QWERTY keyboard. While the above discussion relates to mapping keys arranged in a QWERTY configuration to the different fingers of the user, it should be appreciated that other finger-key mapping schemes may also be implemented, in other embodiments. For example, other methods to assign keys to specific fingers of the user's virtual hand may be used, such as numeric keypad schemes, alternative non-QWERTY keyboard schemes, such as Dvorak, Colemak, as well as various variants of the QWERTY keyboard technique.
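The finger-to-key assignments given in the text can be held in a simple lookup table; the sketch below fills in the fingers explicitly mentioned in this section (left index, left middle, left little, right index, right middle, right ring, right little), and the data-structure choice is an assumption.

```python
FINGER_KEY_MAP = {
    ("left", "index"):   ["R", "F", "V", "T", "G", "B"],
    ("left", "middle"):  ["E", "D", "C"],
    ("left", "little"):  ["Q", "A", "Z"],
    ("right", "index"):  ["Y", "H", "N", "U", "J", "M"],
    ("right", "middle"): ["I", "K", ","],
    ("right", "ring"):   ["O", "L", "."],
    ("right", "little"): ["P", ";", "/"],
}

def keys_for_finger(hand, finger):
    """Return the virtual keys drawn around, and assigned to, a given finger."""
    return FINGER_KEY_MAP.get((hand, finger), [])
```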
  • the computerized system may be configured to generate a control region comprising the set of virtual keys and display the control region in a first position of the display screen of the computerized system.
  • the control region may display the set of virtual keys associated with the fingers of the user.
  • the computerized system may be configured to update, on the display screen, the position of the set of virtual keys associated with the movement of that particular user's finger on the touchpad.
  • the control region may also display a graphical representation of the user's fingers and the different sets of virtual keys associated with each finger, on the display screen.
  • the computerized system may be configured to update the position of the virtual finger, as well as the group of virtual keys associated with the virtual finger in the control region, in accordance with the movement of the user's finger on the touchpad.
  • the virtual keys 'U', 'J', 'M' as well as 'Y', 'H', and 'N' associated with the user's right index finger displayed in the control region may also move. In some examples, the user's right index finger may be appropriately positioned in the control region to strike these keys.
  • resting keys, i.e. keys such as 'A', 'S', 'D', 'F' (left hand) and 'J', 'K', 'L', and ';' (right hand) that a relaxed user's hand would normally contact on a QWERTY keyboard when not pressing a key, may also move when the fingertip of the user's finger moves.
  • 'resting keys' may refer to the set of keys on a QWERTY keyboard that the user's fingers may typically rest on when not typing.
  • the resting keys may move in addition to the movement of the set of virtual keys that are associated with this particular finger that would normally be expected to be struck by the user's finger.
  • the computerized system may be configured to position the user's virtual fingers and the corresponding sets of virtual keys associated with the virtual fingers to be located anywhere in the display screen according to the hardware limitations of the touchpad and the display screen.
  • when the user lifts a finger above the touchpad, the keys associated with this finger are both frozen or fixed into position, as well as set to an 'enabled' configuration.
  • the 'enabled' keys may be displayed by visually highlighting these keys on the display screen. For example, the 'enabled' keys may be highlighted by showing these keys as brighter, or more enlarged, or in a different color, and so on.
  • the touchpad may include pressure sensitive sensors that are capable of sensing the pressure of a touch of a portion of the hand of the user on the touchpad.
  • the computerized system may obtain touch data pertaining to the amount of pressure applied by the user when the user operates the touchpad. Based on the obtained touch data, the computerized system may determine whether to enable a virtual key associated with the user's finger by visually highlighting the virtual key. For example, the computerized system may 'enable' a virtual key associated with the user's virtual finger in the control region if the obtained touch data pertaining to the amount of pressure applied by the user indicates that the pressure is within a pre-determined threshold value.
  • this pre-determined pressure threshold value may be specified by the user of the computerized system.
  • the user places the tip of the lifted finger back on to an enabled virtual key of an associated group of enabled virtual keys (i.e., when the user's finger subsequently touches the touchpad), then that particular key is selected (i.e. considered pressed).
  • This feature thus enables a user to move his or her hand freely on the touchpad surface, while still allowing the user to type using conventional typing techniques.
  • the user can precisely strike a virtual key according to the standard relative position of that key on a standard keyboard.
  • the disclosed technique enables the user to leverage off of a user's long standing 'muscle memory' of relative key positions on a standard keyboard.
  • Figure 37 depicts a simplified exemplary flowchart of a method 3700 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
  • the method includes obtaining, using the computerized system, first data from a touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the touchpad, at 3702. In an example, the first data is not associated with an image of a finger of the user from an image sensor.
  • the method then includes transmitting the first data from the touchpad to the computerized device at 3704.
  • the touchpad may be located in a location that is different from the location of the display screen.
  • the method may include analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model.
  • the method may include computing a graphical representation of a first finger of the portion of the hand of the user on the touchpad in accordance with the model of the human hand.
  • the method may include identifying a first set of virtual keys to be associated with the first finger.
  • the method may include generating a control region comprising the first set of virtual keys.
  • the control region may include the first set of virtual keys associated with the first finger of the user.
  • the method may include displaying the control region in a first position on the display screen in accordance with the position of the first finger. Additional details of the manner in which the method of Figure 37 may be implemented are discussed in detail in Figures 38 and 39 below.
  • Figures 38A-38F depict a series of simplified exemplary illustrations, 3800, 3802, 3804, 3806, 3807 and 3810 of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with embodiments of the present invention.
  • the control region displays different sets of keys associated with different fingers of the user's hand as the user proceeds to type at touch-typing speed using, for example, the 'lift and tap' technique discussed in Figure 26. It may be noted that at this point, the system has already assigned the touch data from the touchpad to at least one of the multitude of fingers of the model and has computed a graphical representation of the at least one finger of the user, in accordance with the model.
  • the system determines (in accordance with the touchpad data and the model of the human hand discussed in Figure 37) that the four fingers (Fl, F2, F3, F4) of the user's right hand 3804 are contacting the touchpad on the back side of the computerized device display, but neither the fingers nor the thumb of the user's left hand 3806, nor the thumb of the user's right hand 3804, are yet contacting the touchpad. Based on the obtained touchpad data, the system may then identify different sets of keys to be associated with each of the virtual fingers (Fl, F2, F3, F4) of the user's hand.
  • the system may then generate a control region comprising a set of virtual keys to be associated with a finger (Fl, F2, F3, F4) of the user's hand.
  • the system may then display the control region in a first position in the display screen 3802.
  • the virtual keys ('Y', 'H', 'N') and ('U', 'J', 'M') associated with the finger Fl may comprise a first control region.
  • the virtual keys ('I', 'K', ',') associated with the finger F2 may comprise a second control region.
  • the virtual keys ('O', 'L', '.') associated with the finger F3 may comprise a third control region, and so on.
  • the letters corresponding to the sets of keys associated with the fingers (Fl, F2, F3, F4) are shown as being positioned near or close to or on top of the user's fingers.
  • the system may determine that the four fingers of the user's left hand 3806 are also now touching the touchpad on the back side of the computerized device, and the system therefore generates and displays a control region in the display screen 3802 comprising the virtual representation of the four fingers of the user's left hand and the sets of different keys associated with each of the four fingers of the user's left hand, on the display screen 3802.
  • the sets of virtual keys associated with each hand may be separated and rotated.
  • the example illustrated in Figure 38B further indicates that the user has raised the little finger of the right hand 3804 above the touchpad, causing the letters ('P', ';' and '/') normally associated with the right little finger to become highlighted.
  • Figure 38C illustrates all four fingers of each hand of the user touching the touchpad.
  • the system may not highlight any of the keys associated with the user's fingers and may depict them in their normal sizes, thus allowing the user to freely move his hands while operating the touchpad.
  • the system may not highlight a particular key (e.g., 'P') associated with the user's finger if the system determines that the user's finger is kept raised (without subsequently touching the touchpad) for a period of time that is greater than a predetermined time-out period (e.g., 0-30 seconds), or if the user's finger touches outside of the highlighted area associated with the particular virtual key.
  • in Figure 38D the user has raised the little finger of the left hand 3806 above the keyboard, causing the keys 'Q', 'A' and 'Z' associated with this finger to become activated.
  • the letters 'Q', 'A', 'Z' are shown as activated by highlighting and/or enlarging these letters.
  • the letters 'Q', 'A', 'Z' may be fixed (i.e., frozen) into position so that the user can precisely strike any one of these keys according to the standard relative positions of those keys on the display screen.
  • in Figure 38E the user has just pressed the location on the touchpad that corresponds to the location of the key 'A' on the display screen with the tip of his left little finger.
  • the system may then be configured to depict the newly entered letter 'A' at a position (e.g., the top) 3809 of the computerized device screen, as well as start to shrink the enlarged highlighted letters ('Q', 'A', 'Z') since the tip of the left little finger is now in contact with the touchpad again.
  • since the tip of the left little finger is again in contact with the surface of the touchpad, the system no longer highlights the keys 'Q', 'A' and 'Z', and depicts them in their normal sizes.
  • the system may be configured to store the letter 'A' just typed by the user in system memory, as well as show the letter as being displayed (e.g., 3809) in the display screen 3802.
  • Figures 39A-39F depict a series of simplified exemplary illustrations, 3900, 3902, 3904, 3906, 3908 and 3910, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with another embodiment of the present invention.
  • the touchpad is detached from its normal position on the back of the handheld computerized device and positioned in front of the handheld computerized device. It is to be appreciated that by placing the touchpad in front of the device, a user may simultaneously observe the position of his hands and fingers, as well as the corresponding motions of his virtual hand and fingers and the action of the pressed virtual keys on the display screen of the handheld computerized device.
  • Figure 39A is an exemplary illustration of a user detaching the touchpad 3901 from its normal location on the back of the computerized device to reposition the touchpad to the front of the computerized device. As discussed above, by re-positioning the touchpad to the front of the device, the user may simultaneously observe both the actions of the user's real fingers on the touchpad as well as the corresponding actions of the user's virtual hand in the display screen.
  • Figure 39B is an exemplary illustration of a user touching the detached touchpad 3901 near the bottom of the touchpad using the four fingers of the user's right hand.
  • the user's thumb may not actually touch the touchpad.
  • the system based on the model of the user's hand, may determine the position of the user's thumb and display the user's virtual thumb in a corresponding location of the display screen.
  • the system may be configured to display the user's virtual thumb in an alternate color on the display screen. The alternate color may enable the user to recognize that this part of the user's hand is not touching the touchpad.
  • the system may be configured to display a more realistic virtual hand of the user by displaying portions of the user's hand that are not touching the touchpad.
  • the user's left hand may be visually distinguished from the user's right hand.
  • the system may be configured to assign a set of virtual keys to be associated with the user's virtual thumb. In this implementation, the user's virtual thumb and its associated virtual keys may also move in accordance with the user's thumb when the user's thumb touches and moves around the touchpad.
  • Figure 39B also displays a control region in a first position in the display screen 3903.
  • the keys ('Y', 'H', 'N') associated with the finger Fl may comprise a first control region.
  • the keys ('U', 'J', 'M') associated with the finger F2 may comprise a second control region, the keys ('I', 'K', ',') associated with the finger F3 may comprise a third control region, and so on.
  • the letters corresponding to the sets of keys associated with the fingers are shown as being positioned near or close to or on top of the user's fingers.
  • Figure 39C is an exemplary illustration of the user moving his left hand to a different position of the touchpad.
  • the system may be configured to obtain touchpad data that indicates a position (e.g., a location and/or movement) of a portion of the user's hand and/or the user's fingers on the touchpad. Based on the obtained data, in one embodiment, the system may then be configured to determine an angular position of the portion of the user's hand and/or the user's fingers on the touchpad. The system, may then be configured to reposition the control region comprising the user's virtual fingers and its associated virtual keys in accordance with the angular position of the portion of the user's hand and/or the user's fingers on the touchpad.
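Repositioning the control region to follow the hand's angular position can be done with ordinary 2-D geometry; in the sketch below the hand angle is taken from the line through the index and little finger touch points, which is an assumption rather than the disclosed model.

```python
import math

def hand_angle(index_xy, little_xy):
    """Angle, in radians, of the row of finger touch points on the touchpad."""
    return math.atan2(little_xy[1] - index_xy[1], little_xy[0] - index_xy[0])

def rotate_about(point, center, angle):
    """Rotate a display-screen point about a center, to re-lay-out the virtual keys."""
    px, py = point[0] - center[0], point[1] - center[1]
    c, s = math.cos(angle), math.sin(angle)
    return (center[0] + c * px - s * py, center[1] + s * px + c * py)
```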
  • Figure 39D is an exemplary illustration of a user not yet raising his left middle finger 3907 above the touchpad. However, since the system has not yet received inputs according to the 'lift and tap' technique discussed above, the system has not yet responded by highlighting the virtual keys 'E', 'D', 'C' associated with this finger.
  • Figure 39E is an exemplary illustration of the system highlighting (e.g., by enlarging) the keys 'E', 'D', 'C' normally associated with the user's left middle finger on the QWERTY keyboard when the system detects that the user has raised his left middle finger 3907 above the touchpad. It may be observed that the system has also locked in (i.e., frozen or fixed) the position of these keys so that they now do not move (on the display screen) when the user moves this finger or a portion of his hand on the touchpad. Figure 39E also indicates that the user is in the process of starting to bring the tip of his left middle finger 3907 back onto the touchpad in the touchpad position corresponding to the letter 'E'.
  • since the tip of the left middle finger is again in contact with the surface of the touchpad, the system no longer highlights the keys 'E', 'D' and 'C', and depicts them in their normal sizes.
  • the system may be configured to register a particular key (e.g., 'E') struck by the user and store the letter 'E' in system memory, as well as display the letter 'E' in a position 3907 in the display screen 3903.
  • the computerized system (e.g., the handheld computerized device 100) may be configured to detect an interaction of a portion of the hand of the user when the user operates the touchpad and/or the display screen of the computerized system.
  • the computerized system may then be configured to cause a property of an object displayed on the display screen to be controlled in accordance with the interaction.
  • the object may correspond to a multipage application (e.g., an electronic book), a page oriented application (e.g., a word processing application) or the like, displayed on the display screen of the computerized system.
  • the computerized system may be configured to detect a finger swipe of the user's finger on the touchpad as an action indicative of the user's desire to turn a page of the electronic book.
  • the computerized system may then be configured to change the page number of the electronic book in response to the finger swipe.
  • the computerized system may be configured to detect a finger swipe of a plurality of fingers of the user on the touchpad when the user interacts with the multipage application (e.g., an electronic book) displayed on the display screen.
  • the computerized system may then be configured to change the page number by a pre-determined number of pages in response to the finger swipe. For instance, a single finger swipe may cause the computerized system to change the page number of the multipage application by a single page, a two finger swipe may cause the computerized system to change the page number of the multipage application by two pages, a three finger swipe may cause the computerized system to change the page number of the multipage application by three pages, and the like.
  • the computerized system may be configured to determine the pre-determined number of pages to be changed as a function of the number of individual fingers used in the finger swipe.
  • the pre-determined number of pages to be changed may be determined as a function of a power of two raised to the power of the number of individual fingers used in each finger swipe, minus 1.
  • the pre-determined number of pages to be changed may be represented by the function: 2^(fingernum - 1), where fingernum represents the number of fingers used in the finger swipe.
  • a single finger swipe may translate to 2^0 to change the page number by a single page, a two finger swipe may translate to 2^1 to change the page number by two pages, a three finger swipe may translate to 2^2 to change the page number by four pages, and a four finger swipe may translate to 2^3 to change the page number by eight pages, and so on.
  • other base numbers may also be used, such as a base 10 function, for example, when scrolling through extremely large documents. For instance, using a base 10 function, a single finger swipe may translate to 10^0 to change the page number by a single page, a two finger swipe may translate to 10^1 to change the page number by 10 pages, a three finger swipe may translate to 10^2 to change the page number by 100 pages, a four finger swipe may translate to 10^3 to change the page number by 1000 pages, and so on.
  • the computerized system may be configured to detect a velocity of the finger swipe of the user on the touchpad.
  • the computerized system may then be configured to change the page number by a pre-determined number of pages in response to the velocity .
  • the computerized system may be configured to change the page number by a greater amount (e.g., two pages) when it detects a relatively fast finger swipe and change the page number by a smaller amount (e.g., one page) when it detects a relatively slower finger swipe.
  • the computerized system may be configured to determine a first direction of the finger swipe and increment the page number by a pre-determined number of pages in response to the first direction. For instance, the computerized system may be configured to increment the page number by a pre-determined number of pages when the computerized system detects a finger swipe in the 'right' or 'forward' direction. Similarly, the computerized system may be configured to decrement the page number by a pre-determined number of pages when the computerized system detects a finger swipe in the 'left' or 'backward' direction.
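The swipe-to-page-change rule above reduces to base**(fingernum - 1) pages, signed by the swipe direction; the sketch below follows that formula directly, with base 2 by default and base 10 for very large documents.

```python
def pages_to_change(fingernum, direction, base=2):
    """direction is +1 for a forward (right) swipe, -1 for a backward (left) swipe."""
    return direction * (base ** (fingernum - 1))

# One finger turns 1 page, two fingers 2 pages, three fingers 4 pages; with
# base=10, three fingers would jump 100 pages.
assert pages_to_change(1, +1) == 1
assert pages_to_change(2, +1) == 2
assert pages_to_change(3, -1) == -4
assert pages_to_change(3, +1, base=10) == 100
```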
  • Figure 40 depicts simplified exemplary illustrations, 4000 and 4002, that indicate the manner in which a computerized system (e.g., the handheld computerized device 100) may interpret a single finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with an embodiment of the present invention.
  • the handheld computerized device 100 may be configured to detect a single finger swipe of the user's finger on the touch pad 200 to control a property (e.g., to change a page number of a virtual page) of an object (e.g., an electronic book) displayed on the display screen 102.
  • the user's thumb 4004 is shown resting on the edge of the device 100 or on the display screen 102 while the user's other fingers are shown located behind the device 100 and resting on the device's rear touchpad 200 (not shown in FIG. 40).
  • the display screen 102 shows a first page, e.g., page 1 from an electronic book.
  • the user interacts with the electronic book by extending a finger, 4006, on the touchpad 200 to perform a single finger swipe to turn page 1 of the electronic book.
  • the handheld computerized device 100 detects the single finger swipe as a command to change the page number to page 2 of the book.
  • the device 100 displays page 2 to the user as shown in the illustrated example, 4002.
  • FIG. 41 depicts simplified exemplary illustrations, 4100 and 4102, that indicate the manner in which a handheld computerized device may interpret a multiple finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with another embodiment of the present invention.
  • the handheld computerized device 100 may be configured to detect a multiple finger swipe of a plurality of the user's fingers on the touch pad 200 to control a property (e.g., to change the page number by a pre-determined number of pages) of an object (e.g., an electronic book) displayed on the display screen 102.
  • the user's thumb 4104 is shown resting on the edge of the device 100 or on the display screen 102 while the user's other fingers are shown located behind the device 100, and resting on the device's rear touchpad 200 (not shown in FIG. 41).
  • the display screen 102 shows a first page, e.g., page 1 from the electronic book.
  • the user interacts with the book by extending two fingers, 4106 and 4108 on the touchpad 200 to perform a two finger swipe across the touchpad.
  • the handheld computerized device 100 detects the two finger swipe and interprets the two finger swipe as a command to change the page number from page 1 to page 3 of the book.
  • the device 100 displays page 3 to the user as shown in the illustrated example 4102.
  • the computerized system may be configured to detect an interaction of a first finger of the user on a first touchpad of the computerized system with a second finger of the user.
  • the second finger may be located on a second touchpad of the computerized system.
  • the second touchpad may be located in a location that is different from, the first touchpad.
  • the second finger may also be located on the display screen.
  • the first touch pad, the second touchpad and the display screen may all be located in different locations and need not be physically connected to each other.
  • the display screen may be in nearly any location, such as on a regular monitor, TV screen, projector screen, or on a virtual heads-up eyeglass display worn by the user (e.g. a device similar to Google Glass).
  • the computerized system may be configured to identify a first position of the first finger on the first touchpad and identify a second position of the second finger on the second touchpad.
  • the computerized system may then be configured to detect a selection of an object, by the user.
  • the object may correspond to a finger controlled device such as a watch dial, a camera, a robotic arm, a two-dimensional (2-D) object, a three-dimensional (3-D) object or other device displayed on the display screen.
  • the user may select the object displayed on the display screen using the first finger.
  • the user may select the object using the second finger on the second touchpad.
  • the user may also select the object using both the first finger on the first touchpad as well as the second finger on the second touchpad.
  • the computerized system may then be configured to detect a movement of the first position of the first finger relative to the second position of the second finger and cause a property of an object displayed on the display screen to be controlled and/or altered in response to the detected movement.
  • the computerized system may also be configured to detect a movement of the second position of the second finger on the second touchpad relative to the first position of the first finger on the first touchpad and cause a property of an object displayed on the display screen to be controlled and/or altered in response to the detected movement.
  • the computerized system may be configured to rotate the displayed object about an axis of rotation, change a display characteristic (e.g., color) of the object, alter the size (e.g., enlarge or diminish) of the object, move the object and the like, based on the detected movement.
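One way to realize the relative-movement control described above is to track both finger positions, derive the change in their separation and their common displacement, and feed those quantities into whichever property is being controlled. The sketch below is a hedged illustration only; it assumes both touch surfaces have been mapped into a common coordinate frame, and the Finger type, the gain, and the dictionary-based object are assumptions rather than part of the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class Finger:
    x: float
    y: float

def relative_motion(first_prev: Finger, first_now: Finger,
                    second_prev: Finger, second_now: Finger):
    """Return (change in finger separation, common displacement).

    The separation change can drive a size property of the selected object,
    while the common displacement can drive a move property; either quantity
    could equally be mapped to a color or other display characteristic.
    """
    def dist(a: Finger, b: Finger) -> float:
        return math.hypot(a.x - b.x, a.y - b.y)

    separation_change = dist(first_now, second_now) - dist(first_prev, second_prev)
    pan_x = ((first_now.x - first_prev.x) + (second_now.x - second_prev.x)) / 2
    pan_y = ((first_now.y - first_prev.y) + (second_now.y - second_prev.y)) / 2
    return separation_change, (pan_x, pan_y)

def apply_to_object(obj: dict, separation_change: float, pan, zoom_gain: float = 0.02):
    """Enlarge/diminish and move a selected object (a plain dict here)."""
    obj["scale"] *= (1.0 + zoom_gain * separation_change)
    obj["x"] += pan[0]
    obj["y"] += pan[1]
```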
  • Figure 42 depicts exemplary illustrations, 4200, 4202 and 4204, of the manner in which a computerized system (e.g., the handheld computerized system 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • the user may interact with an object (e.g., 4206) displayed on the display screen 102 by placing a first finger 4208 on the first touchpad (e.g., 200) located on the back side of the device 100 and a second finger 4210 on the display screen 102 (e.g., the front touch screen of the device 100).
  • the user may then move the first finger 4208 relative to the second finger 4210 to alter a desired property (e.g., an axis of rotation, a color, a size of the object and the like).
  • the user may move the first finger 4208 relative to the second finger 4210 to rotate the object about an axis of rotation, change the color of the object, enlarge or diminish the object, control an angle at which the object is displayed and the like.
  • the computerized system may then be configured to cause a particular property (e.g., an axis of rotation, a color, a size of the object and the like) of the object to be controlled and/or altered as discussed below.
  • the user's first finger 4208 may be located on the touchpad 200 and the user's second finger may be located on the display screen 102.
  • the user may move the first finger 4208 in a first direction (e.g., right) and the second finger 4210 in a second direction (e.g., left).
  • the computerized system may interpret the movement 4212 as a twist about the axis (not shown) of the object and cause the object to rotate about its axis.
  • the user's first finger 4208 and the user's second finger 4210 are located on different touchpads or touch surfaces.
  • the user's first finger may be located on a first touchpad, 4216 and the user's second finger may be located on a second touchpad, 4218 located in a different location from the first touchpad 4216.
  • the user's first finger may be located on the second touchpad 4218 and the user's second finger may be located on the first touchpad 4216.
  • the user may move the first finger 4208 in a downward direction, and a second finger 4210 in an upward direction.
  • the computerized system may cause the object to rotate about its axis, 4220.
  • the user's first finger 4208 is located on the touchpad 200 and the user's second finger 4210 is located on the display screen (e.g., 102).
  • the computerized system may interpret this movement as a twist around the object's axis 4222 and cause the object to rotate about its axis, 4222.
  • the device 100 may be configured to cause various other properties of the displayed object to be controlled in response to a detected movement, in other embodiments.
  • the device 100 may be configured to alter a display characteristic (e.g., color) of the object in response to the detected movement.
  • the device 100 may be configured to alter the size (e.g., enlarge or diminish) of the object, move the object and the like in response to the detected movement.
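For the twist gesture of Figure 42, the opposing motion of the front finger and the rear finger can be reduced to a single rotation increment. The sketch below assumes vertical displacements measured in a shared unit and an arbitrary degrees-per-unit gain; neither value is specified by the embodiment.

```python
def twist_angle(front_dy: float, back_dy: float, gain: float = 0.5) -> float:
    """Map opposing finger motions on the front screen and rear touchpad to a
    rotation increment (degrees) about the selected object's axis.

    When the fingers move in opposite directions the difference is large and
    the object is twisted; when they move together the difference is near
    zero and no rotation is applied.
    """
    return gain * (front_dy - back_dy)

# Example: front finger up 6 units, rear finger down 6 units gives a 6 degree
# twist with the assumed gain of 0.5.
angle = twist_angle(front_dy=6.0, back_dy=-6.0)   # -> 6.0
```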
  • the computerized system may be configured to identify a first position of the first finger of the user on the first touchpad and identify a second position of the second finger on the second touchpad.
  • the computerized system may then be configured to detect a selection of a point of rotation of an object displayed on the display screen.
  • the object may correspond to a virtual joystick, a virtual hand or mechanical claw displayed on the display screen. In some examples, the user may select the object displayed on the display screen using the first finger.
  • the user may also select the object using the second finger on the second touchpad that is located at a different location from the first touchpad.
  • the user may also select the object using both the first finger on the first touchpad as well as the second finger on the second touchpad.
  • the computerized system may then be configured to detect a movement of the second finger on the second touchpad relative to the first position of the first finger on the first touchpad.
  • the computerized system may also be configured to detect a movement of the first position of the first finger on the first touchpad relative to the second position of the second finger on the second touchpad.
  • the computerized system may then be configured to cause a property of the object to be controlled based on the detected movement. For instance, the computerized system may cause the object to move around the stationary point of rotation based on the detected movement.
  • the computerized system may push the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
  • Figure 43 depicts exemplary illustrations, 4300 and 4302 of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • the user may interact with an object (e.g., virtual joystick 4312) displayed on the display screen 102 by placing a first finger 4304 in a first position 4308 on the touchpad 200 located on the back side of the device 100 and a second finger 4306 in a second position 4310 on the display screen 102 of the device 100.
  • the user may hold the first finger 4304 stationary, while moving the second finger 4306 relative to the first finger to alter a desired property (e.g., handle 4314) of the displayed object (i.e., virtual joystick 4312).
  • the user may move the first finger relative to the second finger while holding the second finger stationary to alter the desired property (e.g., handle 4314) of the displayed object (i.e., virtual joystick 4312).
  • the user may move the moving finger (i.e., the first finger or the second finger) in an upward, downward, left, right or circular direction 4316 around the first position or the second position to alter the desired property (e.g., point of rotation 4316 of the handle 4314) of the displayed object (i.e., virtual joystick 4312).
  • the computerized system may be configured to push the handle 4314 of the virtual joystick 4312 in the direction and magnitude as defined by the direction and magnitude of the movement (e.g., 4316) of the first finger in relation to the second finger.
  • the computerized system may be configured to select a stationary point of rotation (i.e. center 4316) of the virtual joystick 4312 and its handle 4314 based on a second position of the user's second finger 4306 on the display screen 102.
  • the computerized device may further be configured to interpret the movement of the user's first finger 4304 to issue a command to the virtual joystick 4312 to control the operation of the virtual joystick (e.g., to move the virtual joystick's handle 4314 around the joystick handle's stationary point of rotation 4316).
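A possible reduction of the Figure 43 joystick behavior to code is to treat the stationary finger as the handle's pivot and convert the other finger's offset from that pivot into a deflection direction and magnitude. Coordinate sharing between the two touch surfaces, the clamping radius, and the return format are all assumptions of this sketch.

```python
import math

def joystick_command(anchor_xy, moving_xy, max_deflection: float = 40.0):
    """Convert a stationary anchor finger and a moving finger into a
    (direction_degrees, magnitude) pair for a virtual joystick handle.

    anchor_xy and moving_xy are (x, y) positions assumed to be expressed in a
    common coordinate frame; magnitude is clamped to [0, 1] at max_deflection.
    """
    dx = moving_xy[0] - anchor_xy[0]
    dy = moving_xy[1] - anchor_xy[1]
    direction_degrees = math.degrees(math.atan2(dy, dx))
    magnitude = min(1.0, math.hypot(dx, dy) / max_deflection)
    return direction_degrees, magnitude

# Example: the moving finger sits 20 units to the right of the anchor, which
# pushes the handle to the right at half of the maximum deflection.
print(joystick_command((0.0, 0.0), (20.0, 0.0)))   # -> (0.0, 0.5)
```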
  • the computerized system may be configured to identify a first contact position of the first finger of the user and detect a selection of an object displayed on the display screen using the first finger.
  • the first finger may be located on the touchpad 200 located on the back of the device 100.
  • the computerized system may then be configured to detect a change in a characteristic of the first contact position and cause at least one property of the object to be controlled based on the change in the characteristic of the first contact position.
  • the characteristic of the first contact position may include an area, size or shape of the contact position.
  • the computerized system may be configured to detect a movement of the first finger away from the first contact position, a change in the angle of the first finger in the first contact position, or a change (increase or decrease) in a touch area of the first contact position. In response to the detected movement, the computerized system may then be configured to apply a corresponding pressure and/or a corresponding momentum to the displayed object.
  • in certain embodiments, the computerized system may be configured to identify a first contact position of the first finger of the user on a first touchpad and a second contact position of a second finger of the user on a second touchpad, located in a location that is different from the first touchpad.
  • the computerized system may then be configured to detect a selection of an object displayed on the display screen using both the first finger and the second finger and detect a change in a characteristic of the first contact position and the second contact position. In some embodiments, the computerized system may then be configured to cause at least one property of the object to be controlled based on the change in the characteristic of the first contact position and the second contact position.
  • Figure 44 depicts exemplary illustrations, 4400, 4402 and 4404, of the manner in which a computerized system (e.g., the handheld computerized system 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • the user may interact with an object (e.g., virtual bow and arrow 4418 or a virtual squeeze ball 4424) displayed on the display screen 102 by placing a first finger 4406 on the touchpad 200 located on the back side of the device 100.
  • the computerized system may then be configured to detect a selection of the object (e.g., the virtual bow and arrow 4418 or the virtual squeeze ball 4424) using the first finger 4406.
  • the computerized system may then be configured to detect a change in a characteristic of the contact area 4408 and cause a property of the displayed object to be controlled based on the detected change in the contact area.
  • the computerized system may be configured to detect an increase (e.g., 50%, 100% or the like) in the contact or touch area of the first contact position 4408, for example, when the user presses the touchpad.
  • the computerized system may then be configured to determine a corresponding force and/or pressure or load to be applied to the displayed object based on the detected increase in the touch area.
  • the computerized system may be configured to apply a first, relatively smaller force 4414 to the string of the virtual bow and arrow 4418 based on the first contact position 4408 of the user's finger 4406 and a second, relatively larger force 4416 to the string of the virtual bow and arrow 4418, based on detecting an increase 4410 to the touch area of the first contact position 4408.
  • the computerized system may be configured to apply a first, relatively smaller pressure 4420 to the virtual squeeze ball 4424 based on the first contact position 4408 of the user's finger and a second, relatively larger pressure 4422 to the virtual squeeze ball 4424, based on detecting an increase 4410 to the touch area of the first contact position 4408.
  • the computerized system may be configured to reduce the pressure (4416, 4422) applied to the virtual object (4418, 4424) when the system detects a decrease in the touch area 4410 of the first contact position.
  • the computerized system may also be configured to detect a movement of the first finger away from the first contact position, a change in the angle of the first finger in the first contact position and the like to alter a property of the displayed object such as a change in the object's momentum, a force applied to the object or a change to other parameters of the object.
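The contact-area behavior of Figure 44 amounts to mapping growth of the reported touch area onto a force or pressure value. The linear mapping, the saturation at twice the baseline area, and the numeric examples below are assumptions used purely to illustrate the idea.

```python
def force_from_contact_area(area_now: float, area_baseline: float,
                            max_force: float = 1.0) -> float:
    """Estimate a normalized force from growth of the finger contact area.

    A light touch reports roughly the baseline area; pressing harder flattens
    the fingertip and enlarges the contact area, which is mapped linearly to
    [0, max_force] and saturates once the area has doubled.
    """
    if area_baseline <= 0:
        return 0.0
    growth = max(0.0, area_now / area_baseline - 1.0)
    return min(max_force, max_force * growth)

# Example: a 50% larger contact area draws the virtual bow string with half
# of the maximum force; a doubled area applies the full force, and shrinking
# back toward the baseline releases the pressure again.
assert force_from_contact_area(60.0, 60.0) == 0.0
assert force_from_contact_area(90.0, 60.0) == 0.5
assert force_from_contact_area(120.0, 60.0) == 1.0
```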
  • the computerized system may be configured to detect a movement of a first finger of the user from a first position to a second position on the first touchpad. Based on the detected movement, the computerized system may further be configured to reposition an object displayed in the display screen and enable, for the user, an interaction with the re-positioned object. In some examples, the computerized system may be configured to reposition the object in accordance with a direction of the movement of the first finger from the first position to the second position. In other examples, the computerized system may be configured to reposition the object in accordance with an amount of movement of the first finger from the first position to the second position.
  • a user may wish to operate the handheld computerized device using a single hand.
  • the user may use a thumb (often in front of the device) and another finger, such as an index finger (often on the back of the device) to grip or otherwise hold the device.
  • the remaining fingers of the user may be positioned on the back of the device, and somewhat free to move (e.g., over the touchpad 200 located on the back of the handheld device 100), but may typically be constrained by the user's hand geometry to reside near the same edge of the device where the user's thumb and index finger are gripping the device.
  • the user may not be able to reach an object displayed on the display screen from the side of the device using the user's three remaining free (non-gripping) fingers.
  • the user may drag or scoop (i.e. reach out, and then pull in) one or more of the remaining free (non-gripping) fingers to reach the displayed object.
  • the computerized system may be configured to detect this dragging or scooping movement of the user's finger and/or fingers towards the location of the object displayed on the display screen.
  • the computerized system may then be configured to reposition the object to a location on the display screen that is within a pre-determined distance from the position of the user's gripping finger.
  • Figure 45 depicts exemplary illustrations, 4500 and 4502, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures of a user from multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
  • the user may interact with an object (e.g., a set of virtual keys 4504) displayed in a virtual control region on the display screen 102 by gripping the device using a thumb 4510 and an index finger 4512 and using the remaining free fingers to interact with the object.
  • the user may drag or scoop (i.e. reach out, and then pull in) the middle (non- gripping) finger 4506 to reach the set of virtual keys 4504.
  • the computerized system may be configured to detect the movement of the finger 4506 towards the set of virtual keys 4504 and reposition the set of virtual keys 4504 to a location that is closer (i.e., is within a predetermined distance) to the index finger 4512 or the thumb 4510 of the user on the touchpad.
  • the location of the virtual keys is brought to within striking range of the free (non-gripping) finger 4506.
  • the user may strike the virtual keys 4504 using the free (non-gripping) finger 4506, and thus interact with the object.
  • the user may move the virtual keys 4504 within striking range of the non-gripping finger 4506 by moving the finger 4506 over the touchpad 200, in a direction 4508 either towards or away from the position of the set of virtual keys 4504.
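The scoop gesture of Figure 45 can be approximated by testing whether the free finger's recent path has moved it toward the virtual key region by more than some threshold, and, if so, re-positioning that region within a fixed reach of the gripping finger. The thresholds and reach distance in this sketch are assumptions, and both touch surfaces are assumed to be mapped into display coordinates.

```python
import math

def reposition_keys(key_center, free_finger_path, grip_position,
                    scoop_threshold: float = 15.0, reach_distance: float = 20.0):
    """Return a new center for the virtual key region.

    key_center and grip_position are (x, y) points in display coordinates;
    free_finger_path is a list of recent (x, y) samples of the non-gripping
    finger.  If that finger has dragged toward the keys by more than
    scoop_threshold, the keys are moved to lie reach_distance away from the
    gripping finger, along the line toward their old location.
    """
    if len(free_finger_path) < 2:
        return key_center

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    moved_toward_keys = dist(free_finger_path[0], key_center) - dist(free_finger_path[-1], key_center)
    if moved_toward_keys < scoop_threshold:
        return key_center                      # no scoop detected; leave keys alone

    dx = key_center[0] - grip_position[0]
    dy = key_center[1] - grip_position[1]
    length = math.hypot(dx, dy) or 1.0         # avoid division by zero
    return (grip_position[0] + reach_distance * dx / length,
            grip_position[1] + reach_distance * dy / length)
```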
  • Figure 46 depicts a simplified exemplary flowchart of a method 4600 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
  • the method includes obtaining, using the computerized system, first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad at 4602.
  • the first data is not associated with an image of a finger of the user from an image sensor.
  • the method then includes transmitting the first data from the first touchpad to the computerized device at 4604.
  • the first touchpad may be located in a location that is different from the location of the display screen.
  • the method may include analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model.
  • the method may include detecting an interaction of a portion of the hand of the user on the first touchpad with the object displayed on the display screen.
  • the method may include causing, by the computerized system, at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
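The data acquisition and hand-model analysis steps of method 4600 could be realized, for instance, by matching each raw contact reported by the touchpad to the nearest expected fingertip of a simple hand model; contacts with no nearby model finger would be ignored, and model fingers with no nearby contact treated as lifted. The expected fingertip positions and the distance threshold below are placeholders, not values taken from the disclosure.

```python
import math

# Placeholder expected fingertip positions (touchpad coordinates) for the
# four non-thumb fingers of a hand model; in practice these would be derived
# from the palm angle and finger-root geometry described in the disclosure.
EXPECTED_FINGERTIPS = {
    "index":  (20.0, 60.0),
    "middle": (35.0, 65.0),
    "ring":   (50.0, 62.0),
    "little": (63.0, 55.0),
}

def assign_touches_to_fingers(touches, max_distance: float = 25.0):
    """Greedily assign raw (x, y) contacts to the nearest model finger.

    Returns a dict mapping finger name to the assigned contact; model fingers
    left unmatched are assumed to be lifted off the touchpad.
    """
    assignment = {}
    remaining = list(touches)
    for name, expected in EXPECTED_FINGERTIPS.items():
        if not remaining:
            break
        nearest = min(remaining, key=lambda p: math.hypot(p[0] - expected[0],
                                                          p[1] - expected[1]))
        if math.hypot(nearest[0] - expected[0], nearest[1] - expected[1]) <= max_distance:
            assignment[name] = nearest
            remaining.remove(nearest)
    return assignment
```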
  • the method may include detecting a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen.
  • the object may correspond to a multipage application displayed on the display screen and the property of the object may correspond to a page number in the multipage application.
  • the method may include changing the page number of the multipage application in response to the finger swipe.
  • the method may include detecting a finger swipe of a plurality of fingers of the user on the first touchpad when the user interacts with the object on the display screen.
  • the method may include changing the page number by a pre-determined number of pages in response to the finger swipe of the plurality of fingers.
  • the method may include detecting a velocity of the finger swipe and changing the page number by the pre-determined number of pages in response to the velocity.
  • the method may include determining a first direction of the finger swipe and incrementing the page number by a pre-determined number of pages in response to the first direction.
  • the method may include determining a second direction of the finger swipe and decrementing the page number by a pre-determined number of pages in response to the second direction.
  • the first direction may be different from the second direction.
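Putting the direction- and velocity-dependent paging together, a page change for method 4600 could be computed as below; the sign convention, the velocity threshold, and the page counts are assumptions of this sketch.

```python
def pages_to_turn(swipe_dx: float, swipe_dt: float,
                  base_pages: int = 1, fast_pages: int = 5,
                  fast_velocity: float = 100.0) -> int:
    """Convert a recognized swipe into a signed page change.

    The sign follows the swipe direction (incrementing in one direction and
    decrementing in the other); a swipe faster than fast_velocity units per
    second jumps by fast_pages instead of base_pages.
    """
    if swipe_dt <= 0 or swipe_dx == 0:
        return 0
    velocity = abs(swipe_dx) / swipe_dt
    magnitude = fast_pages if velocity >= fast_velocity else base_pages
    return magnitude if swipe_dx > 0 else -magnitude
```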
  • the method may include detecting an interaction of at least a second finger of the user with the first finger.
  • the second finger may be located on a second touchpad.
  • the second touchpad may be located in a location different from the first touchpad.
  • the second finger may also be located on the display screen.
  • the method may include identifying a first position of the first finger on the first touchpad, detecting a selection of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the first position of the first finger relative to the second position of the second finger.
  • the method may include detecting the selection of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some embodiments, the method may include rotating the object based on the detected movement, altering an axis of rotation of the object based on the detected movement, altering the size of the object based on the detected movement, altering a display characteristic of the object based on the detected movement or moving the object based on the detected movement.
  • the method may include identifying a first position of the first finger on the display screen, detecting a selection of a point of rotation of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the second position of the second finger relative to the first position of the first finger.
  • the method may include detecting the selection of the point of rotation of the object by the first finger on the first touchpad or the second finger on the second touchpad.
  • the method may include moving the object around the point of rotation of the object based on the detected movement.
  • the object may correspond to a virtual joystick and the method may include pushing the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
  • the method may include detecting a selection of the object, identifying a first contact position of the first finger, detecting a change in a characteristic of the first contact position and causing the at least one property of the object to be controlled based on the change in the characteristic.
  • the method may include detecting a movement of the first finger away from the first contact position, detecting a change in the angle of the first finger in the first contact position, detecting an increase in a touch area of the first contact position and the like.
  • the characteristic may comprise at least one of the area, the size or the shape of the first contact position.
  • the method may include applying a corresponding pressure or a load to the displayed object based on the change in the characteristic of the first contact position.
  • the method may include detecting a movement of the first finger from a first position to a second position on the first touchpad, repositioning the object in the display screen in accordance with a direction of the movement of the first finger or an amount of movement of the first finger from the first position to the second position and enabling, for the user, an interaction with the re-positioned object.
  • the object may correspond to a set of virtual keys in a virtual control region in the display screen.
  • Although the invention has been described with reference to a handheld computerized device by way of an example, it is understood that the invention is not limited by the type of computerized device or system, and may be used wherever the device or system may benefit by differentiating between a user's touch on a touchpad for command input and a user's touch on a touchpad for merely holding the device by the touchpad.
  • Although the invention has been described with reference to certain user fingers touching the touchpad by way of an example, it is understood that the invention is not limited by which user fingers are touching the touchpad.
  • Although the invention has been described with reference to a touchpad located on the back of a handheld device including a display at the front of the device by way of an example, it is understood that the invention is not limited by where the touchpad is located.
  • Although the invention has been described with reference to a capacitive touchpad used for data entry by way of an example, it is understood that the invention is not limited by the type of input device.
  • Although the invention has been described with reference to a sequence of strong, weak, strong or medium, small, large force applied by a user's finger used for data entry by way of examples, it is understood that the invention is not limited by those two sequences of applied forces.

Abstract

A method for controlling a control region on a display screen of a computerized system is presented. The method includes obtaining data from a touchpad. The data is associated with a position of a portion of the hand of a user when the user operates the touchpad. The method includes transmitting the data from the touchpad to the computerized device and analyzing the data in accordance with a model of a human hand. In certain embodiments, the method includes detecting an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen and causing at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.

Description

METHOD FOR DETECTING USER GESTURES FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Patent Application No. 14/568,492, titled "METHOD FOR DETECTING USER GESTURES FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," filed December 12, 2014 and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Patent Application No. 61/916,168, titled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," inventor Tong Luo, filed December 14, 2013. This application is also a continuation-in-part of U.S. Patent Application No. 14/491,671, titled "METHOD FOR CONTROLLING A CONTROL REGION OF A COMPUTERIZED DEVICE FROM A TOUCHPAD," inventor Tong Luo, filed September 19, 2014, which claims priority, under 35 USC § 119(e), to U.S. Provisional Patent Application No. 61/880,629, "METHOD FOR USER INPUT FROM
ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE", inventor Tong Luo, filed September 20, 2013; this application is also a continuation-in-part of U.S. Patent Application 14/341,326, "METHOD OF CONTROLLING A VIRTUAL KEYBOARD FROM A TOUCHPAD OF A COMPUTERIZED DEVICE," inventor Tong Luo, filed July 25, 2014, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/858,223, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," inventor Tong Luo, filed July 25, 2013; this application is also a continuation-in-part of U.S. Patent Application 14/289,260, "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE", inventor Tong Luo, filed May 28, 2014, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/828,683, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," filed on May 30, 2013; this application is also a continuation-in-part of U.S. Patent Application No. 14/284,068, entitled "METHOD USING A FINGER ABOVE A TOUCHPAD DURING A TIME WINDOW FOR CONTROLLING A COMPUTERIZED SYSTEM," filed on May 21, 2014, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/825,621, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," filed on May 21, 2013; this application is also a continuation-in-part of U.S. Patent Application No. 14/282,331, entitled "METHOD USING FINGER FORCE UPON A
TOUCHPAD FOR CONTROLLING A COMPUTERIZED SYSTEM," filed on May 20, 2014, which is a continuation-in-part of U.S. Patent Application No. 14/268,926, entitled "METHOD USING A FINGER ABOVE A TOUCHPAD FOR CONTROLLING A COMPUTERIZED SYSTEM," filed on May 2, 2014, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/819,615, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," filed on May 5, 2013; this application is also a continuation-in-part of U.S. Patent Application No.
14/260,195, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED SYSTEM," filed on April 23, 2014, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/815,058, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A
HANDHELD COMPUTERIZED DEVICE," filed on April 23, 2013; this application is also a continuation-in-part of U.S. Patent Application No. 13/770,791, entitled "METHOD FOR USER INPUT FROM ALTERNATIVE TOUCHPADS OF A HANDHELD COMPUTERIZED DEVICE," filed on February 19, 2013; this application is also a continuation-in-part of U.S. Patent No. 8,384,683 B2, entitled "METHOD FOR USER INPUT FROM THE BACK PANEL OF A HANDHELD COMPUTERIZED DEVICE", filed on May 4, 2010, which claims priority, under 35 USC § 119(e), from U.S. Provisional Patent Application No. 61/327,102, entitled "METHOD, GRAPHICAL USER INTERFACE, AND APPARATUS FOR USER INPUT FROM THE BACK PANEL OF A HANDHELD ELECTRONIC DEVICE," filed on April 23, 2010; the contents of all of all of the above references applications are incorporated herein by reference in their entirety. U.S. Patent Application No. 13/770,791 referenced above is also a continuation-in-part of U.S. Patent Application No. 13/223,836, entitled "DETACHABLE BAC MOUNTED TOUCHPAD FOR A HANDHELD COMPUTERIZED DEVICE", filed on September 1, 2011, which is a continuation-in-part of U.S. Patent No. 8,384,683 B2, entitled "METHOD FOR USER INPUT FROM THE BACK PANEL OF A HANDHELD
COMPUTERIZED DEVICE", filed May 4, 2010, the contents of all of which are incorporated herein by reference in their entirety.
BACKGROUND OF THE INVENTION
[0002] The present disclosure generally relates to a computerized device including a touchpad installed on the back panel or other portion of the body other than the display screen. More particularly, the present disclosure relates to a. method and graphical user interface that enables the user to see the user's finger position and motion from the back or other portion of the device, superimposed on a keyboard layout on the display screen. This makes it easier for a user to input keystrokes and mouse actions from a touchpad that is installed on the back panel or other portion of a handhold device. In an embodiment, the user can also control and manipulate a virtual keyboard shown in the display screen. Although embodiments of the invention are described with reference to a handheld computerized device by way of an example, it is understood that the invention is not limited by the type of computerized device or system.
[0003] Handheld computerized devices (i.e., devices including microprocessors and sophisticated displays), such as cell phones, personal digital assistants (PDA), game devices, tablet PCs such as the iPad, wearable computerized devices, and the like, are playing a more and more important role in everyday life and are becoming more and more indispensable. With the advance of technology and improvements in the handheld computerized devices' processing power, both function and memory space are increasing at an amazing pace. Meanwhile, the size of the handheld computerized devices continues to get smaller and smaller, making the touchpad and display on the device smaller and more challenging to use.
[0004] To meet the challenge of a smaller device display and touchpad, the designers of handheld computerized devices typically use two approaches. One approach is to make the keyboard keys smaller and smaller, miniaturizing the keys. Additionally the keyboard keys may be given multiple functions - i.e. overloaded, and more complex function keyboard keys may be introduced as well.
[0005] The other approach is to use touch screen keyboards, or so called "soft keys", on the front panel. Here a user may use a stylus pen or finger to select the soft keys through a graphical user interface. However, due to the optical illusions introduced by the display screen, and the fact that the user's fingers often are on top of the various display screen soft keys, hence blocking the keys from direct viewing, the soft keys should not be too small. Another problem is that when the soft keys are too small, often a single finger press will activate multiple keys. As a result, the designer may have to divide the keys into different groups and hierarchies, and just display a small number of keys at a time on the screen.
[0006] Both current approaches have some drawbacks: the user input area may occupy a significant portion of the front panel, and the user input process, although requiring a large amount of user attention to operate, still is very error prone.
[0007] Often a user may use one hand to hold the handheld computerized device, and use the other hand to input data, thus, occupying both hands. A user will often have to go through a long sequence of key strokes, and switch back and forth among different user interface screens, in order to complete a fairly simple input. As a result, there is a significant learning curve for a user to learn the overloaded keys, function keys, key grouping, and key hierarchies in order to operate the handheld computerized devices efficiently.
[0008] Previous designs have included sensors on the back of the device and representations of the user's fingers on the front of the device; however, this work failed to adequately describe a procedure by which the indicia of the user's fingers or hands are displayed on the display panel.
[0009] Systems have been described in which image sensors would obtain an image of the user's fingers while operating the device, and use this image data to better determine which real or virtual keys the user's fingers were striking. Such methods rely, however, on image sensors that are positioned in such a way as to be capable of viewing the tips of the user's fingers. This type of image sensor placement is often difficult to implement on many types of handheld user computerized devices. Another drawback of the previous image sensor approach is that it is difficult to implement in low light situations. This approach may also be difficult to implement in situations where there is limited smooth and flat desk or table space.
BRIEF SUMMARY OF THE INVENTION
[0010] According to one embodiment of the present invention, a method for controlling a control region on a display screen of a computerized system is presented. The method includes obtaining first data from a touchpad. The first data is associated with a position of a portion of the hand of a user when the user operates the touchpad. In an example, the first data is not associated with an image of a finger of the user from an image sensor. The method then includes transmitting the first data from the touchpad to the computerized system. In an example, the touchpad is located in a location that is different from the location of the display screen. In some embodiments, the method further includes analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model. In some embodiments, the method may include detecting, by the computerized system, an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen. The method may then include causing, by the
computerized system, at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
[0011] In some embodiments, the method may include detecting a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen. In an example, the object may correspond to a multipage application displayed on the display screen and the property of the object may correspond to a page number in the multipage application. In an embodiment, the method may include changing the page number of the multipage application in response to the finger swipe.
[0012] In some embodiments, the method may include detecting a finger swipe of a plurality of fingers of the user on the first touchpad when the user interacts with the object on the display screen. In an example, the method may include changing the page number by a pre-determined number of pages in response to the finger swipe of the plurality of fingers. In some examples, the method may include detecting a velocity of the finger swipe and changing the page number by the pre-determined number of pages in response to the velocity. In some examples, the method may include determining a first direction of the finger swipe and incrementing the page number by a pre-determined number of pages in response to the first direction. In some examples, the method may include determining a second direction of the finger swipe and decrementing the page number by a pre-determined number of pages in response to the second direction. In an example, the first direction may be different from the second direction.
[0013] In accordance with some embodiments, the method may include detecting an interaction of at least a second finger of the user with the first finger. In some examples, the second finger may be located on a second touchpad. In some examples, the second touchpad may be located in a location different from the first touchpad. In some examples, the second finger may also be located on the display screen. In some embodiments, the method may include identifying a first position of the first finger on the first touchpad, detecting a selection of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the first position of the first finger relative to the second position of the second finger. In some examples, the method may include detecting the selection of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some embodiments, the method may include rotating the object based on the detected movement, altering an axis of rotation of the object based on the detected movement, altering the size of the object based on the detected movement, altering a display characteristic of the object such as color, size, font and the like based on the detected movement or moving the object based on the detected movement.
[0014] In accordance with at least some embodiments, the method may include identifying a first position of the first finger on the display screen, detecting a selection of a point of rotation of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the second position of the second finger relative to the first position of the first finger. In some examples, the method may include detecting the selection of the point of rotation of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some examples, the method may include moving the object around the point of rotation of the object based on the detected movement. In some examples, the object may correspond to a virtual joystick and the method may include pushing the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
[0015] In accordance with at least some embodiments, the method may include detecting a selection of the object, identifying a first contact position of the first finger, detecting a change in a characteristic of the first contact position and causing the at least one property of the object to be controlled based on the change in the characteristic. In an embodiment, the method may include detecting a movement of the first finger away from the first contact position, detecting a change in the angle of the first finger in the first contact position, detecting an increase in a touch area of the first contact position and the like. In an example, the characteristic may comprise at least one of the area, the size or the shape of the first contact position. In an embodiment, the method may include applying a corresponding pressure or a load to the displayed object based on the change in the characteristic of the first contact position.
[0016] In accordance with at least some embodiments, the method may include detecting a movement of the first finger from a first position to a second position on the first touchpad, repositioning the object in the display screen in accordance with a direction of the movement of the first finger or an amount of movement of the first finger from the first position to the second position and enabling, for the user, an interaction with the re-positioned object. In an example, the object may correspond to a set of virtual keys in a virtual control region in the display screen.
[0017] In accordance with another embodiment, a computer-readable storage medium is disclosed. The computer-readable storage medium comprises instructions to obtain first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor, transmit the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen, analyze the first data in accordance with a model of a human hand and assign the first data to at least one of a plurality of fingers of the model, detect an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen and cause at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
[0018] In accordance with another embodiment, a system for controlling a control region on a display screen of a computerized system is disclosed. The system obtains first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor, transmits the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen, analyzes the first data in accordance with a model of a human hand and assigns the first data to at least one of a plurality of fingers of the model, detects an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen and causes at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] Figure 1 depicts a simplified exemplary front panel view of the handheld
computerized device displaying the position and motion of the user's fingers holding the back panel, in accordance with one embodiment of the present invention.
[0020] Figure 2 depicts a simplified exemplary back panel view of the handheld computerized device depicted in Figure 1 , in accordance with one embodiment of the present invention.
[0021] Figure 3 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying a multitude of groups of keys, in accordance with one embodiment of the present invention.
[0022] Figure 4 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying the position and motion of the fingers holding the back panel and the multitude of groups of keys depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
[0023] Figure 5 depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of at least one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention.
[0024] Figure 6A depicts a simplified exemplary front panel view of the smaller handheld computerized device depicted in Figure 5 displaying the position and motion of at least one user's finger in contact with the touchpad of the back panel at the touchpad touch points and a multitude of groups of virtual keyboard keys similarly depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention.
[0025] Figure 6B depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention. The position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown. Here this virtual keyboard was previously software aligned to correspond to the direction of the user's fingers and hand.
[0026] Figure 6C depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention. The position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown. Here the keys were previously software aligned to correspond to the direction of the user's fingers and hand, and the spacing between the keys has also been user adjusted by software.
[0027] Figure 6D depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
[0028] Figure 7 depicts a simplified exemplary front panel view of the handheld computerized device displaying another embodiment of the layout of virtual keys as the standard virtual keyboard, in accordance with one embodiment of the present invention.
[0029] Figure 8 depicts a simplified exemplary block diagram of a computerized system capable of executing various embodiments of the invention, in accordance with one embodiment of the present invention.
[0030] Figure 9 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be calibrated and adapted to help turn the raw touchpad data into an accurate model of the user's hand and finger positions, in accordance with one embodiment of the present invention.
[0031] Figure 10 depicts a simplified exemplary flowchart of how predictive typing methods may be used to improve the accuracy of the appearance of the virtual hand and fingers while typing, in accordance with one embodiment of the present invention.
[0032] Figure 11 depicts a simplified exemplary flowchart of how dynamic changes in touchpad sensitivity may, for finger proximity touchpads, assist in highlighting the virtual keys about to be struck by a user while typing on the virtual keyboard, in accordance with one embodiment of the present invention.
[0033] Figure 12 depicts a simplified exemplary flowchart for generating images of the virtual hand and fingers on the device's graphics display screen, in accordance with one embodiment of the present invention.
[0034] Figure 13 depicts a simplified exemplary biomechanical and/or anatomical model of the human hand, showing the internal skeletal structure with a skin overlay, in accordance with one embodiment of the present invention.
[0035] Figure 14 depicts how the simplified exemplary user's hand or hands may be photographed by the device's camera or other camera, and this image information may be used to refine the default parameters of the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
[0036] Figure 15 depicts how an exemplary device camera may be used to obtain a partial image of the user's hand while using the device's touchpad, and this information is also used to update and refine the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention.
[0037] Figures 16A - 16B depict how a simplified exemplary palm angle rotation
transformation may help the system relate raw touchpad data to a standard biomechanical and/or anatomical model of the human hand, in accordance with one embodiment of the present invention.
[0038] Figure 17 depicts more exemplary details of the relationship between the finger roots and the hand's overall palm angle, in accordance with one embodiment of the present invention.
[0039] Figure 18 depicts more exemplary details of the relationship between the hand's palm direction or palm angle and the tips of the user's fingers, in accordance with one embodiment of the present invention.
[0040] Figure 19 depicts how simplified exemplary biomechanical and/or anatomical model data pertaining to the width of the fingers may be used to help interpret raw touchpad data, in accordance with one embodiment of the present invention.
[0041] Figure 20 depicts how in a more accurate exemplary model, the location of the various finger roots will be displaced to some extent from the palm line (which forms the palm angle) by various amounts δπ, in accordance with one embodiment of the present invention.
[0042] Figure 21 depicts how the simplified exemplary system may attempt to correlate detected fingertip data from some fingers with finger root data from other fingers, determine that some fingertip data is missing, and thus deduce that these fingers are elevated above the touchpad, in accordance with one embodiment of the present invention.
[0043] Figure 22 depicts how the simplified exemplary system may further assign raw touchpad data to two different hands of the same user, based on the assumption that the range of possible hand angles for the same user is limited by the user's anatomy, in accordance with one embodiment of the present invention.
[0044] Figure 23 depicts a first simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
[0045] Figure 24 depicts a second simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention.
[0046] Figure 25 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention.
[0047] Figure 26 depicts a simplified exemplary flowchart of a "lift and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0048] Figures 27A - 27F depict a series of simplified exemplary display screen shots of the "lift and tap" technique of key entry depicted in Figure 26 being used to type the first two letters of a "Hello World" message on the computerized system, in accordance with embodiments of the present invention.
[0049] Figure 28 depicts a simplified exemplary flowchart of a "lift and drag" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0050] Figure 29 depicts a simplified exemplary flowchart of a "lift and tap" technique of key entry modified to use force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0051] Figure 30 depicts a simplified exemplary flowchart of a modified "lift and tap" technique of key entry modified to use a third force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0052] Figures 31 A - 3 IB respectively depict simplified exemplary side and top views of a portion of the touchpad using the contact area resulting from a first force, in accordance with one embodiment of the present invention.
[0053] Figures 32A - 32B respectively depict simplified exemplary side and top views of a portion of the touchpad using the contact area resulting from a second force, in accordance with one embodiment of the present invention.
[0054] Figure 33 depicts a simplified exemplary flowchart of a "push and lift" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0055] Figure 34 depicts a simplified exemplary time-line of the "push and lift" technique depicted in Figure 33, in accordance with one embodiment of the present invention.
[0056] Figure 35 depicts a simplified exemplary flowchart of a "hover and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention.
[0057] Figures 36A - 36F depict a series of simplified exemplary display screen shots of the "hover and tap" technique of key entry depicted in Figure 35 being used to type and enter the numeral "0" on the computerized system, in accordance with embodiments of the present invention.
[0058] Figure 37 depicts a simplified exemplary flowchart of a method 3700 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
[0059] Figures 38A-38F depict a series of simplified exemplary illustrations, 3800, 3802, 3804, 3806, 3808 and 3810, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with embodiments of the present invention.
[0060] Figures 39A-39F depict a series of simplified exemplary illustrations, 3900, 3902, 3904, 3906, 3908 and 3910, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with another embodiment of the present invention.
[0061] Figure 40 depicts simplified exemplary illustrations, 4000 and 4002, that indicate the manner in which a computerized system (e.g., the handheld computerized device 100) may interpret a single finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with an embodiment of the present invention.
[0062] Figure 41 depicts simplified exemplary illustrations, 4100 and 4102, that indicate the manner in which a handheld computerized device may interpret a multiple finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with another embodiment of the present invention.
[0063] Figure 42 depicts exemplary illustrations, 4200, 4202 and 4204 of the manner in which a computerized system (e.g., the hand held computerized system, 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
[0064] Figure 43 depicts exemplary illustrations, 4300 and 4302 of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
[0065] Figure 44 depicts exemplary illustrations, 4400, 4402 and 4404 of the manner in which a computerized system (e.g., the handheld computerized system 100) may detect finger gestures from a user using multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
[0066] Figure 45 depicts exemplary illustrations, 4500 and 4502 of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures of a user from multiple touch pads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention.
[0067] Figure 46 depicts a simplified exemplary flowchart of a method 4600 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0068] The embodiments of the present invention relate to a handheld computerized device including a bit mapped display screen on the front panel, and a touchpad installed on the back panel, side panel, or other area other than that of the display screen. More particularly, the embodiments of the present invention relate to a method and graphical user interface that enable the user to see the user's finger position and motion from behind the device or other portion of the device superimposed upon a virtual keyboard layout on the front panel.
[0069] It is therefore desirable to have a more efficient and user-friendly way to do user input for handheld computerized devices. The embodiments of the present invention present an effective solution for these above problems. The embodiments of the present invention free the original keyboard space on the front panel for applications by utilizing the previously mostly unused back panel space for user input. The embodiments of the present invention are able to handle both keyboard input and mouse input. The embodiments of the present invention present a stunning graphic user interface on the front panel screen where a user may see the real-time position and motion of his/her fingers holding the back panel, on top of the display of keyboard layout, which is also referred to as a virtual keyboard. The embodiments of the present invention are more precise than current touch screen keyboards by removing the display layer that presently exists between the fingers and touch pad. The embodiments of the present invention also move the user's fingers away from the front panel, so that the user's fingers will not block the view of the soft key or area that the finger is presently operating on. For smaller handheld devices, such as a cell phone, iPhone™, or iPad™, the hand that holds the device may now also do input, hence freeing the other hand for other activities.
[0070] Thus an object of the embodiments of the present invention is to provide a method for a more efficient and user-friendly user input for a handheld computerized device.
[0071] Another object of the embodiments of the present invention is to free up the space currently occupied by the keyboard on the front panel of small electronic devices, and utilize the mostly unused space on the back panel of the handheld devices for user input purposes.
[0072] Another object of the embodiments of the present invention is to present a visually compelling user-interface design that enables the real time position and motion of the fingers that
hold the device, which normally would be hidden from view by the device itself, to be displayed on the front panel as "virtual fingers" together with an optional display of a virtual keyboard layout. The user's finger positions and keyboard layout may be displayed either as a background image, or as a transparent layer on top of some or all of the applications currently running on the handheld device. These semi-transparent representations of the user's finger positions and virtual keyboard allow the user to easily enter data while, at the same time, continuing to allow the user unimpeded access to the various applications running on the handheld device. Thus, for example, applications originally written for a computer device that had a physical keyboard may be easily run, without code modification, on a tablet computer device that lacks a physical keyboard. Thus, these virtual semi-transparent keyboards and methods that also give information of finger motion of the user may be highly useful.
[0073] Another object of the embodiments of the present invention is to enable the hand that is holding the device to also do user input operations, hence freeing the other hand for other inputs or other purposes.
[0074] According to one embodiment, a device and method include a display screen on the front panel, which may be a bit-mapped display screen, a touchpad embedded on the back panel capable of sensing the user's finger positions and motion, and a graphical user interface. This graphical user interface will normally include both software and optional graphics acceleration hardware to enable complex graphics to be rapidly displayed on the display screen. The device also has an optional virtual keyboard processor that displays the keyboard layout, as well as computes and displays the user's finger positions on a real-time basis. The user's finger position and motion on the touchpad of the back panel may thus be computed and displayed on the front display screen as a layer, which may be a semi-transparent layer, on top of all of the other applications. The virtual keyboard processor may also interpret the finger motions, i.e. strokes, and invoke corresponding operations based on the known location of the finger position on the keyboard.
[0075] Unlike previous approaches, the user's fingers do not need to be constrained to fit onto particular regions of the touchpad, but rather may be disposed in any arbitrary location. Unlike some previous approaches, although embodiments of the invention may be aided to some extent by real-time video that may provide video information pertaining to at least some portion of the user's hand, visualization of the user's fingers, in particular the tips of the user's fingers, is not necessary. This makes it feasible for handheld device video cameras designed for general photographic purposes to be used to help in visualizing the user's hand, without requiring that much of the user's hand in fact be photographed. There is no requirement at all that the user's fingertips be photographed while operating the device.
[0076] Figure 1 depicts a simplified exemplary front panel view of a handheld computerized device (100) displaying the position and motion of the user's fingers (108) holding the back panel, in accordance with one embodiment of the present invention. The user is holding the handheld electronic device (100), similar to an Apple iPad™ or equivalent pad device. The front panel of the device is occupied by a large graphics display screen (102), which may be a bitmapped graphics display screen. In some embodiments, the whole front panel screen or front panel may be occupied by this graphics display screen (102). The user is holding the handheld computerized device (100) using his or her hands (104), where a portion of the user's thumb (106) is in front of the device over a portion of the front panel, and the user's fingers (108) are behind the device. Although device (100) is not transparent, nonetheless the graphics display screen (102) is shown displaying a graphical representation of the user's fingers (108) as well as regions where the user's fingers are apparently touching an obscured-from-view or "invisible" surface at touchpad touch points (110) at the back panel of the device. Each of the touchpad touch points (110) may correspond to a real time finger print image of the tip of the user's finger.
[0077] Figure 2 depicts a simplified exemplary back panel view of the handheld computerized device (100) depicted in Figure 1, in accordance with one embodiment of the present invention. In contrast to the front panel of device (100), previously depicted in Figure 1, which included a large graphics display screen, the back panel of the handheld computerized device as depicted in Figure 2 does not include a large graphics display screen, but instead includes a large touchpad (200). As may be seen, the user's fingers (208) may now be seen positioned above the touchpad with the tips of the user's fingers (210) touching the touchpad. It is understood that the expression "above the touchpad" refers to the relative position of the fingers with respect to the touchpad when the touchpad is facing upward. [0078] Note that in some embodiments, this back touchpad may be provided as a retrofit or add-on to a handheld computerized device that otherwise lacks such a back touchpad. Such methods and systems, such as "clip on" back touchpads, are described at more length in parent application 13/223,836, the contents of which are incorporated herein by reference in its entirety.
[0079] Figure 3 depicts a simplified exemplary front panel view of the handheld computerized device depicted in Figure 1 displaying a multitude of groups of keys (300, 302, 304), in accordance with one embodiment of the present invention. Figure 3 depicts one possible optional multitude of groups of keys, i.e. a "virtual keyboard," being displayed on graphics display screen (102) of device (100). In this example, the "virtual keyboard" includes a symbol keypad (300), a numeric keypad (302), and a QWERTY keypad (304). Note that in many embodiments, the keys may be drawn in outline or semi-transparent form so as not to obscure any other graphical applications running on graphics display screen (102).
[0080] The scheme depicted in Figure 3 allows the user to optionally use a touchpad keypad on the back of the device to input keystrokes and mouse actions, and these inputs will be reflected on the display screen on the front of the handheld computerized device as "virtual fingers" or equivalent. As previously discussed, this virtual keyboard layout, displayed on graphics display screen (102) at the front panel may be a standard or modified QWERTY keyboard or keypad, a numeric keyboard or keypad, i.e. number entry keyboard, or alternatively some less standard keyboard or keypad such as a musical keyboard, a Qwerty, Azerty, Dvorak, Colemak, Neo, Turkish, Arabic, Armenian, Greek, Hebrew, Russian, Moldovan, Ukrainian, Bulgarian, Devanagari, Thai, Khmer, Tibetan, Chinese, Hangul (Korean), Japanese, or other type of keyboard. Often, this keypad will be a semi-transparent keypad in order to allow the user to continue to view various application programs that are running on display screen (102) below the virtual keyboard.
[0081] Figure 4 depicts a simplified exemplary front panel view of the handheld computerized device (100) depicted in Figure 1 displaying the position and motion of the user's fingers (108) holding the back panel and the multitude of groups of keys (300, 302, 304) depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention. Figure 4 depicts an example of how a user, typing on a touchpad mounted on the back of the electronic device, may see a graphical representation of his or her fingers (108) displayed on graphics screen (102) of device (100), as well as a display of virtual keyboard layout (300, 302, 304). The user's ability to enter input data to the handheld computerized device (100) is thus enhanced because the user may visually judge the distances between his or her fingers (108) and the keypad keys of interest (300, 302, 304) and move his or her fingers appropriately so as to hit the desired key. The user may also click on hyperlinks, such as link1, link2, and the like, or other clickable objects or command icons.
[0082] Because the user's operating fingers are moved away from the display screen, the fingers will not block the view of the display screen's soft keys, soft buttons, links or other areas. These areas on the display screen may now be seen more precisely, which in turn allows for more precise operation of the device.
[0083] The virtual display of the user's fingers may be a valuable feature for some of the newer tablet computers, such as the Microsoft Surface™ series, Windows 8, and the like, which may alternate operating modes between a first tablet operating mode designed for traditional touch input, and a second desktop operating mode, derived from legacy desktop operating systems, that is optimized for more precise mouse input. By enabling such tighter control, it becomes more feasible for a user to operate such "Surface" like devices in legacy desktop mode without the need to use a mouse or other hand operated pointing instrument.
[0084] Because a front keyboard, i.e. mechanically actuated keys, is no longer necessary, the embodiments of the present invention free up the space on the device that might otherwise have been used for original mechanical keyboard space on the front panel, and create room for additional larger displays and applications. The embodiments of the present invention make use of the presently mostly unused back panel space, thus enabling the front display to show substantially larger virtual keys, or virtual keys including more space between them that are easier for the user to use.
[0085] The embodiments of the present invention may create compelling visual effects, as well as useful visual effects, because the user may see his or her fingers (108), which are holding the back panel and thus normally blocked from view, being virtually displayed on the front panel along with a virtual, i.e. computer generated, keyboard layout display (300, 302, 304). Because
both the user's finger position, finger touch area, each depicted as a circle surrounding a cross, finger motion and the virtual keyboard are visible from the front panel, the user finger inputs on the touch panel located on the back panel of the device are both intuitive and easy to use. There will be no learning curve, and no need for special training. The user input methods of the embodiments of the present invention are more precise than traditional touch screen keyboards because these methods remove the obscuring layer between the finger and touchpad, and the operating fingers will not block the view of the display screen. For small handheld devices such as cell phones and iPhones, the current embodiments of the present invention enable the hand that holds the device to perform text input and other commands, hence freeing the other hand for other activities.
[0086] Note that although often a virtual keyboard will be presented, alternative data entry points of interest, such as hyperlinks on an internet browser, and the like, may also be used according to these methods as well.
[0087] In one embodiment, the layout of a multitude of groups of virtual keyboard keys (300, 302, 304), including numbers, letters, and symbols may be displayed on an area separated from concurrently running other software applications that are being displayed simultaneously on the screen of the front panel (much like the traditional separately displayed area often used for soft keys near the bottom of the display screen). The virtual keyboard keys (300, 302, 304) may be advantageously displayed in different sizes or in locations that are not the same locations that are determined by the other software applications and/or programs because the virtual keyboard keys (300, 302, 304) may be displayed translucently so as to display both the virtual keyboard keys (300, 302, 304) and the underlying concurrently running application or program display content.
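By way of a non-limiting illustration only, such a translucent key layer may be understood as a conventional "source over" alpha blend of the key graphics onto the underlying application content. The short Python sketch below is not taken from this disclosure; the key color, the 30% opacity value, and the function name are illustrative assumptions.

```python
def blend_over(app_rgb, key_rgb, key_alpha):
    """Standard 'source over' alpha blend of a translucent key pixel onto an
    opaque application pixel.  key_alpha is 0.0 (invisible) .. 1.0 (opaque)."""
    return tuple(round(key_alpha * k + (1.0 - key_alpha) * a)
                 for k, a in zip(key_rgb, app_rgb))

# Hypothetical example: a gray virtual key drawn at 30% opacity over a white
# region of the running application; both the key outline and the underlying
# application content remain visible.
app_pixel = (255, 255, 255)   # application content underneath
key_pixel = (80, 80, 80)      # assumed virtual key color
print(blend_over(app_pixel, key_pixel, 0.3))   # a light gray blend
```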
[0088] Devices and systems utilizing the virtual fingers and optional virtual keyboard embodiments of the present invention advantageously need not have mechanically actuated and/or permanently dedicated physical QWERTY keypads or QWERTY keyboards, or any other type of mechanically actuated and/or permanently dedicated physical keypad or keyboard such as one dedicated to number entry. Eliminating mechanically actuated and/or permanently dedicated keypads or keyboards improves device ergonomics, allows for larger graphics display screens, and also reduces device costs. [0089] Figure 5 depicts a simplified exemplary front panel view of a smaller handheld computerized device (500) displaying the position and motion of at least one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention. Smaller handheld computerized device (500) may include a cellular phone sized device (e.g. an Apple iPhone™ sized device) including a smaller graphics display screen (502) virtually displaying the position and motion of a multitude of fingers (108) in contact with the touchpad touch points (110) at the back panel of smaller handheld computerized device (500).
[0090] Figure 6A depicts a simplified exemplary front panel view of the smaller handheld computerized device (500) depicted in Figure 5 displaying the position and motion of at least one user's finger (108) in contact with the touchpad of the back panel at touchpad touch points (110), and a multitude of groups of virtual keyboard keys (300, 302, 304) similarly depicted in Figure 3 at the same time, in accordance with one embodiment of the present invention. Figure 6A may include features similar to those of Figure 4, with the exception of smaller handheld computerized device (500) being held in just one user hand (104), the other hand of the user being free to do other tasks.
[0091] Adjusting the virtual keyboard
[0092] Because the virtual keyboard (or keypad) is software generated, it need not always be presented in the same position. However, in embodiments of the invention, at least temporary persistence of the various keys of the virtual keyboard is desirable, so that the user always knows the relative location of the key that they are going to strike, and so that the system can accurately compute the relative distance between the user's various fingers (as predicted by the anatomical and biomechanical model of the user's hand) and the various keys.
[0093] In some embodiments, it may be useful to allow the user to adjust the position, orientation, and spacing between the virtual keys of the virtual keyboard either prior to beginning a typing session, or even during a typing session. Here, for example, a user may indicate to the system by keypress, voice command, virtual key selection, other touchpad input, etc. that virtual keyboard repositioning is desired. The system may then use various options to reposition the virtual keyboard. [0094] In some embodiments, the virtual keyboard will be essentially allowed to float on the screen, and the user can then rotate the virtual keyboard, stretch and shrink it, change key spacing, etc., by multi-touch type commands either on a front display panel touchpad, a back panel touchpad, or other touchpad device.
[0095] In other embodiments, the user may control the position, orientation, and spacing between the virtual keys of the virtual keyboard by verbal commands such as "rotate right 30 degrees", or "go up one inch", and so on.
[0096] In some embodiments, the position and orientation of the virtual keyboard can be set to track the position and orientation of the user's hand(s). In embodiments of the invention, design tradeoffs are taken into consideration. If the position and orientation of the virtual keyboard tracks the position and orientation of the user hand(s) too closely, then the ability of the software to determine which virtual key the biomechanical and anatomical model of the user's hand is striking may be reduced. Thus, in some embodiments, it may be useful to set the virtual keyboard generation software to track a time averaged position and orientation of the user's hand (usually over periods of at least several seconds, or even minutes).
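As a sketch only, such time-averaged tracking could be realized with an exponential moving average of the estimated hand pose, so that the keyboard follows the hand over seconds rather than per touch frame. The class interface, the five-second time constant, and the use of a single orientation angle below are illustrative assumptions, not elements of this disclosure.

```python
import math

class KeyboardAnchor:
    """Slowly varying estimate of hand position/orientation used to place the
    virtual keyboard (illustrative sketch; angle wraparound handling omitted)."""

    def __init__(self, time_constant_s=5.0):
        self.time_constant_s = time_constant_s   # assumed smoothing window
        self.x = self.y = self.angle = None

    def update(self, hand_x, hand_y, hand_angle, dt_s):
        if self.x is None:                       # first sample initializes the anchor
            self.x, self.y, self.angle = hand_x, hand_y, hand_angle
            return self.x, self.y, self.angle
        # Exponential moving average: small alpha means very slow tracking.
        alpha = 1.0 - math.exp(-dt_s / self.time_constant_s)
        self.x += alpha * (hand_x - self.x)
        self.y += alpha * (hand_y - self.y)
        self.angle += alpha * (hand_angle - self.angle)
        return self.x, self.y, self.angle
```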
[0097] In other embodiments, the virtual keyboard generation software can be set to initialize the position and orientation of the virtual keyboard based on the position and orientation of the user's hand during a specified time. This can be as simple as having the user place his or her hand(s) on the back touchpad and giving an "initialize keyboard orientation and position" command, either verbally, by pressing a real or virtual key, or by other activation system. Once so initialized, the virtual keyboard can then maintain its position and orientation until the user then decides to reset it.
[0098] Figure 6B depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention. The position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown. Here this virtual keyboard was previously software aligned to correspond to the direction of the user's fingers and hand.
[0099] Figure 6C depicts a simplified exemplary front panel view of a smaller handheld computerized device displaying the position and motion of one finger in contact with the touchpad of the back panel, in accordance with one embodiment of the present invention. The position of a small virtual keyboard, composed of a plurality of groups of keys, is also shown. Here the keys were previously software aligned to correspond to the direction of the user's fingers and hand, and the spacing between the keys has also been user adjusted by software.
[0100] Figure 6D depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to control a virtual keyboard on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention. Figure 6D depicts that the method includes obtaining data from a touchpad, the data being associated with the location and movement of a finger and/or hand of a user when the user operates the computerized system using the touchpad, the data not being associated with an image of the finger of the user from an image sensor (620). The method further includes communicating the data from the touchpad to the computerized device, the touchpad being located in a location that is different from the location of the display screen (622). The method further includes analyzing the data in accordance with a model of a human hand, and assigning the data to at least one of a plurality of fingers of the model (624), computing a graphical representation of at least one finger of the user in accordance with the model (626), generating a virtual keyboard on the display screen (628), and repositioning the virtual keyboard according to either a verbal command from the user or a user input from the touchpad (630).
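Purely for illustration, steps 620 through 630 might be organized in software roughly as in the following Python sketch; the helper objects (touch_frame, hand_model, display) and the verbal command string are hypothetical placeholders rather than elements defined by this disclosure.

```python
def handle_touch_frame(touch_frame, hand_model, display, voice_command=None):
    """One pass through the method of Figure 6D (steps 620-630), sketched with
    hypothetical interfaces for the touchpad data, hand model, and display."""
    # (620)/(622): touchpad data (no camera image) arrives from the rear pad.
    contacts = touch_frame.contacts          # e.g. list of (x, y, pressure)

    # (624): analyze against the hand model and assign contacts to fingers.
    assignments = hand_model.assign_contacts(contacts)

    # (626): compute a graphical representation of at least one finger.
    finger_outlines = hand_model.render_outlines(assignments)

    # (628): generate (or refresh) the virtual keyboard layer.
    keyboard_layer = display.generate_virtual_keyboard()

    # (630): reposition the keyboard on a verbal command or touchpad gesture.
    if voice_command == "initialize keyboard orientation and position":
        keyboard_layer.move_to(hand_model.estimated_hand_pose())

    display.compose(keyboard_layer, finger_outlines)
```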
[0101] Figure 7 depicts a simplified exemplary front panel view of handheld computerized device (100) depicted in Figure 1 displaying another embodiment of the layout of virtual keys (700) as the standard virtual keyboard, in accordance with one embodiment of the present invention. Figure 7 depicts another embodiment of the layout of virtual keys (700), which may include a modified QWERTY keyboard or keypad that includes splitting the keyboard in half and displaying each half of the keyboard at an angle adapted for better ergonomic typing than the keyboards depicted previously.
[0102] In one embodiment, a computer-implemented method includes a handheld
computerized device, including a screen on the front of the device capable of displaying a graphical user interface, and a touch sensitive back panel or side panel or other area other than the display screen, and a user interface, such as a two dimensional touch sensor. For simplicity, this touch sensitive panel, which need not necessarily be flat, and need not necessarily be mounted on the back side of the device, is hereinafter also referred to as a "touchpad," "touch sensor," or "touch sensitive back panel", but this use is not intended to be limiting.
[0103] The touch sensor will determine the motion of the fingers in real time, and the computerized system's or device's software and processor(s) will use the touch sensor data to compute the real time position and motion of the user's fingers that are touching the touch sensor on the back panel. These "virtual fingers" will then be displayed on the device's graphical user interface on top of a static background where optionally a multitude of groups of keys, including numbers, letters, and symbols (e.g. a virtual keyboard) or hyperlinks may be displayed. By watching the motion of the user's virtual fingers on the virtual keyboard, the user may easily operate the device, and optionally determine precisely where to strike a finger in order to hit an intended virtual key.
[0104] In one embodiment, the back panel user interface (UI) may be outlined in a distinctive yet non-obstructive color and displayed as a transparent layer over the current applications; hence all the details of the current application and back panel UI are shown to the user at the same time.
[0105] In one embodiment, the real time position and motion of the fingers holding the back panel may be displayed on the screen of the front panel.
[0106] In one embodiment, the layout of a multitude of groups of keys, including numbers, letters, and symbols may be displayed on the screen of the front panel as the background of the real time position and motion of the fingers holding the back panel.
[0107] In one embodiment, the real time position and motion of the fingers holding the back panel may be displayed on the static background of a multitude of groups of keys, including numbers, letters, and symbols, enabling the user to precisely strike a finger on an intended key. [0108] In one embodiment, the display of the virtual hand may be creative and artistic. For example, the display may instead show a skeleton, an animal claw, a furry hand, a tattooed hand, and the like to achieve more compelling or amusing effects.
[0109] In one embodiment, a computer-implemented method including a handheld computerized device includes a touchpad installed on the back panel. The touchpad is able to sense the touch point positions, movement, and stroke motion data of a multitude of fingers. The finger motion information of one or a multitude of fingers, including the motion type, e.g., touch, movement, and stroke patterns, and the like, and motion position, is passed to a virtual keyboard processor, such as a computer processor. The virtual keyboard processor may analyze the finger motion, compare the finger positions with the registered position of the keys, hereinafter referred to as virtual keys, as well as the hyperlinks and other touch buttons of the application program, e.g., generically the "user entry area", and then will decide which item in the user entry area was stroked or actuated. The virtual keyboard processor may then invoke the corresponding operation. The virtual keyboard processor may also update the real time image of the fingers, or finger pads or touch points, or indeed the user hand(s) on the front screen after each finger motion.
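A minimal, non-limiting sketch of the comparison step performed by the virtual keyboard processor is given below: a finger stroke position is tested against the registered areas of the virtual keys and hyperlinks in the user entry area. The coordinate ranges and the specific labels are illustrative assumptions only.

```python
# Each registered user-entry item is (label, x_min, y_min, x_max, y_max) in
# normalized touchpad coordinates; the items listed here are only an example.
REGISTERED_ITEMS = [
    ("key:Q", 0.00, 0.00, 0.10, 0.20),
    ("key:W", 0.10, 0.00, 0.20, 0.20),
    ("link:link1", 0.30, 0.60, 0.55, 0.70),
]

def resolve_stroke(x, y, registered_items=REGISTERED_ITEMS):
    """Return the label of the key or hyperlink whose registered area contains
    the stroke position (x, y), or None if the stroke missed every item."""
    for label, x0, y0, x1, y1 in registered_items:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None

print(resolve_stroke(0.12, 0.05))   # -> "key:W"
```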
[0110] In one embodiment, the touchpad may be installed on the back panel of the handheld computerized device, and may be able to sense the touch, movement, and stroke motion of a multitude of user fingers.
[0111] In one embodiment, the information pertaining to the finger motion of a multitude of user fingers, including the motion type, e.g., touch, movement, and stroke action, and the like, as well as motion position, may be passed to a virtual keyboard processor.
[0112] In one embodiment, the virtual keyboard processor may analyze the finger motion, compare finger position with the registered position of the keys, determine which key was stroked, and invoke the corresponding operation.
[0113] In one embodiment, the virtual keyboard processor may update the real time position and motion of the fingers holding the back panel. [0114] One embodiment of the present invention includes a graphical user interface (GUI) for a handheld computerized device. The interface may include a display of a multitude of groups of keys, including numbers, letters, and symbols. The keys may be displayed on a graphical user interface on the front panel display screen, and indeed this display area may occupy the whole screen. Thereby, the content of the graphic user interface is not blocked by applications, and is shown together with the applications.
[0115] One embodiment of the present invention includes a graphical user interface for a handheld computerized device. This interface includes a display of the real time position and motion of the fingers holding the back panel. Here the display is on the front panel screen, and in fact may occupy the whole screen. Due to the advantages of this approach, the content of the user's finger position and motion is not blocked by applications, or by the display of groups of keys, including numbers, letters, and symbols.
[0116] One embodiment of the present invention includes a method of assisting user data entry into a handheld computerized device. This handheld computerized device includes at least one touchpad, in one embodiment being located on a side of the handheld computerized device that is behind the side of the device that holds the graphics display screen, at least one graphics display screen, at least one processor, memory, and software. Often, however, the handheld
computerized device will lack a mechanically actuated and/or permanently dedicated physical QWERTY keypad or keyboard, and may also lack a mechanically actuated and/or permanently dedicated physical numeric keypad or keyboard as well. The method will usually include displaying at least one data entry location on the at least one graphics display screen of the device. Often this at least one data entry location will be a graphical display of a keyboard or keypad that may be comprised of a multitude of data entry locations. Here, the system will use the touchpad to obtain data on the location and movement of the user's fingers and/or hand. The system may analyze the data on the location and movement of the user's fingers and/or hand according to a biomechanical and/or anatomical model of a human hand, and will assign data on the location and movement of the user's fingers and/or hand to specific fingers on this biomechanical and/or anatomical model of a human hand (usually the user's hand). The system may then use this biomechanical and/or anatomical model of the human hand to compute a graphical representation of at least the user's fingers, and frequently both the user fingers and the user hand(s). The system will then display the graphical representation of at least the user's fingers (and again frequently both the user's finger and hand), on the device's graphics display screen. Thus the distance between the graphical representation of the user's virtual fingers on the graphics display screen, and the virtual data entry location (such as the virtual keyboard) will give information that will help the user properly position his or her real fingers and/or hand on the touchpad, which in turn will facilitate data entry.
[0117] Figure 8 depicts a simplified exemplary block diagram of a computerized system 800S capable of executing various embodiments of the invention, in accordance with one embodiment of the present invention. Computerized system 800S includes software and hardware that may be used to implement one embodiment of the invention such as a front panel screen (804), a back panel touch pad (800), a virtual keyboard processor (802), an application process (806), and a device memory (808). Finger position and motion data are first collected from back panel touch pad (800), and then passed to virtual keyboard processor (802). The virtual keyboard processor (which will often be implemented by a combination of software and hardware such as a microprocessor, graphics processor, touchpad controller, and memory) displays the virtual finger position and motion together with the keyboard layout on front panel screen (804). The virtual keyboard processor also analyzes the finger position and motion information data, compares the data with the registered position of the keys (or hyperlinks) and invokes proper operation in application process (806). The keyboard position information may be programmed in a virtual keyboard process, or alternatively may be saved in system memory (808). The key-press or hyper-link information that the user intends to relay to the applications may be passed to the virtual keyboard controller either through memory, or alternatively through inter-process communications.
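The data flow of Figure 8 (touch pad 800, virtual keyboard processor 802, front panel screen 804, application process 806, and memory 808) could be organized along the lines of the following Python sketch; all class and method names here are hypothetical placeholders and are not part of this disclosure.

```python
class VirtualKeyboardProcessor:
    """Sketch of block 802: consumes touch frames, updates the display layer,
    and dispatches key/hyperlink events toward the application process."""

    def __init__(self, layout, display, application, memory):
        self.layout = layout            # registered key positions (cf. block 808)
        self.display = display          # front panel screen (block 804)
        self.application = application  # application process (block 806)
        self.memory = memory            # shared system memory (block 808)

    def on_touch_frame(self, frame):
        # Draw the semi-transparent fingers over the virtual keyboard layer.
        self.display.draw_overlay(fingers=frame.contacts, keyboard=self.layout)
        # Compare completed finger strokes with registered key positions.
        for stroke in frame.completed_strokes:
            key = self.layout.hit_test(stroke.x, stroke.y)
            if key is not None:
                # Relay the key-press via shared memory or inter-process
                # communication, then invoke the corresponding operation.
                self.memory.post_event(key)
                self.application.invoke(key)
```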
[0118] Other touchpad and screen locations.
[0119] In one embodiment, the display screen may be located at some distance from the touchpad. Indeed, the display screen and the touch pad may not even be physically connected at all. Rather the touchpad may transmit data pertaining to the user's hand position to a processor, which in turn may then generate the virtual image of the user's hand and display the virtual hand on the display screen, and the touchpad, processor, and display screen need not be physically connected (although they may be). For example, data pertaining to the user's hand and finger position relative to the touchpad may be transmitted by a wired, wireless, or optical (e.g.
infrared) method to the processor. The processor in turn may transmit the virtual image of the user's fingers and hand to the display screen by a wired, wireless, or optical (e.g. infrared) technique. As a result, the user's real hand will be moving close to a touchpad at a different place other than the current display screen. The display screen may thus be in nearly any location, such as on a regular monitor, TV screen, projector screen, or on a virtual heads-up eyeglass display worn by the user (e.g. a device similar to Google Glass).
[0120] Touch pads including non-flat surfaces.
[0121] Although touch pads are often flat and roughly rectangular devices, there is no constraint that the touch pads using embodiments of the present invention be either flat or rectangular. Indeed in some embodiments, there is advantage to employing touch pads that include variably shaped and curved surfaces. Such curved and/or variably shaped touch pads could then be placed on various non-traditional locations, such as on the surface of a ball or cylinder, on the surface of various common devices such as glasses frame stems for virtual heads-up displays such as windshields, eyeglasses, and the like, other wearable computerized devices such as smart watch bands, steering wheels - either for a vehicle or a game interface, joysticks, and the like, and/or dashboards, instrument panels, and the like.
[0122] Touchpad technology.
[0123] In principle, many different types of touchpad technology may be used for this device, including capacitive sensing, conductance sensing, resistive sensing, surface acoustic wave sensing, surface capacitance sensing, projected capacitance sensing, strain gauges, optical imaging, dispersive signal technology, acoustic pulse recognition, pressure sensing and bidirectional screen sensing. However, in a preferred embodiment, touchpad sensing technology that is capable of sensing multiple finger positions at the same time may be used. Such an ability to sense multiple finger positions or gestures at the same time is hereinafter also referred to as "multi touch" or "multi-touch" sensing technology. Touchpads are thus distinguished from previous mechanical keyboards or keypads because touchpads are not mechanically actuated, that is, since the surface of a touchpad is substantially rigid and responds to touch instead of a mechanical deflection, the touchpad gives the user substantially no indication that the immediate surface of the touchpad moves where touched, except perhaps for the entire rigid touchpad moving as a result, even with pressure sensitive touchpad technology. Touchpads are further distinguished from previous mechanical keyboards or keypads because the shape and/or location of input keys or buttons on a touchpad are not fixed because the keys and/or buttons are instead displayed on an electronically controlled screen with the flexibility of software control and not limited by fixed mechanical elements located on the device.
[0124] One example of a multi-touch touchpad embodying the present invention may use a touch sensing device commercially available from Cypress Semiconductor Corporation, San Jose, California, and commonly known as the Cypress TrueTouch™ family of products. This family of touchpad products works by projective capacitive technology, and is suited for multi-touch applications. The technology functions by detecting the presence or proximity of a finger to capacitive sensors. Because this touchpad system senses finger proximity, rather than finger pressure, it is well suited to multi-touch applications because, depending upon the tuning of the capacitance detection circuit, various degrees of finger pressure, from light to intense, may be analyzed. Although often used on touch screens, the projective capacitive technology method may function with a broad range of substrates.
[0125] Virtual finger and hand position software (virtual keyboard processor)
[0126] As others have noted, one problem with attempting to create "virtual fingers" is that at best, usually just certain regions of the hand, such as the fingertips and perhaps the palms, may usually be detected by conventional multi-touch sensors. To overcome this issue, bootstrapping from hand-position estimates has been suggested, which overcomes the invisibility of structures that link fingertips to palms. Suitable algorithms could be obtained by using context-dependent segmentation of the various proximity image constructs, and by parameterizing the pixel groups corresponding to each distinguishable surface contact. It was found that by path-tracking links across successive images, those groups which correspond to the same hand part could be determined, and it was possible to reliably detect when individual fingers touched down and lifted from the multi-touch pad surface. A number of different combinatorial optimization algorithms have been proposed that use biomechanical constraints and anatomical features to associate each contact's path with a particular fingertip, thumb, or palm of either hand. Such algorithms further operated by assigning contacts to a ring of hand part attractor points, using a squared-distance cost metric, to effectively sort the contact identities with respect to the ring of hand part attractor points.
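The squared-distance assignment of contacts to a ring of hand part attractor points summarized above can be sketched as a small combinatorial optimization. In the illustrative Python code below, a brute-force search over permutations stands in for the more elaborate algorithms referenced, and the attractor coordinates are assumed values.

```python
from itertools import permutations

def sq_dist(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def assign_contacts(contacts, attractors):
    """Assign each detected contact to a distinct hand-part attractor point
    (thumb, fingertips, palm heels, ...) by minimizing the summed squared
    distance.  Brute force suffices for the handful of attractors per hand."""
    labels = list(attractors)
    best_cost, best_map = float("inf"), {}
    for perm in permutations(labels, len(contacts)):
        cost = sum(sq_dist(c, attractors[l]) for c, l in zip(contacts, perm))
        if cost < best_cost:
            best_cost, best_map = cost, dict(zip(perm, contacts))
    return best_map

# Illustrative attractor ring for one hand (normalized touchpad coordinates).
attractors = {"thumb": (0.2, 0.8), "index": (0.35, 0.3), "middle": (0.5, 0.25)}
print(assign_contacts([(0.36, 0.32), (0.21, 0.78)], attractors))
# -> {'index': (0.36, 0.32), 'thumb': (0.21, 0.78)}
```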
[0127] Software based on a skeletal linked model of the human hand, which creates a biology-based (biomechanical and/or anatomical) model of joint motion and an associated set of constraints, has been proposed. The skeletal linked model approach also is based on a software model of the skin that may stretch and bulge in order to accommodate this internal skeleton. The software models a natural joint axis for four different types of joints in the human hand, as well as takes into account the relative lengths of the underlying hand bone structure, and also accounts for the space occupied by the hand's muscles and skin.
[0128] Figure 25 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be used to display a virtual image of at least a portion of a hand of a user on a display screen of the computerized system of Figure 8, in accordance with one embodiment of the present invention. Figure 25 depicts that the flowchart includes obtaining data from a touchpad, the data being associated with the location and movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad (2510). The flowchart further includes communicating the data from the touchpad to the computerized device, the touchpad being located in a location that is different from the location of the display screen (2520). The flowchart further includes analyzing the data in accordance with a model of a human hand, and assigning the data to at least one of a multitude of fingers of the model (2530), computing a graphical representation of at least one finger of the user in accordance with the model (2540), and displaying the graphical representation on the display screen (2550).
[0129] Figure 9 depicts a simplified exemplary flowchart of how biomechanical models of hand and finger movement may be calibrated and adapted to help turn the raw touchpad data into an accurate model of the user's hand and finger positions, in accordance with one embodiment of the present invention. The system may work with adequate accuracy using standardized models of hand and finger relationships. For example, the system may perform adequately by an initial calibration step where the system invites the user to place his or her hand on the display screen, the system displays various sized hands, and the user is invited to enter in which standardized hand size best fits his or her own hands. The system may then use this data for its various calculations. Even more simply, the system may default to an average hand size for initial use, allowing some degree of functionality to be achieved with no preliminary calibration.
[0130] In one embodiment, it will be useful to better calibrate the system by employing one or more active calibration steps. These steps may refine the initial hand model under actual use conditions, and make appropriate adjustment to the various portions of the hand model as will best fit data that has been obtained under actual use conditions. An example of this active calibration process is shown in Figure 9. Here the system may invite the user to do an active calibration step, or alternatively the user will voluntarily start an active calibration step, in step (900). In one embodiment, the model includes calibration information in accordance with pressing a portion of the user's hand on the touchpad in a specified manner. To facilitate this calibration step, the system may optionally display one or more targets on the screen, which may be keyboard targets, or alternatively may be specially designed calibration targets specifically designed for the active calibration step. Optional photographic calibration steps are described for Figure 14.
[0131] In one embodiment referring to Figure 9, to reduce complexity, the system may optionally request that the user calibrate one hand at a time, and indeed may request that the user operate the fingers on his or her hand in a manner different from normal typing so as to gather additional data. For example, a user may be requested to first extend a specific finger to a maximum length and press, then to a minimum length and press, then to the extreme left and press, then to the extreme right and press and so on, potentially through all fingers and the thumb on a one at a time basis. It should be apparent that such a data set may then naturally be translated into a reasonably detailed model of that particular user's hand and its capabilities to maintain a number of different configurations. During the calibration process, the system will accumulate touch data by invoking touchpad sensing hardware and calibration software (902). The system will also make predictions as to the location of the user's hand and fingers by bootstrapping from various hand position estimates (904). Often the system will track the positions of the hand and fingers across successive time intervals to do the predicting, and compute probable finger paths (906). The system will often use its internal model of the user's hand biomechanical features and anatomical features to do the computing, and to help associate the various projected paths with the user's fingertips and thumb position, which at least during the active calibration process will be known (908). Here a path is understood to be the line or linkage between at least one finger root and the associated fingertip or touchpad touch point for the associated finger. The system will then refine its models of the user's hand biomechanical and/or anatomical features by comparing the predicted results with real data, and determine if its user hand model is working with sufficient accuracy in step (910). If it is, then this user hand model will then be adopted and used for subsequent user virtual keyboard data entry purposes (914). If the user hand model is not working with sufficient accuracy, then the system will attempt to adjust the hand model by varying one or more hand-model parameters (912), and often will then continue the calibration process until acceptable performance is obtained.
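Steps 902 through 914 amount to an iterative parameter fit. The following Python sketch compresses that loop; the error metric, the tolerance, and the hypothetical predict_tip() and adjust_parameters() methods are simplifying assumptions rather than elements of this disclosure.

```python
import math

def calibrate_hand_model(model, recorded_presses, tolerance=0.02, max_rounds=50):
    """Iteratively adjust hand-model parameters (e.g. finger segment lengths)
    until predicted fingertip positions match the recorded calibration presses
    to within `tolerance`.  A greatly simplified stand-in for steps 902-914."""
    for _ in range(max_rounds):
        # (904)-(908): predict where each fingertip should have landed.
        residuals = []
        for finger, (ox, oy) in recorded_presses:
            px, py = model.predict_tip(finger)
            residuals.append((finger, px - ox, py - oy))
        mean_error = sum(math.hypot(dx, dy) for _, dx, dy in residuals) / len(residuals)
        # (910)/(914): adopt the model once the fit is good enough.
        if mean_error <= tolerance:
            return model
        # (912): otherwise vary the hand-model parameters and repeat.
        model.adjust_parameters(residuals)
    return model   # best effort if the loop did not converge
```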
[0132] Thus the calibration software enables the biomechanical and/or anatomical model of the human hand to be calibrated more accurately, so as to match the biomechanical and/or anatomical characteristics of a particular user's fingers and/or hand.
[0133] In one embodiment, the realism of the simulated virtual fingers on the screen may optionally be facilitated by the use of predictive typing models. The predictive typing model approach will be particularly useful when the user is typing text on a virtual keyboard, because the system may scan the previous text that has been entered, and utilize a dictionary and other means, such as the statistical distribution of letters in the particular language, to make educated guesses as to what letter is going to be typed next. This educated guess may then be used to supplement the touchpad data as to last fingertip position and movement to tend to direct the appearance of the simulated finger towards the logical next key. Because this system will occasionally tend to guess wrong, however, the user may find it useful to adjust this predictive typing "hint" to various settings depending upon the user and the situation. Thus, a user who is an experienced touch typist and who tends to type both fairly quickly and fairly accurately will tend to find the predictive typing hints useful, because the predictive approach will tend to work well for this type of user. On the other hand, a user who is more of a slow and uncertain "hunt and peck" typist may find the predictive approach to be less useful, and may wish to either reduce the strength of the hint or potentially even turn the predictive typing "hint" off altogether. [0134] Figure 10 depicts a simplified exemplary flowchart of how predictive typing methods may be used to improve the accuracy of the appearance of the virtual hand and fingers while typing, in accordance with one embodiment of the present invention. In a predictive typing system, the software will first access both the biomechanical and/or anatomical model data for the user's hands (1000), and the latest fingertip and thumb position data from the touchpad sensors (1002). The system will then use this information to display the user's virtual hands and fingers on the device's display screen (1004). If a predictive typing mode is on (1006), then the system will attempt to deduce (based upon typing speed, as well as the user's consistency in typing speed, and context) what is the most probable letter or letters that the user is likely to type next. The system will also attempt to predict the most probable finger or fingers that the user will use to type this most probable letter (1008). For example, if the user is typing quickly and consistently, and the context of the word or sentence indicates that a vowel such as "e" is likely, then the system may use this factor in its analysis of the somewhat noisy finger position data from the touch sensor to increase the probability that the user's left index finger (often used to type "e" on a keyboard, and which in fact may not be registering on the touch pad because the user has lifted the left index finger to move to strike the "e" key), is moving towards the "e" key. When used properly, such predictive typing algorithms may help increase the illusion that the user is looking through the display and onto his or her hands below the display even though the
display/computerized device is not actually transparent. Conversely, if the predictive typing mode is turned "off" or set to reduced intensity (1010), then the system will not take the probable next letter into account in its display of the user's hand and fingers, and will instead display the fingers using just the virtual hand(s) model.
[0135] In one embodiment, the efficiency of the predictive typing may be further enhanced by incorporating the user's history of finger use for each particular key. For example, one user may have a strong tendency to use the right index finger to type the keys "H" and "J", and as another example the same user may have a tendency to use his or her left pinky finger to type the letters "A" and "Z". Here the system may observe the individual user's typing patterns over time, either as part of an initial calibration step, or later (and in one embodiment even continually) while monitoring the user's typing patterns, and use the user's individualized finger-to-letter correlation habits as part of the predictive typing algorithm. [0136] Thus the predictive typing software enables the computerized device to compute the graphical representation of at least the user's fingers, and often the user's fingers and hands, with better precision by additionally using keystroke predictions, in addition to the data on the location and movement of the user's fingers and/or hand obtained using the touchpad.
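One simple, non-limiting way to fold such letter and finger predictions into the noisy touch data is a weighted blend between the sensed fingertip position and the center of the most probable next key, as in the Python sketch below; the finger-to-letter table, the key coordinates, and the "hint strength" value are illustrative assumptions only.

```python
# Illustrative per-user history of which finger usually types which letter.
FINGER_FOR_LETTER = {"e": "left_index", "h": "right_index", "a": "left_pinky"}

def biased_finger_position(sensed_xy, finger, predicted_letter,
                           key_centers, hint_strength=0.3):
    """Nudge the displayed position of `finger` toward the predicted next key,
    but only if the user's history says this finger types that letter.
    hint_strength = 0 disables the hint; 1 snaps to the key center."""
    if FINGER_FOR_LETTER.get(predicted_letter) != finger:
        return sensed_xy
    kx, ky = key_centers[predicted_letter]
    sx, sy = sensed_xy
    return (sx + hint_strength * (kx - sx),
            sy + hint_strength * (ky - sy))

key_centers = {"e": (0.28, 0.15)}            # assumed layout coordinate
print(biased_finger_position((0.25, 0.20), "left_index", "e", key_centers))
```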
[0137] In one embodiment, in order to improve the realism of the virtual fingers, additional "finger hover" algorithms may also be used. As used in this specification, "finger hover" means highlighting or otherwise graphically altering the appearance of a virtual key on a virtual keyboard whenever the system believes that the user's finger is either hovering above that virtual key, or about to strike that virtual key. For this type of algorithm, use of touchpads that may sense relative finger proximity to the touchpad surface, such as projective capacitive technology touchpads, may be particularly useful.
[0138] The sensors and algorithms that detect relative finger-height above a surface may be tuned to various degrees of sensitivity, and indeed this sensitivity level represents an important engineering tradeoff. If the touchpad is tuned to too high a sensitivity, then it will tend to generate spurious (false) signals, and also lack precision as to precisely where on the touchpad a finger is about to land. If the touchpad is tuned to a lower sensitivity, then the touchpad will tend to detect only fingertips that are exerting a considerable amount of pressure on the touchpad surface.
[0139] Although many prior art touchpads tend to use a continual or fixed level of touchpad sensitivity at all times, in one embodiment for the "finger hover" option described in this specification, use of a dynamic or variable level of touchpad sensitivity may be advantageous. For example, to detect finger hovering above a key, a touchpad might first operate at a normal level of sensitivity until it detects that a fingertip within strategic striking distance of a particular key has left the surface of the touchpad. At this point, in order to detect "finger hover" above the key, the touchpad circuitry might temporarily reset its sensitivity to a higher level, designed to more precisely detect when the user's finger is hovering above the key. If the higher level of touchpad sensitivity detects the fingertip proximity, the key may be highlighted. If the higher level of touchpad sensitivity does not detect the hovering fingertip, then the key will not be highlighted. After a short period of time, on the order of a tenth of a second, the touchpad may then be reset to the normal level of sensitivity to more precisely determine if the finger has then actually touched the touchpad, or not.
[0140] Figure 11 depicts a simplified exemplary flowchart of how dynamic changes in touchpad sensitivity may, for finger proximity touchpads, assist in highlighting the virtual keys about to be struck by a user while typing on the virtual keyboard, in accordance with one embodiment of the present invention. In other words, Figure 11 depicts an example of an algorithm to detect and indicate "finger hover". Here the system displays the virtual keyboard (1100), as well as an overlay of the user's virtual fingers on or near this virtual keyboard (1102). When the system detects that a finger, suspected of being a finger about to press a key due to the finger's proximity to the key and/or predictive typing considerations, leaves the touchpad (most likely because the user has raised the finger above the touchpad in preparation for striking the virtual key) (1104), the system will momentarily turn the touchpad finger proximity detector to a higher level of sensitivity (1106), and the software will look to see if finger hover over the suspected key or keys may be detected (1108). If the system does not detect that a finger is suspected of leaving the touchpad, the system returns to step (1102). If a finger hover signal may be detected over the suspected key, then this key will be highlighted to help guide the user (1110). After a period of time that will not normally exceed about a tenth of a second, or if no finger hover is detected, the system will once again lower the sensitivity of the finger proximity detector down to the normal level (1112), in order to precisely detect if the finger is about to strike the key (1114). If the touchpad, now operating at normal sensitivity, now detects that the virtual key has been struck by the user, the system will appropriately indicate the keystroke on the virtual keyboard by further graphical changes to the key (1116) and optionally may issue an audible keypress or key-click sound as well to give further feedback to the user. Then the system may record the key strike (1118). If the appropriate finger press was not detected at (1114), then the system repeats the flow at step (1102).
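The dynamic sensitivity cycle of Figure 11 may be viewed as a small state machine, sketched below in Python; the touchpad controls (set_sensitivity, read_proximity, read_touch) and the key object are hypothetical interfaces assumed for illustration, and the one-tenth-second window follows the discussion above.

```python
import time

NORMAL, HIGH = "normal", "high"

def hover_and_tap_cycle(touchpad, key, hover_window_s=0.1):
    """Follow the Figure 11 loop for one suspected key: raise sensitivity to
    look for hover, highlight the key if hover is seen, then drop back to
    normal sensitivity to confirm the actual keystroke."""
    touchpad.set_sensitivity(HIGH)                  # step 1106
    hovering = touchpad.read_proximity(key.region)  # step 1108
    if hovering:
        key.highlight()                             # step 1110
    time.sleep(hover_window_s)                      # roughly a tenth of a second
    touchpad.set_sensitivity(NORMAL)                # step 1112
    if touchpad.read_touch(key.region):             # step 1114
        key.show_pressed()                          # step 1116
        return key.label                            # step 1118: record the strike
    return None                                     # otherwise back to step 1102
```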
[0141] More generally, the finger hover algorithm approach allows at least one data entry location (key) to be highlighted on the device's graphics display screen whenever the computerized device determines that at least one finger on the user's hand has left the touchpad, and the position and motion history of the finger is consistent with an ability of that finger to strike a position on the touchpad that is consistent with the location of the data entry location (key) on the graphics display screen.
[0142] Graphical representation of the user's human hand(s) and fingers.
[0143] Once the computerized device has obtained data from the touchpad, as well as any additional predictive typing data, hover detection method data, calibration data, and the like, and has updated its internal biomechanical and/or anatomical model of the user's hand or hands (including the fingers) to reflect this new data, then the system may utilize this biomechanical and/or anatomical model of the user's hand or hands to compute a graphical representation of at least the user's fingers, and often the user's hand and fingers, suitable for display on the device's graphics display screen.
[0144] A life-like graphical representation of the user's hand and fingers is not necessary. Often, a more shadow-gram like or cartoon-like two-dimensional model (or representation) of the user's hand and fingers will be all that will be necessary. Often these two-dimensional representations of the user's hand and fingers need not include much, if any, internal detail. Rather, these representations may, for example, look much like a translucent gray or other colored shadow projection of the user's hands and fingers on a surface. Here, the representation of the user's hands and fingers may have reduced sharpness, contrast, and detail, yet retain enough distinguishing contrast from other areas of the display screen to enable the user to accurately place his or her hands and fingers on the appropriate virtual buttons or virtual keyboard that is being shown in the graphical display. More fanciful or artistically inspired hand representations are also discussed later in this specification.
[0145] Figure 12 depicts a simplified exemplary flowchart for generating images of the virtual hand and fingers on the device's graphics display screen, in accordance with one embodiment of the present invention. Many ways to graphically represent the user's hands and fingers, or at least the user's fingers, are possible. In one embodiment, based upon the biomechanical and/or anatomical model of the human hand(s) (1200), and optionally specific data on the location and movement of the user's fingers and hand based on the touchpad data (as well as any additional data from predictive typing software, or hover detection), a three-dimensional virtual model may be constructed in the device's memory that depicts the user's hand(s) and fingers (1202).

[0146] Based upon the 3D model, a two-dimensional projection of the general outlines of the user's hand and fingers may be made upon a mathematical surface that corresponds to the surface of the touchpad (1204). This projection may be in the form of a hand and/or finger outline, or alternatively a virtual hand and finger shadow may be produced. This projection may then be combined with any other data that is being sent to a memory buffer or graphics display buffer for the display screen of the device, and then displayed to the user (1206).
[0147] Thus, in one embodiment, the graphical representation of at least the user's fingers, and often both the user's hand and fingers, on the graphics display screen may be done by using the previous assignment of the data on the location and movement of the user's fingers and/or hand(s) to specific fingers on the biomechanical and/or anatomical model of the human hand(s) to create a three dimensional model of the user's hand(s) and fingers in the computerized device's memory. Next, a two-dimensional projection of this three dimensional model of the user's hand(s) and fingers in memory may be made. Here the two-dimensional projection may be on an imaginary plane that corresponds in both distance and orientation from the model of the user's fingers to the touchpad. Thus if, for example, the real user's finger is ¼" above the touchpad, then the distance between the three dimensional model of the user's finger and the imaginary plane that corresponds in distance and orientation to the touchpad will also be ¼". This two-dimensional projection on the imaginary "touchpad" plane (virtual touchpad) may be used to generate the graphical representation of at least the user's fingers on the graphics display screen, and often the user's fingers and hand(s) as well.
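A minimal sketch of this projection step, assuming the model reports joint positions as (x, y, z) points in touchpad-aligned millimetre coordinates, might look like the following; the point values and screen scaling are illustrative only.

```python
# Orthographic projection of 3-D model points onto the touchpad plane (z = 0),
# followed by scaling into display-screen pixels.  Names are illustrative.

def project_to_touchpad_plane(points_3d):
    """Drop the height component; a fingertip 1/4 inch above the pad projects
    straight down onto the same (x, y) location on the virtual touchpad."""
    return [(x, y) for (x, y, _z) in points_3d]

def to_screen(points_2d, pad_size, screen_size):
    """Scale touchpad-plane coordinates into display-screen pixels."""
    pad_w, pad_h = pad_size
    scr_w, scr_h = screen_size
    return [(x * scr_w / pad_w, y * scr_h / pad_h) for (x, y) in points_2d]

# Example: an index fingertip hovering 6.35 mm (about 1/4") above the pad.
fingertip_3d = [(30.0, 42.0, 6.35)]
outline = to_screen(project_to_touchpad_plane(fingertip_3d),
                    pad_size=(100.0, 60.0), screen_size=(1280, 720))
print(outline)
```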
[0148] Alternatively, in a less computationally intensive scheme, a two dimensional model of the user's hands and fingers may be manipulated to best fit the previously discussed hand and finger position and motion data, and this two dimensional model then used for the graphical representation.
[0149] This two dimensional model may be further user selected according to the user's hand size, and indeed may be calibrated by asking the user to place his or her hand on the touchpad, thus allowing the system to sense the dimensions of the user's hand directly.
[0150] Figure 13 depicts a simplified exemplary biomechanical and/or anatomical model of the human hand, showing the internal skeletal structure with a skin overlay, in accordance with one embodiment of the present invention. This illustration shows the major bones of the hand, with the bones of the index finger and thumb separated in order to allow the joints to be better visualized. The internal skeletal structure of the hand (1300) is depicted, along with an outline of the skin on the left side of the hand (1302). The bones of the fingers include the distal phalanges (1304), the intermediate phalanges (1306), the proximal phalanges (1308) and the metacarpals (1310). The thumb lacks the intermediate phalange.
[0151] Here the various finger joints include the distal inter-phalangeal joint (dip) (1312), the proximal inter-phalangeal joint (pip) (1314), and the metacarpophalangeal joint (mcp) (1316). The thumb lacks the distal inter-phalangeal joint (dip), and instead includes the interphalangeal joint (ip) (1318) as well as the carpometacarpal (cmc) joint (1320). In one embodiment, for higher accuracy, it may be useful to replace the default parameter values of at least the lengths of these various bones with actual user hand parameters. In general, the closer the various default parameters of the biomechanical and/or anatomical model of the human hand are to the actual user hand parameters, the better. In some embodiments, even the range of joint motion may also be experimentally determined, and used to replace one or more joint motion range default parameters.
[0152] Finger identifying algorithms.
[0153] In some embodiments, the biomechanical and/or anatomical model of the human hand used in the embodiments of the present invention for finger identifying algorithms may be based on the following observations. First, the average human hand has four fingers and one thumb. Second, in contrast to the fingers and thumb of the human hand (e.g. Figure 13 bones (1308), (1306), (1304)), which are relatively flexible above the metacarpophalangeal joint (mcp) (1316), the palm of the average human hand is relatively inflexible below the metacarpophalangeal joint (mcp) (1316). Indeed the positions of the various metacarpophalangeal joints (mcp) (1316) tend to be relatively invariant with respect to rotation of the hand. The various metacarpophalangeal joints (mcp) (1316) may hereinafter also be referred to as the "finger roots". Finger roots will be represented by the variable "r". Alternatively, finger roots may be referred to as the junction between the finger and the palm.

[0154] Third, due to the relatively invariant shape of the palm, the orientation of the user's palm and its angle with respect to other hand structures, such as the relative orientation of the fingers (e.g. middle finger (1330)), is relatively constant. In particular, the orientation or position of the various "finger roots" (1316) may define a palm line direction (1322) that will in turn, when the angle of the palm line with respect to the coordinates of the touchpad is known, help to define the location of the various fingers and fingertips.
[0155] Fourth, users may generally desire to manipulate symbols using the area underneath the uppermost bone of the finger or thumb (1304). Here, the touchpad data will include various touchpad touch points, identified in (x, y) coordinates in later figures, which will often but not always correspond to the area underneath the uppermost bone of the user's finger and thumb (1304), hereinafter also referred to as the "finger tips". The touchpad observed location of any given finger or thumb tip will often be referred to as (xi, yi), where x and y are the observed touchpad data, and "i" refers to or is associated with the finger that ultimately produced the touchpad touch data.
[0156] The raw touchpad data does not include such (xi, yi) labels. Instead, the system embodiments may have to take the various incoming touchpad data, attempt to make sense of the data using the underlying biomechanical and/or anatomical model of the human hand, and then generate a virtual hand model that is consistent with both the touchpad data and the underlying biomechanical and/or anatomical hand model.
[0157] It may be simpler to first consider a model of the human hand as it rotates to various positions and touches the touchpad with various fingers, and determine what sort of mathematical transformations are at work in generating the raw touchpad data. Once the above is determined, the process may be worked in reverse to generate a virtual model of the user's hand.
[0158] In one embodiment, it is desired to be able to determine which touch point belongs to which finger when the system detects multiple touch points on the touchpad. Naming conventions are described as follows: thumb = finger 0 = F0; index finger = finger 1 = F1; middle finger = finger 2 = F2; ring finger = finger 3 = F3; and little finger (pinky) = finger 4 = F4. As previously discussed, the finger number may be represented by the variable "i". Thus, when fingers F0 to F4 are present on the touchpad, the problem becomes one of determining the coordinates (xi, yi) of the fingertips of F0 to F4, and then mapping the coordinates to the biomechanical and/or anatomical model of the human hand.
[0159] Neglecting, for the moment, hand location on the touchpad issues, one problem is that users will usually operate the touchpad with the palm line direction of their hands (1322) at an arbitrary angle Θ with respect to the coordinate system of the touchpad. Thus, an early step in making sense of the touchpad data is to determine this angle Θ, and to transform the raw touchpad data by a rotation of angle Θ and see if the raw touchpad data matches up to a sensible biomechanical and/or anatomical model of the human hand. This transformation problem is depicted in Figure 16A through Figure 19.
[0160] In one embodiment, a simplified exemplary palm angle (Θ) rotation transformation may help the system relate raw touchpad data to a standard biomechanical and/or anatomical model of the human hand. If the user touches both the tips of all fingers and thumb and the base or finger root of all fingers and thumb onto the touchpad, then the raw touchpad data would include a series of (xi, yi) values for the finger tips, and a series of (xri, yri) values for the finger roots.
[0161] In one embodiment, if the system can determine how much the user's hand is rotated relative to the coordinate system of the touchpad, i.e. the palm angle Θ, then the process of mapping the raw data into the biomechanical and/or anatomical model of the human hand may be simplified. Thus, a determination of palm angle Θ is useful.
[0162] Figure 18 depicts more exemplary details of the relationship between the hand's palm direction or palm angle and the tips of the user's fingers, in accordance with one embodiment of the present invention. When using touchpads, users will often touch the pad with the fleshy area of the palm underneath their finger roots (1822). If the finger root touch area information is detected by the touchpad, the system may detect the direction of the palm line of the hands (1322) from the finger root touch area. Alternatively, the system may use a relaxed finger position direction, depicted as dashed-dotted line (1825), from touchpad touch point (1810) on F1 to touchpad touch point (1820) on F3, or a relaxed finger position direction from touchpad touch point (1810) on F1 to touchpad touch point (1830) on F4, to approximate the palm line direction (1322) and the adjustment angle σ between the relaxed finger position direction and the palm line direction, e.g., between line (1825) and palm line (1322). The system may then determine the angle Θ between the palm line and the touchpad coordinates, such as the touchpad x-axis.
[0163] Figures 16A - 16B depict how a simplified exemplary palm angle rotation transformation may help the system relate raw touchpad data to a standard biomechanical and/or anatomical model of the human hand, in accordance with one embodiment of the present invention. The process of rotation transforming the raw touchpad data (xi, yi) into palm angle corrected touchpad data (x'i, y'i) may be done using the following coordinate rotating formula (1), in which the column vector (x, y) is multiplied by the rotation matrix whose rows are (cosΘ, -sinΘ) and (sinΘ, cosΘ):

x' = x·cosΘ - y·sinΘ and y' = x·sinΘ + y·cosΘ    (1)
[0164] The system may find the palm angle using the finger root and/or finger touch points as shown in Figures 16A and 18.
[0165] The system may also calculate the finger touch coordinates (xi, yi), where i = 0, 1, 2, 3, 4 for each finger, as well as the palm line rotation angle Θ, and the new coordinates (x'i, y'i). These calculations may be done using the coordinate rotating formula (1) shown above. Figure 16A depicts the situation before the rotation transformation or correction, and Figure 16B depicts the results after the rotation transformation or correction, with the palm line direction being substantially parallel to the touchpad's x-axis. It is understood that the word substantially herein refers to an accuracy sufficient to achieve proper guidance of the virtual finger(s) displayed on the display screen, to the extent that the user may be able to guide the hand to properly strike a virtual key or other control object displayed on the screen, and is not intended to imply any more accuracy than so required. Thus, more accuracy is required for smaller virtual keys or control objects than for larger virtual keys or control objects, but exact anatomical matching to the hand is not required.
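The rotation of formula (1) is straightforward to apply in software. The sketch below, with illustrative coordinates and a hypothetical pair of detected finger-root points, estimates Θ from two roots and then rotates the fingertip data; depending on how the palm angle is measured, the sign of the rotation may need to be flipped to bring the palm line parallel to the x-axis.

```python
import math

def rotate_by_palm_angle(points, theta):
    """Apply formula (1): x' = x*cos(theta) - y*sin(theta),
    y' = x*sin(theta) + y*cos(theta), to each raw touch point."""
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for (x, y) in points]

def palm_angle_from_roots(root_a, root_b):
    """Estimate theta from two detected finger-root touch points lying on
    (or near) the palm line, e.g. the F1 and F4 roots."""
    (xa, ya), (xb, yb) = root_a, root_b
    return math.atan2(yb - ya, xb - xa)

# Example: roots of F1 and F4 reported by the touchpad (illustrative values).
theta = palm_angle_from_roots((20.0, 10.0), (60.0, 30.0))
tips = [(25.0, 45.0), (38.0, 52.0), (50.0, 50.0)]
# Rotating by -theta levels the palm line in this sign convention.
print(rotate_by_palm_angle(tips, -theta))
```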
[0166] In one embodiment, the system may also determine the finger root (xri, yri) location coordinate for one or more of the fingers F1, F2, F3, F4. Then the system may perform the analysis, often based on the assumption that the F1 root coordinate (xr1, yr1) is the most available (i.e. most frequently found in the raw touchpad data), which is often true because the finger F1 root commonly touches the touchpad surface. Alternatively, because the palm does not bend much, the finger F1 root coordinate may be calculated from the other palm touch points, i.e. other finger roots (xri, yri).
[0167] Figure 17 depicts more exemplary details of the relationship between the finger roots (xri, yri), i.e., roughly finger joint region (1316), and the hand's overall palm angle, in accordance with one embodiment of the present invention. Often there will be missing finger root position data. Here various assumptions, based on the anatomical characteristics of the human hand, may be used to fill in the missing data.
[0168] For example, if the root coordinates for finger 1 are available (xr1, yr1), then based on hand anatomy considerations, the position of the finger 2 root is likely to be, or may be calculated to be: xr2 = xr1 + ½(w1 + w2)·cosΘ and yr2 = yr1 - ½(w1 + w2)·sinΘ, where wi is the width of finger i and ½(w1 + w2) = L12 as depicted in Figure 17.
[0169] Figure 19 depicts how simplified exemplary biomechanical and/or anatomical model data pertaining to the width of the fingers, such as L12, may be used to help interpret raw touchpad data, in accordance with one embodiment of the present invention. A palm line vertical direction (1930) may be defined running substantially through touchpad touch point (1820) on finger F2 and substantially perpendicular to palm lines (1322). The intersection of palm lines (1322) and palm line vertical direction (1930) passing through the longitudinal axis of F2 may pass through the finger root for F2 at (xr2, yr2), which may be used as the origin of the coordinate rotation axes X, Y. In the same manner, the system may also calculate the likely finger root coordinates for fingers F3 and F4 (in this simplified approximation, the model may assume that the finger roots are substantially on the same palm line (1322) as per Figures 16A, 16B, 19, Figure 13, and elsewhere). The system may also calculate the new coordinates for any given finger "i" root, assuming that the hand is rotated at palm angle Θ, by also using rotation formula (1). Here, for example, for finger roots i = 1, 2, 3, and 4, the rotation transformed finger root locations may be expressed as:
x'ri = xri·cosΘ - yri·sinΘ and y'ri = xri·sinΘ + yri·cosΘ, for i = 1, 2, 3, 4.
[0170] Figure 20 depicts how, in a more accurate exemplary model, the location of the various finger roots will be displaced to some extent from the palm line (which forms the palm angle) by various amounts δri, in accordance with one embodiment of the present invention. Referring simultaneously to Figure 17 and Figure 20, the rotation transformed positions (x'ri, y'ri) of the various finger roots after rotation by palm angle Θ may be calculated by the following process. First, calculate the rotation transformed finger 1 root position (x'r1, y'r1) from (xr1, yr1) using formula (1). Second, apply the formulas x'r2 = x'r1 + ½(w1 + w2) and y'r2 = y'r1 + δr2.
[0171] Here often, for a still better approximation, the system may assume that any of finger roots F2, F3 and F4 might be displaced somewhat from the palm line (1322) by a small amount, represented by δri as depicted in Figure 20. Note that δri may be either positive or negative.
[0172] Referring simultaneously to Figure 17 and Figure 20, similarly for fingers F3 and F4, the system may use the approximation that the finger roots are usually separated by the width of the fingers to make further approximations or educated guesses such as: x'ri = x'r(i-1) + ½(w(i-1) + wi) and y'ri = y'r(i-1) + δri.
[0173] Similarly, for the thumb (finger F0): x'r0 = x'r1 - ½(w0 + w1) and y'r0 = y'r1 + δr1, where ½(w0 + w1) = L01 as depicted in Figures 19 and 20.
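A small sketch of these root-estimation rules in rotated (palm-aligned) coordinates is shown below; the finger widths and δ offsets are illustrative default values, not measured user parameters.

```python
# Sketch of the root-estimation rules above, in palm-aligned coordinates.
# finger_widths[i] is w_i for fingers F0..F4; delta[i] is the small per-root
# offset from the palm line.  All numeric values are illustrative.

def estimate_roots_from_f1(f1_root, finger_widths, delta):
    """Given the F1 root (x'_r1, y'_r1), estimate roots F2..F4 and the thumb
    root F0 by stepping half-width sums along the palm line."""
    x1, y1 = f1_root
    roots = {1: (x1, y1)}
    # Fingers F2..F4: x'_ri = x'_r(i-1) + (w_(i-1) + w_i)/2, y'_ri = y'_r(i-1) + delta_i
    for i in (2, 3, 4):
        px, py = roots[i - 1]
        roots[i] = (px + 0.5 * (finger_widths[i - 1] + finger_widths[i]),
                    py + delta[i])
    # Thumb F0: x'_r0 = x'_r1 - (w_0 + w_1)/2
    roots[0] = (x1 - 0.5 * (finger_widths[0] + finger_widths[1]), y1 + delta[1])
    return roots

widths = {0: 20.0, 1: 16.0, 2: 17.0, 3: 16.0, 4: 14.0}   # mm, example values
delta = {1: 0.0, 2: 1.0, 3: 0.5, 4: -1.5}
print(estimate_roots_from_f1((40.0, 0.0), widths, delta))
```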
[0174] Alternatively, using the various rotation transformed finger root coordinates (x'ri, y'ri), the system may also perform the inverse transformation using formula (1) to calculate the raw touchpad data root position coordinates (xri, yri) in the original touchpad coordinate system. This latter technique is often especially useful for determining if any of the raw touchpad data might represent the thumb root location (xr0, yr0). The raw thumb root touchpad data is often difficult to obtain because sometimes the thumb root does not touch the surface of the touchpad.
[0175] Using the techniques described above, the system may make sense of the raw touchpad data by sorting the set of rotation transformed fingertip positions {(x'i, y'i)} and finger root positions {(x'ri, y'ri)} according to ascending (or descending) x value, and then attempting to pair the rotation transformed possible fingertip data (x'i, y'i) with the rotation transformed possible finger root data (x'ri, y'ri).
[0176] Missing finger detection algorithms.
[0177] Often the user will have real fingers elevated far enough above the touchpad to produce a "missing finger" problem - that is, the touchpad raw data will lack the coordinates of one or more user fingers, and the system software may have to attempt to deduce the existence and location of these one or more "missing fingers".
[0178] In one embodiment, a unique touchID may be assigned for each continuous touch. Thus when a finger "i" was previously touching the touchpad and was lifted later, one may use the touchpad history data obtained by the system at earlier time points (usually a fraction of a second earlier, i.e. time (t-1)) to determine the missing finger. Such time data may also be used in another alternative approach, to be discussed shortly. For example, at time (t-1) (i.e. the previous history of stored touchpad data in a time indexed stack of such touchpad data), with fingers F0-F4 identified, one has:
{(x0(t-1), y0(t-1), touchID0,(t-1)), ..., (x4(t-1), y4(t-1), touchID4,(t-1))}
[0179] Assume, for example, that currently at time t, the system has a raw set of data for just three touch points from three fingers, such as fingers F0, F3, and F4 (although this example uses fingers F0, F3, and F4, other combinations of fingers Fi could alternatively be used). The raw data would be:
{(xt0, yt0, touchIDt0), (xt1, yt1, touchIDt1), (xt2, yt2, touchIDt2)}

[0180] If one finds, for example, that touchID0,(t-1) = touchIDt0 and touchID3,(t-1) = touchIDt1 and touchID4,(t-1) = touchIDt2, then one may tell from the current data set at time "t" that (xt0, yt0, touchIDt0) belongs to finger F0, (xt1, yt1, touchIDt1) belongs to finger F3, and (xt2, yt2, touchIDt2) belongs to finger F4. Then one may further determine that fingers F1 and F2 are missing, i.e. likely elevated from the touchpad. The positions of various combinations of other fingers may also be analyzed by the same methods.
[0181] However, at the initial starting time t=0, the history data may not be available. Thus, one should determine the missing (e.g. elevated) fingers by one or more alternate methods, such as the methods described below.
[0182] Figure 21 depicts how the simplified exemplary system may attempt to correlate detected fingertip data from some fingers with finger root data from other fingers, determine that some fingertip data is missing, and thus deduce that these fingers are elevated above the touchpad, in accordance with one embodiment of the present invention. The "missing fingers" include fingers F1 and F2, which are depicted with shading. Missing fingers include a finger that might have been elevated too far above the touchpad to register a touch point imprint on the touchpad. To cope with missing fingers, the system may operate as follows. First, from the touchpad data set, the system may calculate the palm angle Θ corrected coordinates for the various fingertips and finger roots. Second, for each fingertip, the system may check if the position of the fingertip is inside the range of likely finger root j positions using a formula such as:
x'rj - wj/2 ≤ x'i ≤ x'rj + wj/2, where wj is the width of finger j.
[0183] If the fingertip "i" is within this range j, then the system will attempt to match the fingertip "i" with root j. The system may, for example, even attempt to match potential fingertip locations from one finger with the potential finger root data from another finger. For example, the system may attempt to match the fingertip data (x'i, y'i) with the finger root position (x'r3, y'r3).

[0184] For the thumb finger F0 root (x'r0, y'r0), and pinky finger (finger F4) root (x'r4, y'r4), the range may be calculated as follows:
x'r0 - length0 ≤ x'0 ≤ x'r0 + w0/2 and x'r4 - w4/2 ≤ x'4 ≤ x'r4 + length4, where length0 and length4 correspond respectively to L0 and L4, and where LB is as depicted in Figure 21. Note that in this example, the system is also incorporating finger length (i.e. the length between the fingertip (x'i, y'i) and the finger root (x'ri, y'ri)) into its biomechanical and/or anatomical model of the human hand.
[0185] In the frequent cases where fingertips may not be successfully matched with corresponding finger roots, for each un-matched finger root, the system may mark that finger as missing (i.e. likely raised above the surface of the touchpad, rather than touching the touchpad). See, for example, fingers F1 and F2 in Figure 21. Here in Figure 21, the shading of fingers F1 and F2 shows that the system has recognized that the fingertips are missing, probably because the fingertips are elevated a sufficient distance above the touchpad.
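A sketch of this range-based matching, using illustrative default root positions, widths, and lengths in palm-aligned coordinates, is shown below; unmatched roots are flagged as likely raised fingers.

```python
# Each finger i has a root x'_ri, a width w_i, and a length length_i; a
# candidate fingertip x' is matched to root i when it falls inside the root's
# x-range.  Parameter values are illustrative defaults, not measured data.

FINGERS = {0: dict(root_x=15.0, width=20.0, length=60.0),
           1: dict(root_x=40.0, width=16.0, length=75.0),
           2: dict(root_x=57.0, width=17.0, length=85.0),
           3: dict(root_x=74.0, width=16.0, length=80.0),
           4: dict(root_x=90.0, width=14.0, length=62.0)}

def match_tip_to_root(tip_x):
    for i, f in FINGERS.items():
        lo = f["root_x"] - f["width"] / 2.0
        hi = f["root_x"] + f["width"] / 2.0
        if i == 0:                       # thumb can reach further in -x
            lo = f["root_x"] - f["length"]
        if i == 4:                       # pinky can reach further in +x
            hi = f["root_x"] + f["length"]
        if lo <= tip_x <= hi:
            return i
    return None

tips_x = [41.0, 88.0]                    # only two fingertips detected
matched = {match_tip_to_root(x) for x in tips_x}
missing = sorted(set(FINGERS) - matched) # remaining fingers treated as raised
print(matched, missing)
```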
[0186] Missing finger move/display algorithms.
[0187] In order to show a virtual image of the moving finger, along with fingers that are touching the touchpad, in some embodiments it will also be useful for the system to maintain a stack of the latest n (where n ≥ 1) sets of finger position history information - i.e. retain a history of the most recent finger positions.
[0188] In one embodiment, when, as will frequently be the case, the finger position data is insufficient, but the missing finger "i" may be identified by using the previous algorithm or other methods, one may approximate the missing finger's position by assuming that, as per a normal biomechanical and/or anatomical hand, the change in x and y position of the missing finger's neighboring fingers (e.g. neighboring changes Δx and Δy) will also pertain to any change in location of the missing finger, as described in the following examples.

[0189] Assume that the current time is time "t", and that the locations of the fingers at earlier times (i.e. within a second or a few fractions of a second) have been saved as frame data, such as the frame at time t-1, in the stack. The system may compute weighted mean values such as:
Δxit = Σj (weightj × (xjt - xj(t-1))) / Σj weightj and

Δyit = Σj (weightj × (yjt - yj(t-1))) / Σj weightj,

where j = [1, n] are the touching fingers.
[0190] Using the above scheme, the current position for the missing finger "i" may be calculated as follows:
xit = xi(t-1) + Δxit and yit = yi(t-1) + Δyit
Note that for the (Δxit, Δyit) calculations, one may also use other mathematical methods such as the arithmetic mean, median, geometric mean, harmonic mean, and so on.
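The weighted-mean update above may be sketched as follows; the neighboring-finger weights shown are illustrative (for example, fingers adjacent to the missing finger could be weighted more heavily).

```python
# `touching` maps each still-touching finger j to its positions at t-1 and t;
# `weights` maps j to its weight.  All values are illustrative.

def estimate_missing_position(prev_missing, touching, weights):
    num_x = sum(weights[j] * (xt - xp) for j, ((xp, yp), (xt, yt)) in touching.items())
    num_y = sum(weights[j] * (yt - yp) for j, ((xp, yp), (xt, yt)) in touching.items())
    den = sum(weights[j] for j in touching)
    dx, dy = num_x / den, num_y / den
    x_prev, y_prev = prev_missing
    return x_prev + dx, y_prev + dy        # x_it = x_i(t-1) + dx_it, etc.

touching = {1: ((25.0, 40.0), (27.0, 41.0)),   # finger j: ((x, y) at t-1, (x, y) at t)
            3: ((55.0, 44.0), (57.5, 44.5))}
weights = {1: 1.0, 3: 1.0}
print(estimate_missing_position((40.0, 48.0), touching, weights))  # missing F2
```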
[0191] Finger to hand matching algorithms.
[0192] Often, particularly for larger devices, the user may operate the device using two hands. When the user is operating the touchpad with two hands at the same time, the system additionally should be able to decide to which hand the touch points belong, in order to show the user's two hands properly in the display. The system may use various algorithms to help with this decision.
[0193] In one embodiment, the system may use the range information on the coordinates after rotating the data by palm angle Θ, as is shown in Figure 21. In this example, all touch points within the following range may be assumed to belong to (i.e. be mapped into) one hand. The criteria here may be:
For x: [x'r0 - length0, x'r4 + length4], and for y: [0, length2] or [0, max(length0 ... length4)].

[0194] The system may also use the touch angle information for touch points and palm line angles to help assign the raw touchpad data to one hand or the other. Here, for example, the system may assume that both hands belong to the same individual, and essentially extend the biomechanical and/or anatomical model of the human hand to also put in some simplified human anatomical constraints regarding the relationships between the angles of one hand and the angles of the other hand.
[0195] Figure 22 depicts how the simplified exemplary system may further assign raw touchpad data to two different hands (left hand (2202) including F0L through F4L, and right hand (2204) including F0R through F4R) of the same user, based on the assumption that the range of possible hand angles for the same user is limited by the user's anatomy, in accordance with one embodiment of the present invention. According to the previously discussed multi-touch protocol, the touch angle of a touch point may also be determined along the long touch side, defined as follows. That is, usually a finger will touch in a roughly oval pattern, with the long axis of the oval, i.e. the long touch side, corresponding to the touch angle α of a touch point. For example, based on human anatomical considerations, the angle α between the touch point directions D4L, D2R and the associated respective palm line vertical directions (2220, 2230) will generally be in the range of [0, 90] degrees. The palm line vertical directions (2220, 2230) are substantially perpendicular to the associated left and right palm lines (1322, 2222), respectively. In this example, palm line vertical direction (2220) may be associated with finger F4L through touch point (2224), and palm line vertical direction (2230) may be associated with finger F2R through touch point (2234).
[0196] Alternatively or additionally, in one embodiment, the system may also partition the touchpad area (for example, split the touchpad area into a left half and a right half) and assign some or all of the touchpad data from the left half to the user's left hand, and assign some or all of the touchpad data from the right half of the touchpad to the user's right hand.
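The simplest of these assignment rules, the left/right partition, can be sketched as follows; the 50/50 split position is an assumption, and a range test in palm-aligned coordinates (as given above) could be substituted.

```python
# Assign touch points to hands by partitioning the touchpad into halves.
# Pad width and touch coordinates are illustrative.

def split_by_half(touches, pad_width):
    left = [t for t in touches if t[0] < pad_width / 2.0]
    right = [t for t in touches if t[0] >= pad_width / 2.0]
    return left, right

touches = [(12, 30), (28, 55), (71, 52), (88, 33)]
print(split_by_half(touches, pad_width=100.0))
```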
[0197] Angle based finger matching algorithms:
[0198] Figure 23 depicts a first simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention. Angle based methods may be used to match raw touchpad data with specific user fingers. These alternative, angle-based, finger matching algorithms may be implemented as follows. First, perform a best fit between the touchpad data and the biomechanical and/or anatomical model of the user's hand, and second, use this best fit biomechanical and/or anatomical model of the user's hand to find a point substantially along a mid-line (2310) of middle finger F2. Here, for example, one may have the mid-line of middle finger F2 pass through the center of the palm, e.g. palm center point (xc, yc), or other point on the palm (any palm center point (xcenter, ycenter) may be used so long as it is inside a region bounded by the five metacarpophalangeal joints (xri, yri)). The coordinates of the palm center point may be calculated based on the finger model and known finger positions.
[0199] Continuing the above algorithm with the second step, find (e.g. calculate) the finger root (metacarpophalangeal joint) coordinates (xr0, yr0) ... (xr4, yr4) and calculate the angle αri of finger root "i" to the palm center: αri = atan2(xri - xcenter, yri - ycenter).
The angle calculated by atan2 has a range within about -π to +π. Third, sort the αr0, αr1 ... αr4 in ascending or descending order.
[0200] Figure 24 depicts a second simplified exemplary example of angle based finger matching algorithms, in accordance with one embodiment of the present invention. Continuing the above algorithm with the fourth step, find all fingertip coordinates (x0, y0), (x1, y1) ... (x4, y4) and calculate the fingertip angle αi to the palm center (xc, yc) = (xcenter, ycenter), where αi = atan2(xi - xcenter, yi - ycenter). Next, sort the αi in the same order as the αri. Then, match the corresponding finger to the associated angle, as per Figure 24. The advantage of this approach is that one does not need to perform coordinate rotation to match the fingers. Instead, the atan2 calculations may be done by computationally faster methods, even by table lookup methods, as needed.
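A compact sketch of this angle-based matching is shown below; the palm-centre point, root coordinates, and fingertip coordinates are illustrative values only.

```python
import math

# Compute atan2 angles of the known finger roots and of the raw fingertip
# touches about a palm-centre point, sort both by angle, and pair them in
# order, so that no coordinate rotation is needed.

def match_fingers_by_angle(center, roots, tips):
    cx, cy = center
    ang = lambda p: math.atan2(p[1] - cy, p[0] - cx)
    roots_sorted = sorted(roots.items(), key=lambda kv: ang(kv[1]))  # (finger, root)
    tips_sorted = sorted(tips, key=ang)
    # The i-th root in angular order corresponds to the i-th fingertip.
    return {finger: tip for (finger, _root), tip in zip(roots_sorted, tips_sorted)}

center = (55.0, -25.0)                                  # a point inside the palm
roots = {0: (15.0, -5.0), 1: (40.0, 0.0), 2: (57.0, 2.0),
         3: (74.0, 0.0), 4: (90.0, -5.0)}               # metacarpophalangeal joints
tips = [(0.0, 25.0), (35.0, 70.0), (56.0, 85.0), (78.0, 75.0), (100.0, 50.0)]
print(match_fingers_by_angle(center, roots, tips))
```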
[0201] Iterative or best fit embodiments.
[0202] In some embodiments, particularly when the raw touchpad data is cluttered or otherwise confusing, the hand and finger analysis software discussed above may operate by an iterative process. For example, the software may make tentative assignments between the raw touchpad data and one possible set of fingertip, finger root, or palm touch points on the previously discussed biomechanical and/or anatomical model of the human hand (here the user's hand), and score the results according to how close the raw touchpad data, either before or after various possible transformations, may fit with a known hand configuration. The software may then explore other possible hand configurations and transformations, and then select or choose the hand configuration and/or transformation (e.g. rotations, translocations, missing fingers, and the like) that produces the highest overall score. The software will then use the highest scoring hand configuration and orientation model for virtual hand display purposes.
[0203] Optional imaging.
[0204] In some embodiments, to improve accuracy (that is, to replace standard human hand biomechanical and/or anatomical model default parameters with actual user calibration parameters), it will be useful to acquire an image of the user's hands, and to employ various image processing and analysis techniques to analyze this image of the user's one or more hands to better estimate the relative lengths of the various bones of the user's hands. Indeed, in the event that the user has lost one or more fingers, the system may then use this information to make corresponding changes in its biomechanical and/or anatomical model of the human hand. In other words, the model may include calibration information associated with an image of at least a portion of the hand of the user.
[0205] Figure 14 depicts how the simplified exemplary user's hand or hands may be photographed by the device's camera or other camera, and this image information may be used to refine the default parameters of the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention. In acquiring such images, often it is useful to have the system provide a standardized background, such as a series of distance markings, grid, graph paper, and the like (1400) in order to better calibrate the image of the hand and correct for image distortions. This standardized background may additionally include various color, shades of gray, and resolution test targets as well. The background may be conveniently provided by, for example, electronically providing one or more background image sheets (e.g. a jpeg, png, pdf or other image file) for printing on the user's printer.
[0206] In one embodiment, the user may put each hand on background (1400), and take a photo of the hand(s) (1402) with either the computerized device's camera or another camera. This image may then be analyzed, preferably by an image analysis program. The background image will help correct for any image distortions caused by different camera angles, and the like. The user hand image analysis may be done onboard the user's handheld computerized device, but it need not be. In an alternative embodiment, the user may upload one or more images of the hand taken by any imaging device to an external image analyzer, such as a remote internet server. In either event, the image analyzer will analyze the user's skin or hand outline appearance (1404), deduce the most probable lengths of one or more bones of the user's hand, such as the user's various finger and thumb bones, and send this data or other data to correct the default biomechanical and/or anatomical model of the user's hand(s) back to the user's computerized device, such as for example during calibration step 906 referenced in Figure 9 above.
[0207] Alternatively, at least with more sophisticated and possibly next-generation touchpads capable of providing position details for a large number of contact points, the user may calibrate the touchpad by firmly pressing a portion or all of the user's hand on the touchpad, and allowing a highly capable touchpad to in turn precisely render the resulting handprint. A computer program may then analyze the touchpad-derived handprint, extract parameters such as finger joint positions, probable finger and hand bone lengths, and the like, and derive the same information as previously discussed for the photographic calibration step above. In other words, the model includes calibration information in accordance with pressing a portion of the hand of the user on the touchpad.
[0208] Alternatives or supplements to the touchpad.
[0209] In an alternative embodiment, information on the user's finger placement may be obtained using optical methods. Thus in an alternative embodiment, the touchpad sensor may be an optical sensor, such as one or more cameras. These camera(s) may keep track of the user's hand and finger positions, and this data may then be fed into the biomechanical and/or anatomical model of the human hand(s) to compute a graphical representation of at least the user's fingers, as described previously.
[0210] Real time video updating.

[0211] In another embodiment, image information may also be used to refine the biomechanical and/or anatomical model of the user's hands in real time while the user is using the touchpad.
[0212] Figure 15 depicts how an exemplary device camera (1500) may be used to obtain a partial image of the user's hand (1506) while using the device's touchpad (1508), and this information also used to update and refine the biomechanical and/or anatomical model of the user's hand, in accordance with one embodiment of the present invention. The rear mounted device camera (1500), which often will have a very limited field of view at close range (1502), may nonetheless be used to obtain a real time video image of a portion or part (1504) of the user's hand (1506) while the user is using a rear mounted touchpad (1508) on the back of the computerized device (1510). At the same time, the touchpad data gives the position of the user's index finger (1512) as a strong touchpad signal, and the position of the user's middle finger (1514) as a weaker touchpad signal.
[0213] Note that although the portion of the hand (1504) that may be directly visualized by video camera (1500) does not include any image information at all pertaining to the position of the user's fingers, the image information (1504) does provide a useful series of further constraints upon the biomechanical and/or anatomical model of the user's hands. Thus, the partial hand image information, in conjunction with the touchpad data (1512), (1514), and optionally with a refined biomechanical and/or anatomical model of this user's hand (if available) obtained in Figure 14 above, may improve the accuracy of the depiction of the user's hand and fingers.
[0214] In some embodiments, for amusement or artistic purposes, the user may not wish to have a fully accurate anatomical model of the user's virtual hand displayed on the screen, but may instead prefer a variant, such as a realistic depiction of a "monster hand" with fingers being replaced by claws, fur, or pads, and the like, or of a skeleton hand that shows the underlying biomechanical and/or anatomical estimation of the user's hand bones as per Figure 13.
[0215] In one embodiment, the system software may also be configured to render the user's fingers and hands as various hand variants when displayed. Generally, these hand variants will still provide realistic information pertaining to the user's hand and finger placement, but will also provide this information as various user artistic options that often may be customized according to user preference.
[0216] Three dimensional multi-touch gesture controls.
[0217] Commonly, touchpad controls for a computerized system have focused on two dimensional finger gesture controls requiring finger contact on the locally two-dimensional touchpad surface, even if that surface as a whole may be curved or otherwise project into the third dimension to some extent. In contrast, the embodiments of the present invention, which may operate using a biomechanical and anatomical model of the human hand, may include a three dimensional gesture component that enables various types of three dimensional multi-touch gesture controls described below. Three dimensional multi-touch gesture controls may be advantageous in applications where the user needs to touch a portion of the touchpad continually, such as for example when the user holds a handheld computerized device including a touchpad on the backside of the device. The three dimensional multi-touch gesture controls may help the computerized system differentiate touches on touchpad control regions intended as control inputs from touchpad touches used to merely hold the device.
[0218] In some embodiments, the three dimensional sensing aspects of the present invention may be used to control virtual keyboard data entry to a computerized system by various "lift and tap", "lift and drag", or "lift and other gesture" type modes for data input. More complex variants can also implement other commands, such as "lift and tap and rotate", e.g. with two fingers, "lift and tap and enlarge", e.g. with two fingers, and so on.
[0219] In one embodiment, the biomechanical and anatomical model of the user's hand may inform the system when one or more user fingers are positioned over the touchpad so as to be above a corresponding control region of the touchpad, such as above a key of a virtual keypad or virtual keyboard, or above a hyperlink, and/or the like, but not yet touching the corresponding control region. Because the model of the hand accurately determines the location of the one or more user fingers even when the user's finger is not touching the surface of the touchpad, the "off-touchpad" finger location may be used for three-dimensional gesture control.

[0220] In one embodiment, the control region of the computerized system may be on a touchpad including an integrated display screen located in substantially the same location. For example, integrated touchpads may include both display screen and touchpad built in layers and accessible from the same surface or side of the computerized device, and thus located substantially in the same location even though the layers may be separated by small dimensions relative to the touchpad surface length or width. In an alternative embodiment, the control region of the computerized system may be on a separate and/or additional touchpad located in a location that is different from the location of the display screen, as previously described in reference to Figure 8.
[0221] In one embodiment, the user moves a finger onto a control region of the touchpad, and this finger is in contact with the touchpad. The computerized system may determine whether the user wants to activate that control region, e.g. press the virtual key or control region to generate an input to the computerized system, using, for example, a "lift and tap" type control scheme described as follows. When the system does not receive inputs according to the "lift and tap" or related control schemes described below, the initial finger touch, even if touching a control region of the touchpad, may not generate unwanted control inputs, enabling the user to continue simply and safely holding the touchpad.
[0222] Figure 26 depicts a simplified exemplary flowchart of a "lift and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. Figure 26 depicts that the "lift and tap" technique includes obtaining (2610) data from a touchpad, the data being associated with the location and movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad. The "lift and tap" technique further includes communicating (2620) the data from the touchpad to the computerized device and analyzing (2630) the data in accordance with a model of a human hand, such as referenced in Figure 9 - Figure 25.

[0223] Figures 27A - 27F depict a series of simplified exemplary display screen shots of the "lift and tap" technique of key entry depicted in Figure 26 being used to type the first two letters of a "Hello World" message on the computerized system, in accordance with embodiments of the present invention. Figures 27A - 27F were obtained as screen shots or grabs, in that respective order, from a few seconds of video showing the display screen of a prototype handheld computerized device while the user proceeded to type at touch-typing speed using the "lift and tap" technique referenced in Figure 26. The system has already assigned the touch data from the touchpad to at least one of the multitude of fingers of the model, computed a graphical representation of the at least one finger of the user in accordance with the model, and now displays the graphical representation of the fingers (F1, F2, F3) and hand (2701) of the user on a display screen (2700) of the computerized system. Note that hand (2701), including fingers (F1, F2, F3), is displayed clearly as a virtual, i.e. computer-generated, hand because the palm includes square edges, the fingers include straight sides, and the joints between the fingers and the palm are not continuous.
[0224] Referring simultaneously to Figure 26, Figure 27A, and Figure 27D, in one embodiment, the system (optionally) determines (2640), in accordance with the touchpad data and the model of the human hand, that at least one user finger (F1, F2, F3) is initially touching, or is in contact with, the region of the touchpad corresponding to a virtual key (2702, 2703, 2705, 2725) or other control region.
[0225] In one embodiment, when the system first detects that a particular user finger initially touches or is in contact with a virtual key or other control region, the system may optionally generate a graphical representation associated with the control region being touched on display screen (2700) of the computerized system. For example, Figures 27A - 27F further depict that the system generates and displays on display screen (2700) a graphical representation of a virtual keyboard including a multitude of virtual keys, including virtual keys (2702, 2703, 2705, 2725), corresponding to control regions on the touchpad.
[0226] In one embodiment, the system may then change the appearance of the graphical representation of the touched virtual key or other control region to show or indicate that the control region is being initially touched, thus providing confirmative feedback to the user. For example, the change of the display image of the touched virtual key may be shown as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like. For example, Figure 27A depicts that virtual keys (2702, 2703, 2705), which are being initially touched by respective user fingers (F1, F2, F3), are temporarily displayed with a slightly larger size and a slightly bolder upper border than the remaining untouched keys.
[0227] Referring simultaneously to Figure 26, Figure 27B, and Figure 27E, in one embodiment, the user may next lift the at least one user finger, e.g. graphically represented by finger F2 in Figure 27B, and finger F3 in Figure 27E, and the system determines whether the now lifted finger (missing from touch contact) is most likely positioned above the same previously touched virtual key or other control region on the touchpad. In some embodiments, the system will use the biomechanical and anatomical model of the user's hand to make the above determination. In other words, the system determines (2650), using the model, that at least one finger of the user, e.g. graphically represented by finger F2 in Figure 27B, and finger F3 in Figure 27E, is positioned above but not touching the control region of the touchpad, e.g. graphically represented by virtual key "H" in Figure 27B, and virtual key "E" in Figure 27E. The model may be used to determine, for example, that although the touchpad may no longer directly sense that the user's finger is in contact with that particular virtual key or other control region, nonetheless the finger is positioned directly above the control region, in accordance with touch data from other regions of the user's hand and/or fingers and the constraints of the model of the human hand.
[0228] It is understood that even if the user initially positions his or her hand and associated fingers in contact with the touchpad such that the at least one finger, e.g. graphically represented by finger F2 in Figure 27B, and finger F3 in Figure 27E, does not initially contact the control region, the system may still properly determine what control region is positioned directly below the at least one finger not in contact with the control region. In other words, step 2640 described earlier may be an optional step in some embodiments, because the model may determine the fingertip locations even with the at least one finger not initially in contact with the touchpad but hovering over the control region, using the constraints of the model of the human hand.

[0229] Still referring simultaneously to Figure 26, Figure 27B, and Figure 27E, in one embodiment, the system may temporarily - for example, for a first time interval between 0.05 and 5 or even 10 seconds - change the appearance of the graphical representation of the virtual key or other control region the at least one finger is hovering over. In other words, when the at least one finger is positioned above but not touching the first control region, the system may display the graphical representation of the control region with a different appearance than the prior appearance of the control region. The difference in the graphical representation may be an enlargement of the represented virtual key or other control region, as shown at (2720) and (2730), or the difference may be another visual change such as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like.
[0230] In another embodiment, which might be useful for handicapped individuals, the system may instead produce or generate a sound signal audible to the user instead of, or in addition to, the changed appearance of the graphical representation of the control region when the at least one finger of the user is positioned above but not touching the control region. Note that in Figure 27B the graphical representation of virtual keys (2702, 2705) remains displayed as in Figure 27A, because user fingers (F1, F3) continue to touch the touchpad in Figure 27B.
[0231] Referring simultaneously to Figure 26, Figure 27C, and Figure 27F, in one embodiment, the user next lowers the at least one finger (e.g. graphically represented by F2 in Figure 27C, and F3 in Figure 27F) back onto the touchpad, touching that region of the touchpad that corresponds to the control region (e.g. graphically represented by virtual key "H" in Figure 27C, and virtual key "E" in Figure 27F). In one embodiment, to prevent false inputs and/or to enable finger re-positioning without command input, the user's finger lowering may optionally be required to happen within a certain time interval or "first period of time" (usually on the order of between 0.01 seconds and 5 or even 10 seconds) after the system optionally changes the appearance of that particular key or other control region at the start of step 2650.
[0232] In one embodiment, the system may then verify, using the biomechanical and anatomical model of the user's hand, that the user's at least one finger has now "struck" or "tapped" the particular virtual key or other control region. In other words, the system determines (2660) that the at least one finger is subsequently touching the control region in accordance with the data and the model. In one embodiment, the system may record, or register, that the appropriate virtual key has been pressed by storing a record of that action in memory. In Figures 27A - 27C the user is inputting a command to the computerized system to type the letter "H". The system recognizes by the lift and tap action of user finger F2 that the user is commanding the system to type the letter "H", and the system generates and displays a corresponding letter "H" (2722) on display screen (2700) to confirm the execution of the command. In Figures 27D - 27F the user is inputting a command to the computerized system to type the letter "E". The system recognizes by the lift and tap action of user finger F3 that the user intends to command the system to type the letter "E", and the system generates and displays a corresponding letter "E" (2732) to confirm the execution of the command.
[0233] In one embodiment, the system may optionally change the displayed appearance of the graphical representation of the struck or tapped virtual key or other control region, often back to either its original appearance (as depicted in Figure 27C and Figure 27F) or an optional different altered appearance (e.g. a "key struck" appearance) to visually confirm that the control region is touched or struck. In one embodiment, the altered appearance may include a visual change such as a change of size, color, shape, slight displacement of the position of the control region image, slight change in display type such as distortion of the control region image, flashing, and/or the like. In one embodiment, the optional different altered appearance may be displayed for a short second period of time or time interval, often in the 0.05 to 5 second range, but may extend longer, such as up to 10 seconds. In one embodiment, alternatively or additionally, the system may also generate an auditory signal that the user's actions have resulted in the pressing of a particular virtual key or other control region.
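The lift-and-tap flow of Figure 26 can be sketched, for a single key, as the small state machine below; the per-frame inputs (whether the modeled finger is over the key, and whether it is touching the pad) and the one second hover timeout are assumptions for illustration.

```python
import time

class LiftAndTapKey:
    """Minimal sketch of the "lift and tap" flow for one virtual key."""

    IDLE, TOUCHING, LIFTED = range(3)

    def __init__(self, key, hover_timeout_s=1.0):
        self.key = key
        self.state = self.IDLE
        self.lift_time = 0.0
        self.hover_timeout_s = hover_timeout_s

    def update(self, finger_over_key, finger_touching):
        """Feed per-frame model output; returns the key on a completed tap."""
        now = time.monotonic()
        if self.state == self.IDLE and finger_over_key and finger_touching:
            self.state = self.TOUCHING                      # step 2640: initial touch
        elif self.state == self.TOUCHING and finger_over_key and not finger_touching:
            self.state, self.lift_time = self.LIFTED, now   # step 2650: lifted, highlight key
        elif self.state == self.LIFTED:
            if finger_over_key and finger_touching:
                self.state = self.IDLE                      # step 2660: tap registered
                return self.key
            if now - self.lift_time > self.hover_timeout_s:
                self.state = self.IDLE                      # timed out; no keystroke
        return None

key_h = LiftAndTapKey("H")
out = None
for over, touching in [(True, True), (True, False), (True, True)]:
    out = key_h.update(over, touching) or out
print(out)   # -> "H"
```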
[0234] Figure 28 depicts a simplified exemplary flowchart of a "lift and drag" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. The above embodiments may be extended to other input techniques, such as "lift and drag". In a "lift and drag" technique, in one embodiment, the control region may include an elongated control region with a length substantially greater than the longitudinal length of the surface region of the user's at least one finger when contacting the touchpad. Such an elongated control region may include a virtual slider (e.g. a virtual linear control region), a virtual rectangular or circular rotation control region (e.g. a virtual control knob), or a virtual expansion-contraction control region, and the like. In another embodiment, the control region may include a multitude of file names, images, icons, and the like, so that the user may drag these file names, images, icons and the like via the drag technique to, for example, execute a move and/or copy command on the corresponding files or virtual objects.
[0235] In one embodiment, after determining that the at least one finger is subsequently touching (2660) the first control region in accordance with the data and the model, and optionally within a certain third time interval (usually on the order of between 0.01 seconds and 5 or even 10 seconds after the user's finger initially contacts the touchpad), the user may move or slide the at least one finger on the elongated control region of the touchpad. The system may then verify, using the biomechanical and anatomical model of the user's hand, that the user is moving or sliding the at least one finger over the elongated virtual key or other control region. In other words, the system determines (2810) that the at least one finger is moving while subsequently touching the first control region, in accordance with the data and the model. The system may then store (2820) in memory a record of the moving or sliding of the at least one finger (e.g. register that, for example, a slider has been moved, and the like), and then optionally change the appearance of the elongated virtual key or other control region, such as file names, images, icons and the like, to visually confirm that the command action was executed (e.g. move a knob on a slider control). In one embodiment, alternatively or additionally, the system may also give an auditory signal that the user's actions have resulted in the actuating of the drag command and associated result.
[0236] In one embodiment, the user may lift two or more fingers. In other words, the system may determine, using the model, that a multitude of fingers of the user are positioned above but not touching the control region of the touchpad. The user may then lower the two or more fingers to the touchpad. In other words, the system may determine that the multitude of fingers are subsequently touching the control region in accordance with the data and the model.
[0237] Then, in one embodiment, the system may determine a motion of a first finger in relation to a motion of a second finger different from the first finger, and assign a command to control the computerized system in accordance with the determined motion. For example, the user may move the two fingers further apart on the touchpad to change the displayed image, e.g. magnify or zoom in on a displayed image, move the two fingers closer together to zoom out, or rotate the fingers around a rotation point intermediate between the two fingers to rotate an image. When the relative motions of the two fingers are assigned to the respective commands, the system may be configured to do the corresponding display screen operations under touchpad control, where the image on the display screen expands, contracts, or rotates according to the rotation direction of the fingers. In one embodiment, the system may not require the use of a virtual key or other control regions. Instead, the system may operate as if the entire screen is a control region that may be subject to zoom-in, zoom-out, and/or rotation controlled as described above.
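A sketch of this two-finger relative-motion classification is shown below; the zoom and rotation thresholds are illustrative assumptions.

```python
import math

# Compare the distance and angle between two fingertips before and after the
# move to decide between zoom-in, zoom-out, and rotation.

def classify_two_finger_gesture(p1_old, p2_old, p1_new, p2_new,
                                scale_thresh=0.1, angle_thresh_rad=0.1):
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    dtheta = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    if scale > 1.0 + scale_thresh:
        return "zoom_in", scale
    if scale < 1.0 - scale_thresh:
        return "zoom_out", scale
    if abs(dtheta) > angle_thresh_rad:
        return "rotate", dtheta
    return "none", 0.0

print(classify_two_finger_gesture((30, 30), (60, 30), (20, 30), (70, 30)))  # zoom_in
print(classify_two_finger_gesture((30, 30), (60, 30), (30, 30), (51, 51)))  # rotate
```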
[0238] Most existing two dimensional multi-touch gestures may be similarly extended or modified into corresponding three-dimensional counterparts that incorporate the finger lift gesture component described above. Examples of existing multi-touch gestures that may be modified for additional finger lift functionality include various Apple OS X gestures, such as, but not limited to: swipe between full-screen apps, two-finger scroll, tap to zoom, pinch to zoom, swipe to navigate, open launchpad, show desktop, look up, app exposé, rotate, three-finger drag, tap to click, secondary click, notification center, and show web browser tabs.
[0239] The embodiments of the present invention may be extended or modified for use with touchpads that are capable of sensing the force exerted by a finger in contact with or touching the touchpad. One example of a force sensing touchpad embodying the present invention may use a touch sensing device commercially available from Synaptics Inc., San Jose, California and commonly known as the ForcePad™ family of products. With a force-sensing touchpad, the touchpad not only may determine a finger touch location on the surface of the touchpad, but also may sense and determine how much force per finger is being applied to the surface of the touchpad. In one embodiment, the dimension of force per finger in the touchpad data may be used by the system instead of or in addition to sensing when a finger of the user is lifted off the surface of the touchpad.
[0240] Figure 29 depicts a simplified exemplary flow chart of a "lift and tap" technique of key entry modified to use force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. Figure 29 includes the same features as Figure 26 with the following exceptions. When the system obtains (2910) data from the touchpad, the data may be additionally associated with the force of a finger and/or hand of the user upon the touchpad.
[0241] Referring to Figures 27A - 27F in one embodiment, the system may display a graphical representation of at least one touch point associated with the force per finger (2743, 2750, 2753, 2765, 2770, 2775) applied upon the surface of the touchpad by the user at the corresponding location on the touchpad where the fingers touch the surface of the touchpad. In one
embodiment, the graphical representation of the touch point may be depicted as a solid circle including a diameter associated with the amount of force per finger applied upon the surface of the touchpad. In alternative embodiments, shapes other than a solid circle may be used and/or other display attributes than size may be associated with the force per finger. In one
embodiment, the force per finger may be associated with at least one of a size, a color, a position, a shape, or a display type depicted on the display screen. For example, the amount of force per finger may be associated with a flashing type display where the rate of flashing may be associated with the amount of force.
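As a non-limiting sketch of the mapping just described, the following Python fragment shows one way the force per finger could be translated into display attributes of the touch point marker; the maximum force, pixel sizes, colors, and flash rates are illustrative assumptions, not values from the disclosure.

    def touch_point_style(force_newtons, max_force=2.0,
                          min_diameter_px=10, max_diameter_px=60):
        """Map the force applied by one finger to display attributes of its
        touch-point marker: diameter, color, and flash rate."""
        # Clamp and normalize the force to the range [0, 1].
        level = max(0.0, min(force_newtons / max_force, 1.0))
        diameter = min_diameter_px + level * (max_diameter_px - min_diameter_px)
        # Shift the marker color from light gray toward red as force increases.
        color = (int(200 + 55 * level), int(200 * (1 - level)), int(200 * (1 - level)))
        # Stronger presses flash faster (flashes per second).
        flash_rate_hz = 1.0 + 4.0 * level
        return {"diameter_px": diameter, "rgb": color, "flash_rate_hz": flash_rate_hz}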
[0242] Referring simultaneously to Figure 27A , Figure 27D, and Figure 29, after analyzing (2630) the data in accordance with a model of a human hand, the system may optionally determine (2940) that at least one finger is initially touching the control region using force within a range "A" in accordance with the data and the model. In one embodiment, force range A may include a range of force per finger above about 0.5 newton or about 50 gram weight equivalent units (on the surface of the Earth), corresponding to when the user touches the surface of the touchpad by pressing firmly. In one embodiment, the system may optionally display the graphical representation of the at least one touch point associated with the force per finger (2705, 2765) corresponding to force range A depicted as a solid circle having a relatively large diameter close to the pitch between adjacent virtual control surfaces, i.e. virtual keyboard keys, and/or close to the width of user fingers (F2, F3), respectively.
[0243] In one embodiment, an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is initially touching the first control region using force range A. In one embodiment, a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is initially touching the first control region using force range A. In other words, the computerized system may generate a haptic feedback response from the computerized system to the user when the at least one finger of the user is touching the first control region using the first force.
[0244] For example, the mechanical response may include a haptic response such as mechanically shaking or vibrating a portion of the computerized system using a mechanical actuator such that the touchpad is shaken by being mechanically coupled to the shaken portion of the computerized system. In one embodiment, the shaken touchpad may provide haptic feedback to the user's fingers in any combination of the visual, i.e. graphical representation, and/or audible feedback to the user indicating the user's action has been registered by the computerized system. In another embodiment, a different portion of the computerized system than the touchpad may be mechanically shaken or vibrated, such as a portion of the computerized system in mechanical contact with the user, e.g. a wearable device, or a device supporting the user such as a chair, seat, backrest, elbow rest and/or the like. Haptic feedback may be useful when audible feedback is undesirable or ineffective, such as in an audibly noisy environment.
[0245] Referring simultaneously to Figure 27B, Figure 27E, and Figure 29, in one
embodiment, the system may determine (2950), using the model, that at least one finger of the user is touching a control region of the touchpad using a force range "B" different than force range A. In one embodiment, a force per finger from force range A may be greater than a force per finger from force range B. In one embodiment, force range B may include a range of force per finger between zero newton and about 0.5 newton. In one embodiment, the system may change the display of the graphical representation of the at least one touch point associated with the force per finger (2750, 2770) to a different size, color, position, shape, or display type. For example, the touch point associated with the force per finger (2750, 2770) may be depicted on the display screen as a smaller solid circle corresponding to a lighter touch of the user's finger than when the finger was initially touching the control region in Figure 27A and Figure 27D.
[0246] In one embodiment, an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is touching the control region using force range B. In one embodiment, a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is touching the control region using force range B. In other words, the computerized system may generate a haptic feedback response from the computerized system to the user when the user's finger is touching the control region using force range B.
[0247] In the example depicted in Figure 27B and Figure 27E, the finger is still in contact with the surface of the touchpad but with a lighter touch than the initial touch. However, the user's lighter touch action may be interpreted by the system in similar fashion as in the earlier embodiments that described the user's finger being lifted completely off the touchpad surface, i.e. force equal to zero newton; the lift in the present embodiment includes lifting the finger merely to reduce the force exerted by the finger without completely lifting the finger off the surface of the touchpad. The lighter force lift technique may take less computational resources, provide faster system speed, and/or better reliability than when the finger is lifted completely off the touchpad surface, because the lighter force touch point location of the user's finger is directly available without having to estimate the position of a finger lifted completely off the touchpad surface.
[0248] Referring simultaneously to Figure 27C, Figure 27F, and Figure 29, in one
embodiment, the system may determine that the at least one finger is subsequently touching the control region using force range A in accordance with the data and the model. In one embodiment, the system may change the display of the graphical representation of the at least one touch point associated with the force per finger (2753, 2775) to a different size, color, position, shape, or display type. For example, the at least one touch point associated with the force per finger (2753, 2775) may be depicted on the display screen as a larger solid circle corresponding once again to a more forceful touch of the user's finger in force range A similar to that when the finger was initially touching the control region in Figure 27A and Figure 27D.
[0249] Figure 30 depicts a simplified exemplary flowchart of a modified "lift and tap" technique of key entry modified to use a third force applied per finger for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. Figure 30 depicts the same features as Figure 29, except Figure 30 depicts that the system may determine that the at least one finger is subsequently touching (3060) the control region using a force range "C" in accordance with the data and the model. The third force applied per finger may be within a range of force per finger from force range C. Force range C may include a range of force per finger that is different than both force range A and force range B. Use of three force ranges A, B, and C may further reduce unintended inputs compared to using just two force ranges. In one embodiment, force range C may include a range of force per finger that is greater than force range A, which in turn may be greater than the range of force per finger from force range B. For example, the computerized system may respond to a touchpad input sequence from a finger of a user that includes a medium force A, followed by a small force B, followed by a large force range C, similar to what the user may use when typing on mechanically actuated keys. It is understood that other combinations of the three force ranges may be used in sequence to provide an input to the computerized system.
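The following Python sketch illustrates how per-finger force readings might be bucketed into force ranges A, B, and C and how a sequence of range transitions could be matched against an actuation pattern such as A, B, A. The 0.5 N and 1.5 N boundaries and the function names are assumptions for illustration only; the disclosure does not fix these values.

    def classify_force_range(force_newtons):
        """Bucket a per-finger force reading into one of the force ranges."""
        if force_newtons <= 0.0:
            return None          # finger not touching
        if force_newtons < 0.5:
            return "B"           # light touch
        if force_newtons < 1.5:
            return "A"           # firm touch
        return "C"               # strong press

    def detect_key_actuation(force_samples, expected=("A", "B", "A")):
        """Return True if the force-range transitions observed for one finger
        contain the expected actuation pattern, e.g. firm, light, firm."""
        transitions = []
        for f in force_samples:
            r = classify_force_range(f)
            if r is not None and (not transitions or transitions[-1] != r):
                transitions.append(r)
        # Look for the expected pattern anywhere in the transition history.
        for i in range(len(transitions) - len(expected) + 1):
            if tuple(transitions[i:i + len(expected)]) == expected:
                return True
        return False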
[0250] In one embodiment, an audible signal may be generated in addition to or instead of the graphical representation of the touch point when the user's finger is subsequently touching the first control region using force range A. In one embodiment, a mechanical response may be generated by the system in addition to or instead of the graphical representation of the touch point when the user's finger is subsequently touching the first control region using force range A. In other words, the computerized system may generate a haptic feedback response from the computerized system to the user when the user's finger is subsequently touching the first control region using force range A.
[0251] The sequence of steps for a user's finger actuating a command area on the touchpad in the embodiments above included the system responding to an optionally stronger force range, followed by a weaker force range, followed by a stronger force range, in that order. It is understood that, in an alternative embodiment, the inverse sequence of force by the user's finger may be used, where the system responds to a weak force range (optionally) applied by the user's finger, followed by a stronger force range, followed by a weaker force range, in that order. In either alternative embodiment, the system may recognize and respond to any sequence of a first force range followed by a second force range that is different from the first force range applied by the user's finger to actuate a command area on the touchpad.

[0252] The embodiments of the present invention may be extended or modified for use with not only force-sensing touchpads that may directly determine the force exerted by a finger in contact with or touching the touchpad, but also with capacitive sensing touchpads. In one embodiment of the present invention, a capacitive sensing touchpad may indirectly determine the force using a contact area included in the data from the touchpad. In contrast, force-sensing touchpads directly determine the force without using a contact area included in the data from the touchpad.
[0253] Figures 31A - 31B respectively depict simplified exemplary side and top views of a portion of the touchpad (3110) using the contact area (3120) resulting from a first force FA, in accordance with one embodiment of the present invention. Figure 31A depicts a user's finger (3130) pressing on the touchpad with a force FA using force range A, in accordance with one embodiment of the present invention as described above. Force FA deforms the soft tissue of the user's finger (3130) between the user's bone and the touchpad surface, which is more rigid than the soft tissue, forming a contact area (3120) at the touch point on the touchpad.
[0254] Figures 32A - 32B respectively depict simplified exemplary side and top views of a portion of the touchpad (3110) using the contact area (3220) resulting from a second force FB, in accordance with one embodiment of the present invention. Figure 32A depicts the user's finger (3130) pressing on the touchpad with a force FB using force range B, which is different than force FA, in accordance with one embodiment of the present invention as described above. In one embodiment, force range B may be less than force range A. Therefore, force FB deforms the soft tissue of the user's finger (3130) between the user's bone and the touchpad surface to a lesser extent than when force FA is applied, forming a contact area (3220) at the touch point on the touchpad that has a smaller area than contact area (3120), as depicted respectively in Figures 32B and 31B.
[0255] The system may use the contact area data from the touchpad to then indirectly calculate or determine the force range applied by the finger and/or hand against the touchpad. The contact area information requires the soft and/or resilient tissue of the hand to be in contact with the touchpad without the touchpad supplying the force data directly, which, for example, precludes the use of a rigid stylus to enter the touchpad data instead of a user's hand. Once the system calculates or determines the finger force range applied to the touchpad, the system may then use the calculated force information in the same embodiments described in reference to Figures 27A - 30.
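A minimal sketch of such an indirect force estimate is given below in Python, assuming the fingertip contact area reported by a capacitive touchpad grows roughly in proportion to the applied force. The linear model and the reference values are illustrative assumptions; a practical system would likely use a per-user calibration curve.

    def estimate_force_from_contact_area(contact_area_mm2,
                                         reference_area_mm2=40.0,
                                         reference_force_n=0.5):
        """Estimate the force a finger applies to a capacitive touchpad from
        the reported contact area, assuming area scales with force."""
        if contact_area_mm2 <= 0:
            return 0.0
        return reference_force_n * (contact_area_mm2 / reference_area_mm2)

    def force_range_from_area(contact_area_mm2):
        """Classify the touch as force range A (firm) or B (light) from area."""
        force = estimate_force_from_contact_area(contact_area_mm2)
        return "A" if force >= 0.5 else "B"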
[0256] Distinguishing between control regions and "holding regions."
[0257] Although users may, for example, use back mounted touchpads to control their various handheld computerized devices, in some situations, users may simply wish to hold their handheld computerized devices in the same region as the back mounted touchpad, which may create false command inputs when the user inadvertently touches a control region but really intends to merely hold the device by touching the touchpad. In these and related situations, according to one embodiment, the user may designate a portion of the touchpad surface area as being reserved for non-control purposes, e.g. "holding" purposes, hereinafter also referred to as a "non-control" region of the touchpad. In other words, the system enables the user to designate or lock out a portion of the touchpad temporarily as a non-control region for holding the handheld computerized device without controlling an input when the user touches the non-control region.
[0258] In one embodiment, the system enables the user to designate some or all of a touchpad as being at least temporarily a non-control region or off limits from a device control perspective by including an actuating button - either real or virtual. Alternatively, in another embodiment, certain user hand gestures, such as a swipe border gesture followed by a swipe "x" gesture within the border, may be assigned and recognized by the system as temporarily turning off touch control within the portion of the touchpad covered by the border and the swiped "x". The user may then safely hold the handheld computerized device or other device by the non-control region of the touchpad. When the user wishes to return to controlling the computerized device using the non-control regions of the touchpad, the user may then, in one embodiment, actuate a corresponding "restore" (real or virtual) button, or implement an appropriate "restore control" gesture or set of gestures designated to execute the restore control command.
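One possible way to implement such a lock-out is sketched below in Python: touches that fall inside any designated rectangle are simply ignored as control inputs until the regions are restored. The class and method names and the rectangular region shape are illustrative assumptions only.

    class HoldingRegionFilter:
        """Track rectangular non-control regions of the touchpad and drop
        touch events that fall inside them."""

        def __init__(self):
            self.regions = []   # list of (x0, y0, x1, y1) rectangles

        def lock_out(self, x0, y0, x1, y1):
            """Designate a rectangle as a holding (non-control) region."""
            self.regions.append((min(x0, x1), min(y0, y1),
                                 max(x0, x1), max(y0, y1)))

        def restore_all(self):
            """Return the whole touchpad surface to control use."""
            self.regions.clear()

        def is_control_touch(self, x, y):
            """True if a touch at (x, y) should be treated as a control input."""
            return not any(x0 <= x <= x1 and y0 <= y <= y1
                           for x0, y0, x1, y1 in self.regions)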
[0259] "Push and lift" or "Enter and lift" key actuation.
[0260] Figure 33 depicts a simplified exemplary flowchart of a "push and lift" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. Figure 33 has the same features as Figure 26 with the following exceptions. Figure 33 depicts that the computerized system first obtains (3310) data from the touchpad. The data may be associated with the location and timed movement of a finger and/or hand of the user and not associated with an image of the finger of the user from an image sensor, when the user operates the computerized system using the touchpad. After analyzing (2630) the data in accordance with a model of a human hand, the computerized system determines that at least one finger is initially touching a first control region on the touchpad in accordance with the data and the model. In other words, the user may initially slide, touch, or tap one or more fingers over a key area or control region on the touchpad. The system may use the biomechanical and anatomical model of the human hand to recognize this sliding, touching, or tapping action by the user's finger on the touchpad.
[0261] In one embodiment, the computerized system may respond by, in turn, changing the visual appearance of the key or control area on the display as described above. Then, the computerized system determines (3350), using the model, that at least one finger of the user is positioned above but not touching a first control region of the touchpad during a predetermined time window. In other words, the key or control region on the touchpad becomes activated or actuated when the system determines, again using the biomechanical and anatomical model of the human hand, that the user has then subsequently lifted their fingers or other portion of their hand from the key or control region of the touchpad.
[0262] Figure 34 depicts a simplified exemplary time-line of the "push and lift" technique depicted in Figure 33, in accordance with one embodiment of the present invention. The computerized system may use a predefined delay time Td and a predefined time window Tw to provide the system a technique to distinguish between deliberate user control actions and random signals that might be caused by the user simply holding the device by the input surface area of the touchpad without intending to send control signals. Referring simultaneously to Figures 33 - 34, the computerized system may be configured so that when the system determines (3340) that the user initially slides, touches, or taps their fingers or other portions of the hand on a key or control region at time T0, there is then only a limited predefined time window, Tw, during which the system may determine (3350) that a subsequent lifting of the user's fingers or other portion of their hand is considered to be a deliberate control signal, so as to store a record of the lifting of the user's fingers in memory as described in the embodiments previously described.
[0263] In one embodiment, time window Tw may initially commence or open at a first delay time Td after the system determines (3340) the initial sliding, touching, or tapping motion of the user's finger or hand over the key or control region on the touchpad at time T0. Once open, time window Tw may then remain open until a time equal to T0 + Td + Tw, when the time window then closes. In other words, the predetermined time window closes at the sum of the first delay time and window duration time Tw. Once time window Tw closes, i.e. after time T0 + Td + Tw, if the system determines a finger of the user is positioned above but not touching the first control region of the touchpad, the computerized system may not consider such a "finger lift" as a deliberate control signal and may not store a record of such a finger lift.
[0264] In one embodiment, the time duration for delay time Td may be between about 0 and 1 second after sliding was first detected at T0, or in other words, predetermined time window Tw commences after determining that the at least one finger is initially touching the first control region, which is at T0. In one embodiment, predetermined time window Tw commences at delay time Td after determining that the at least one finger is initially touching the first control region. In one embodiment, delay time Td includes a range of time equal to or less than 1 second. In one embodiment, the time duration for window duration time Tw may include a range of time between about 0.25 second and 30 seconds.
[0265] In alternative embodiments, other values for delay time Td and window duration time Tw may also be possible so long as Td and Tw are chosen so as to enable the computerized system to differentiate between finger lifts intended as inputs versus finger lifts resulting in unintended inputs, such as, for example, during finger repositioning merely to better grip or support the touchpad. In other words, determining (3350), using the model and predetermined time window Tw, enables the computerized system to differentiate between an intended input versus an unintended input by the at least one finger when the at least one finger is positioned above but not touching the first control region of the touchpad and when predetermined time window Tw is not open, such as before time T0 + Td and/or after time T0 + Td + Tw. The embodiments described in reference to Figures 33 - 34 may be combined in any combination with the embodiments referenced in Figures 1 - 32B described previously.
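The timing test described above reduces to a simple comparison, sketched here in Python; the default Td and Tw values are placeholders chosen from within the ranges mentioned above.

    def lift_is_deliberate(t_touch, t_lift, delay_td=0.2, window_tw=5.0):
        """Return True if a finger lift at time t_lift (seconds) falls inside
        the window that opens Td after the initial touch at t_touch and stays
        open for Tw, i.e. T0 + Td <= t_lift <= T0 + Td + Tw."""
        window_open = t_touch + delay_td
        window_close = t_touch + delay_td + window_tw
        return window_open <= t_lift <= window_close

For example, with the defaults above, a lift half a second after the initial touch would be treated as deliberate, while a lift twenty seconds later would be ignored.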
[0266| "Hover and tap" type key activation.
[0267] Figure 35 depicts a simplified exemplary flowchart of a "hover and tap" technique of key entry for controlling an input from a user to the computerized system, in accordance with one embodiment of the present invention. Figure 35 depicts features similar to Figure 26 with the following exceptions. The computerized system obtains (3510) data from the touchpad. The data may be associated with the location and movement of a multitude of fingers and/or a hand of the user. But the data is not associated with an image of a first finger of the multitude of fingers from an image sensor, when the user operates the computerized system using the touchpad.
[0268] Figures 36A - 36F depict a series of simplified exemplary display screen shots of the "hover and tap" technique of key entry depicted in Figure 35 being used to type and enter the numeral "0" on a prototype computerized system, in accordance with embodiments of the present invention. Referring simultaneously to Figures 35 and 36A, after analyzing (2630) the data in accordance with the biomechanical and anatomical model of the human hand, in one embodiment the computerized system may optionally determine (3540) that first finger Fl is initially touching a control region on the touchpad, in accordance with the data and the model. For example, the system displays a graphical representation of virtual touch-point (3610) of finger Fl touching between virtual keys "9" and "0" (3620, 3630), respectively. In one embodiment, the computerized system may generate a special graphical representation of the control region when first finger Fl is initially touching a control region on the touchpad.
However, in the embodiment depicted in Figure 36A, because touch-point (3610) of finger Fl is in an intermediate position between keys, the keys are still displayed normally. The graphical representation of finger Fl is near both the number keys "9" and "0" on the virtual keyboard, but is not yet hovering over either number key, so neither number key is highlighted.
[0269] Referring simultaneously to Figures 35 and 36B, in one embodiment, the computerized system may determine (3550), using the model and the data from the touchpad, that finger Fl of the user is positioned above but not touching, hereinafter also referred to as "hovering" over, a control region such as virtual key "9" of the touchpad, which in one embodiment may in turn cause the system to display the control region or virtual key "9" in a highlighted fashion as described previously. The biomechanical and anatomical model of the human hand may be used to determine when a user's finger is likely hovering above a key or other control area, which when detected may cause the computerized system to highlight the graphical representation of the displayed control area for virtual key "9" (3625).
[0270] The computerized system may predict (3545) the location of finger Fl in accordance with the analyzed data and the location of at least another finger of the multitude of fingers, such as, for example, finger F2 different than finger Fl, and/or hand (2701). The touch-point data from finger F2 and/or hand (2701), which is touching the touchpad, is used with the biomechanical and anatomical model of the human hand by the computerized system to predict the location of finger Fl even when direct real-time finger Fl touch-point location data is absent because finger Fl is hovering above the touchpad without making touch contact with the touchpad. For example, it is noted that the position of finger F2 fully obscures the display of the numeral "6" on virtual key "6" (3630), while the position of finger F3 only partially obscures the display of the numeral "4" on virtual key "4" (3635). In some embodiments, the prediction (3545) step may precede the step of determining when a finger hovers (3550), while in alternative embodiments the prediction (3545) step may come after the step of determining when a finger hovers (3550).
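The disclosure relies on the biomechanical and anatomical hand model for this prediction; the Python sketch below substitutes a much simpler heuristic for illustration, keeping the hovering finger at its last known offset from the centroid of the fingers that remain in contact. The function name and the offset bookkeeping are assumptions, not part of the disclosed model.

    def predict_hovering_finger(touching_points, last_known_offsets, finger_id):
        """Estimate the (x, y) position of a hovering finger from the fingers
        still touching the touchpad.  last_known_offsets maps each finger id
        to its offset from the centroid of the touching fingers, captured
        while that finger was last in contact."""
        if not touching_points:
            return None
        cx = sum(p[0] for p in touching_points) / len(touching_points)
        cy = sum(p[1] for p in touching_points) / len(touching_points)
        dx, dy = last_known_offsets.get(finger_id, (0.0, 0.0))
        # Assume the hand moves roughly as a unit, so the hovering finger
        # keeps its previous offset relative to the rest of the hand.
        return (cx + dx, cy + dy)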
[0271] Referring to Figure 36C, because the user does not wish to type the number "9", the user continues to move his finger Fl hover location, still above but not touching the touchpad, towards the location of control region key "0". It is noted that fingers F2 and F3 are still touching the touchpad but at slightly different positions than in Figure 36B. Now in Figure 36C the position of finger F2 only partially obscures the display of the numeral "6" on virtual key "6" (3650) because finger F2 has moved slightly upwards with respect to the figure. Similarly, the position of finger F3 has moved slightly upwards to fully obscure the display of the numeral "4" on virtual key "4" (3655). The changed positions of the touch-points of fingers F2 and F3 are used to predict the new real time position of still hovering finger Fl. Thus the computerized system in transition briefly enlarges or highlights the displayed graphical representation of both the previous key "9" (3640) and key "0" (3645).

[0272] Referring to Figure 36D, the user continues to move his finger Fl so that it is now hovering over the desired control area of key "0" on the touchpad, and the computerized system continues to predict the real time location of still hovering finger Fl using, for example, the new real time locations of fingers F2 and F3, which are still touching the touchpad at locations slightly different than those depicted in Figure 36C. Then in one embodiment, the graphical representation of key "0" (3660) is enlarged or otherwise highlighted by the system.
[0273] Referring simultaneously to Figures 35 and 36E, in one embodiment, the computerized system determines (3560) that finger Fl is subsequently touching the control region key "0" on the touchpad in accordance with the data and the model. The user is now actuating the desired key "0" by touching the corresponding "key 0" region of the touchpad. Note that the system is indicating a virtual key "strike" by altering the appearance of the graphical representation as an enlarged or highlighted region of virtual key "0" (3665), and that additionally the system now registers or stores in memory and displays the number "0" (3670) on the top of the screen.
Referring to Figure 36F, the action of tapping or pressing the control region of key "0" is now fully complete and the graphical representation of virtual key "0" (3675) is displayed in its original state, as at the start in Figure 36A.
[0274] The user actuating a key or other control area by hovering and then pressing or tapping a key or control area is, from a human factors standpoint, easy for the user to learn and use because the user need merely tap his finger on a hover-highlighted key or control area in order to then actuate the corresponding virtual key or control area. One advantage of this approach is that the user does not have to do a separate tap and lift motion to inform the system about his interest in actuating a particular key. Rather, when the user's finger hovers above a given virtual key, but does not actually touch it, the system will enlarge or otherwise highlight the key by predicting the location of the user's hovering finger without the finger having to first touch the touchpad at the desired control region location. As a result, the hovering finger can more easily "hover-slide" onto a particular virtual key of interest, as depicted by the user both hovering and sliding between the key "9" and the key "0" locations in Figures 36B - 36D. Thus the user may experience a more natural keyboard typing experience, although the motions and positions of the fingers are otherwise different since the touchpad may be obscured from direct user view. The embodiments described in reference to Figures 35 - 36F may be combined in any combination with the embodiments referenced in Figures 1 - 34 described previously.
[0275] Tracking a position of a portion of the user's hand and/or finger using a virtual keyboard
[0276] In some embodiments, the position and orientation of the virtual keyboard can be set to track the position and orientation of a portion of the user's hand and/or finger as the user operates the touchpad. In this situation, although the virtual keyboard generation software may optionally still perform a certain amount of time averaging of the position and orientation of the portion of the user's hand and fingers, this time averaging may be relatively short, on the average of less than a tenth of a second to a few seconds, or even not at all, in some examples. Making the time averaging either very short or zero enables the system to essentially annotate the image of the moving virtual hand and fingers with corresponding virtual keys displayed in a display screen of the computerized system. In an embodiment, and as will be discussed in detail below, these keys may be assigned to the different fingers of the user in accordance with a standard QWERTY format, or other keyboard formats.
[0277] In some embodiments, when the user moves his or her hand around the touch pad, a portion of the virtual hand and/or the virtual finger may be within immediate striking range of the various virtual keys, thus enabling the user to start typing immediately regardless of the orientation of the user's hand relative to the touchpad or the device display screen.
[0278] Controlling a control region on a display screen of a computerized system
[0279] In accordance with at least some embodiments, the computerized system may assign touch data pertaining to a position of a portion of the hand of the user when the user operates the touchpad to at least one of a multitude of fingers of the model of the human hand and compute a graphical representation of the user's fingers in accordance with the model. In some embodiments, the computerized system may be configured to identify a set of virtual keys to be associated with each of the user's virtual (i.e., graphical representation of) fingers. The assignment of the set of virtual keys to a user's virtual finger may be performed in various ways. In one example, each virtual finger may be associated with a pre-defined set of keys. For example, each virtual finger may be associated with a group of keys that the user is familiar with, such as based on a QWERTY keyboard configuration. Here, for example, the left hand index finger may be associated with the keys 'R', 'F', 'V', 'T', 'G', 'B', which may be the same set of keys that this finger would be expected to strike if the finger were operating a real QWERTY keyboard. Similarly, the right hand middle finger may be associated with the keys 'I', 'K', and ',' and other QWERTY keys that the right hand middle finger would be expected to strike if this finger were operating a real QWERTY keyboard. While the above discussion relates to mapping keys arranged in a QWERTY configuration to the different fingers of the user, it should be appreciated that other finger-key mapping schemes may also be implemented in other embodiments. For example, other methods to assign keys to specific fingers of the user's virtual hand may be used, such as numeric keypad schemes, or alternative non-QWERTY keyboard schemes, such as Dvorak and Colemak, as well as various variants of the QWERTY keyboard technique.
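The finger-to-key assignment described above can be captured in a simple lookup table, sketched here in Python for a standard QWERTY touch-typing layout; the finger names are illustrative labels, and the left ring finger assignment ('W', 'S', 'X') follows conventional touch-typing practice rather than being recited in the text.

    # Illustrative finger-to-key assignment for a QWERTY layout.
    QWERTY_FINGER_KEYS = {
        "left_pinky":   ["Q", "A", "Z"],
        "left_ring":    ["W", "S", "X"],
        "left_middle":  ["E", "D", "C"],
        "left_index":   ["R", "F", "V", "T", "G", "B"],
        "right_index":  ["Y", "H", "N", "U", "J", "M"],
        "right_middle": ["I", "K", ","],
        "right_ring":   ["O", "L", "."],
        "right_pinky":  ["P", ";", "/"],
    }

    def keys_for_finger(finger_name):
        """Return the virtual keys associated with a given finger."""
        return QWERTY_FINGER_KEYS.get(finger_name, [])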
[0280] In some embodiments, the computerized system may be configured to generate a control region comprising the set of virtual keys and display the control region in a first position of the display screen of the computerized system. In an embodiment, the control region may display the set of virtual keys associated with the fingers of the user. When a user's finger that is in contact with the touchpad moves, the computerized system may be configured to update, on the display screen, the position of the set of virtual keys associated with the movement of that particular user's finger on the touchpad. In an alternate embodiment, the control region may also display a graphical representation of the user's fingers and the different sets of virtual keys associated with each finger, on the display screen. In such an embodiment, when the user's finger that is in contact with the touchpad moves, the computerized system may be configured to update the position of the virtual finger, as well as the group of virtual keys associated with the virtual finger in the control region, in accordance with the movement of the user's finger on the touchpad.
[0281] As an example, using a QWERTY keyboard arrangement as discussed above, if the user's right index finger moves on the touchpad, the virtual keys 'U', 'J', 'M' as well as 'Y', 'H', and 'N' associated with the user's right index finger displayed in the control region may also move. In some examples, the user's right index finger may be appropriately positioned in the control region to strike these keys.
[0282] In some examples, resting keys (i.e. keys, such as 'A', 'S', 'D', 'F' (left hand) and 'J', 'K', 'L', and ';' (right hand), that a relaxed user's hand would normally contact on a QWERTY keyboard when not pressing a key) may also move when the fingertip of the user's finger moves. As described herein, 'resting keys' may refer to the set of keys on a QWERTY keyboard that the user's fingers may typically rest on when not typing. Thus, in some embodiments, the resting keys may move in addition to the movement of the set of virtual keys that are associated with this particular finger that would normally be expected to be struck by the user's finger.
[0283] Thus, using the above technique, the computerized system may be configured to position the user's virtual fingers and the corresponding sets of virtual keys associated with the virtual fingers to be located anywhere in the display screen according to the hardware limitations of the touchpad and the display screen.
[0284] In some embodiments, when the user lifts his fingertip from the surface of the touchpad (i.e., the user's finger is positioned above but not touching the touchpad), the keys associated with this finger are both frozen or fixed into position, as well as set to an 'enabled' configuration. In an example, the 'enabled' keys may be displayed by visually highlighting these keys on the display screen. For example, the 'enabled' keys may be highlighted by showing these keys as brighter, or more enlarged, or in a different color, and so on.
[0285] In some embodiments, the touchpad may include pressure sensitive sensors that are capable of sensing the pressure of a touch of a portion of the hand of the user on the touchpad. In such an embodiment, the computerized system may obtain touch data pertaining to the amount of pressure applied by the user when the user operates the touchpad. Based on the obtained touch data, the computerized system may determine whether to enable a virtual key associated with the user's finger by visually highlighting the virtual key. For example, the computerized system may 'enable' a virtual key associated with the user's virtual finger in the control region if the obtained touch data pertaining to the amount of pressure applied by the user indicates that the pressure is within a pre-determined threshold value. In some embodiments, this pre-determined pressure threshold value may be specified by the user of the computerized system.

[0286] In certain embodiments, when the user places the tip of the lifted finger back on to an enabled virtual key of an associated group of enabled virtual keys (i.e., when the user's finger subsequently touches the touchpad), then that particular key is selected (i.e. considered pressed). This feature thus enables a user to move his or her hand freely on the touchpad surface, while still allowing the user to type using conventional typing techniques. Additionally, by freezing or fixing virtual key positions when a user's finger lifts off the touchpad, the user can precisely strike a virtual key according to the standard relative position of that key on a standard keyboard. Thus, the disclosed technique enables the user to leverage off of a user's long standing 'muscle memory' of relative key positions on a standard keyboard.
[0287] Figure 37 depicts a simplified exemplary flowchart of a method 3700 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention. The method includes obtaining, using the computerized system, first data from a touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the touchpad, at 3702. In an example, the first data is not associated with an image of a finger of the user from an image sensor. The method then includes transmitting the first data from the touchpad to the computerized device at 3704. In an embodiment, the touchpad may be located in a location that is different from the location of the display screen. In some examples, at 3706, the method may include analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model. At 3708, the method may include computing a graphical representation of a first finger of the portion of the hand of the user on the touchpad in accordance with the model of the human hand.
[0288] In certain embodiments, at 3710, the method may include identifying a first set of virtual keys to be associated with the first finger. At 3712, the method may include generating a control region comprising the first set of virtual keys. In an embodiment, the control region may include the first set of virtual keys associated with the first finger of the user. In some embodiments, at 3714, the method may include displaying the control region in a first position on the display screen in accordance with the position of the first finger. Additional details of the manner in which the method of Figure 37 may be implemented are discussed in detail in Figures 38 and 39 below.
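A high-level Python sketch of steps 3702 - 3714 follows; it assumes a hand_model object exposing assign_touches() and finger_position() methods (hypothetical names) and reuses the keys_for_finger() helper from the earlier sketch, so it indicates the flow of the method rather than a concrete implementation.

    def control_region_pipeline(touch_events, hand_model):
        """Sketch of steps 3702-3714: assign touchpad data to fingers of the
        hand model, compute where each virtual finger is, and build a control
        region of virtual keys positioned at that finger."""
        # 3702/3704: touch_events is the location data obtained from the
        # touchpad (no camera image is involved).
        # 3706: analyze against the hand model and assign touches to fingers.
        assignments = hand_model.assign_touches(touch_events)
        control_regions = []
        for finger_name, touches in assignments.items():
            # 3708: compute where to draw the virtual finger.
            position = hand_model.finger_position(finger_name, touches)
            # 3710/3712: identify this finger's key set and build its region.
            keys = keys_for_finger(finger_name)
            # 3714: place the region at the finger's on-screen position.
            control_regions.append({"finger": finger_name,
                                    "keys": keys,
                                    "screen_position": position})
        return control_regions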
[0289] Figures 38A-38F depict a series of simplified exemplary illustrations, 3800, 3802, 3804, 3806, 3807 and 3810, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with embodiments of the present invention. In one embodiment, the control region displays different sets of keys associated with different fingers of the user's hand as the user proceeds to type at touch-typing speed using, for example, the 'lift and tap' technique discussed in Figure 26. It may be noted that at this point, the system has already assigned the touch data from the touchpad to at least one of the multitude of fingers of the model and has computed a graphical representation of the at least one finger of the user, in accordance with the model.
[0290] Referring now to Figure 38A, in one embodiment, the system determines (in accordance with the touchpad data and the model of the human hand discussed in Figure 37) that the four fingers (Fl, F2, F3, F4) of the user's right hand 3804 are contacting the touchpad on the back side of the computerized device display, but neither the fingers nor the thumb of the user's left hand 3806, nor the thumb of the user's right hand 3804, are yet contacting the touchpad. Based on the obtained touchpad data, the system may then identify different sets of keys to be associated with each of the virtual fingers (Fl, F2, F3, F4) of the user's hand. In an embodiment, the system may then generate a control region comprising a set of virtual keys to be associated with a finger (Fl, F2, F3, F4) of the user's hand. In certain embodiments, the system may then display the control region in a first position in the display screen 3802. In the example depicted in Figure 38A, the virtual keys ('Y', 'H', 'N') and ('U', 'J', 'M') associated with the finger Fl may comprise a first control region. Similarly, the virtual keys ('I', 'K', ',') associated with the finger F2 may comprise a second control region, the virtual keys ('O', 'L', '.') associated with the finger F3 may comprise a third control region, and so on. In one example, the letters corresponding to the sets of keys associated with the fingers (Fl, F2, F3, F4) are shown as being positioned near or close to or on top of the user's fingers.
[0291] Referring now to Figure 38B, the system (in accordance with the touchpad data and the model of the human hand) may determine that the four fingers of the user's left hand 3806 are also now touching the touchpad on the back side of the computerized device, and the system therefore generates and displays a control region in the display screen 3802 comprising the virtual representation of the four fingers of the user's left hand and the sets of different keys associated with each of the four fingers of the user's left hand, on the display screen 3802. In the example depicted in Figure 38B, the sets of virtual keys associated with each hand may be separated and rotated. The example illustrated in Figure 38B further indicates that the user has raised the little finger of the right hand 3804 above the touchpad, causing the letters ('P', ';' and '/') normally associated with the right little finger to become highlighted.
[0292] Figure 38C illustrates all four fingers of each hand of the user touching the touchpad. In this situation, the system may not highlight any of the keys associated with the user's fingers and may depict them in their normal sizes, thus allowing the user to freely move his hands while operating the touchpad. In some situations, the system may not highlight a particular key (e.g., 'P') associated with the user's finger if the system determines that the user's finger is kept raised (without subsequently touching the touchpad) for a period of time that is greater than a pre-determined time-out period (e.g., 0-30 seconds) or if the user's finger touches outside of the highlighted area associated with the particular virtual key.
[0293] In Figure 38D, the user has raised the little finger of the left hand 3806 above the keyboard, causing the keys 'Q', 'A' and 'Z' associated with this finger to become activated. In the illustrated example, the letters 'Q', 'A', 'Z' are shown as activated by highlighting and/or enlarging these letters. In an embodiment, the letters 'Q', 'A', 'Z' may be fixed (i.e., frozen) into position so that the user can precisely strike any one of these keys according to the standard relative positions of those keys on the display screen.
[0294] In Figure 38E, the user has just pressed the location on the touchpad that corresponds to the location of the key 'A' on the display screen with the tip of his left little finger. In an embodiment, the system may then be configured to depict the newly entered letter 'A' at a position (e.g., the top) 3809 of the computerized device screen, as well as start to shrink the enlarged highlighted letters ('Q', 'A', 'Z') since the tip of the left little finger is now in contact with the touchpad again.

[0295] In Figure 38F, since the tip of the left little finger is again in contact with the surface of the touchpad, the system no longer highlights the keys 'Q', 'A' and 'Z', and depicts them in their normal sizes. In certain embodiments, the system may be configured to store the letter 'A' just typed by the user in system memory, as well as show the letter as being displayed (e.g., 3809) in the display screen 3802.
[0296] Figures 39A-39F depict a series of simplified exemplary illustrations, 3900, 3902, 3904, 3906, 3908 and 3910, of controlling a control region displayed in a display screen of a handheld computerized device by a user operating a touchpad, in accordance with another embodiment of the present invention. In the illustrated embodiment, the touchpad is detached from its normal position on the back of the handheld computerized device and positioned in front of the handheld computerized device. It is to be appreciated that by placing the touchpad in front of the device, a user may simultaneously observe the position of his hands and fingers, as well as the corresponding motions of his virtual hand and fingers and the action of the pressed virtual keys on the display screen of the handheld computerized device.
[0297] Figure 39A is an exemplary illustration of a user detaching the touchpad 3901 from its normal location on the back of the computerized device to reposition the touchpad to the front of the computerized device. As discussed above, by re-positioning the touchpad to the front of the device, the user may simultaneously observe both the actions of the user's real fingers on the touchpad as well as the corresponding actions of the user's virtual hand in the display screen.
[0298] Figure 39B is an exemplary illustration of a user touching the detached touchpad 3901 near the bottom of the touchpad using the four fingers of the user's right hand. In some examples, the user's thumb may not actually touch the touchpad. However, the system, based on the model of the user's hand, may determine the position of the user's thumb and display the user's virtual thumb in a corresponding location of the display screen. In an embodiment, the system may be configured to display the user's virtual thumb in an alternate color on the display screen. The alternate color may enable the user to recognize that this part of the user's hand is not touching the touchpad. Thus, in some embodiments, the system may be configured to display a more realistic virtual hand of the user by displaying portions of the user's hand that are not touching the touchpad. In addition, by displaying portions of the user's hand that are not touching the touchpad on the display screen, the user's left hand may be visually distinguished from the user's right hand.
[0299] In some embodiments, the system may be configured to assign a set of virtual keys to be associated with the user's virtual thumb. In this implementation, the user's virtual thumb and its associated virtual keys may also move in accordance with the user's thumb when the user's thumb touches and moves around the touchpad. Figure 39B also displays a control region in a first position in the display screen 3903. In the example depicted in Figure 39B, the keys ('Y', 'H', 'N') associated with the finger Fl may comprise a first control region. Similarly, the keys ('U', 'J', 'M') associated with the finger F2 may comprise a second control region, the keys ('I', 'K', ',') associated with the finger F3 may comprise a third control region, and so on. In one example, the letters corresponding to the sets of keys associated with the fingers (Fl, F2, F3, F4) are shown as being positioned near or close to or on top of the user's fingers.
[0300] Figure 39C is an exemplary illustration of the user moving his left hand to a different position of the touchpad. In an embodiment, the system may be configured to obtain touchpad data that indicates a position (e.g., a location and/or movement) of a portion of the user's hand and/or the user's fingers on the touchpad. Based on the obtained data, in one embodiment, the system may then be configured to determine an angular position of the portion of the user's hand and/or the user's fingers on the touchpad. The system may then be configured to reposition the control region comprising the user's virtual fingers and its associated virtual keys in accordance with the angular position of the portion of the user's hand and/or the user's fingers on the touchpad.
[0301] Figure 39D is an exemplary illustration of a user not yet raising his left middle finger 3907 above the touchpad. Since the system has not yet received inputs according to the 'lift and tap' technique discussed above, the system has not yet responded by highlighting the virtual keys 'E', 'D', 'C' associated with this finger.
[0302] Figure 39E is an exemplary illustration of the system highlighting (e.g., by enlarging) the keys 'E', 'D', 'C' normally associated with the user's left middle finger on the QWERTY keyboard when the system detects that the user has raised his left middle finger 3907 above the touchpad. It may be observed that the system has also locked in (i.e., frozen or fixed) the position of these keys so that they now do not move (on the display screen) when the user moves this finger or a portion of his hand on the touchpad. Figure 39E also indicates that the user is in the process of starting to bring the tip of his left middle finger 3907 back onto the touchpad in the touchpad position corresponding to the letter 'E'.
[0303] In Figure 39F, since the tip of the left middle finger is again in contact with the surface of the touchpad, the system no longer highlights the keys 'E', 'D' and 'C', and depicts them in their normal sizes. In certain embodiments, the system may be configured to register a particular key (e.g., 'E') struck by the user and store the letter 'E' in system memory, as well as display the letter 'E' in a position 3907 in the display screen 3903.
DETECTION OF USER GESTURES
[0304] In accordance with at least some embodiments, the computerized system (e.g., the handheld computerized device 100) may be configured to detect an interaction of a portion of the hand of the user when the user operates the touchpad and/or the display screen of the
computerized system. The computerized system may then be configured to cause a property of an object displayed on the display screen to be controlled in accordance with the interaction.
[0305] In some examples, the object may correspond to a multipage application (e.g., an electronic book), a page oriented application (e.g., a word processing application) or the like, displayed on the display screen of the computerized system. For instance, a user may wish to interact with an electronic book displayed on the display screen by turning a page of the book. In an embodiment, the computerized system may be configured to detect a finger swipe of the user's finger on the touchpad as an action indicative of the user's desire to turn a page of the electronic book. In certain embodiments, the computerized system may then be configured to change the page number of the electronic book in response to the finger swipe.
[0306] In some examples, the computerized system may be configured to detect a finger swipe of a plurality of fingers of the user on the touchpad when the user interacts with the multipage application (e.g., an electronic book) displayed on the display screen. The computerized system may then be configured to change the page number by a pre-determined number of pages in response to the finger swipe. For instance, a single finger swipe may cause the computerized system to change the page number of the multipage application by a single page, a two finger swipe may cause the computerized system to change the page number of the multipage application by two pages, a three finger swipe may cause the computerized system to change the page number of the multipage application by three pages, and the like.
[0307] In some embodiments, the computerized system may be configured to determine the pre-determined number of pages to be changed as a function of the number of individual fingers used in the finger swipe. As an example, the pre-determined number of pages to be changed may be determined as a function of two raised to the power of the number of individual fingers used in the finger swipe, minus 1. As an example, the pre-determined number of pages to be changed may be represented by the function 2^(fingernum-1), where fingernum represents the number of fingers used in the finger swipe. Using such a function, a single finger swipe may translate to 2^0 to change the page number by a single page, a two finger swipe may translate to 2^1 to change the page number by two pages, a three finger swipe may translate to 2^2 to change the page number by four pages, a four finger swipe may translate to 2^3 to change the page number by eight pages, and so on. In other embodiments, other base numbers may also be used, such as a base 10 function, for example, when scrolling through extremely large documents. For instance, using a base 10 function, a single finger swipe may translate to 10^0 to change the page number by a single page, a two finger swipe may translate to 10^1 to change the page number by 10 pages, a three finger swipe may translate to 10^2 to change the page number by 100 pages, a four finger swipe may translate to 10^3 to change the page number by 1000 pages, and so on.
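The page-count rule just described reduces to a one-line power function, sketched here in Python together with a helper that applies the swipe direction; the function names are illustrative only.

    def pages_to_change(finger_count, base=2):
        """Pages to advance for a swipe made with finger_count fingers:
        base ** (finger_count - 1).  base=2 gives 1, 2, 4, 8 pages for
        1-4 fingers; base=10 gives 1, 10, 100, 1000 pages."""
        if finger_count < 1:
            return 0
        return base ** (finger_count - 1)

    def new_page_number(current_page, finger_count, direction, base=2):
        """Apply a swipe; direction is +1 for a forward (right) swipe and
        -1 for a backward (left) swipe."""
        return max(1, current_page + direction * pages_to_change(finger_count, base))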
[0308] In certain embodiments, the computerized system may be configured to detect a velocity of the finger swipe of the user on the touchpad. The computerized system may then be configured to change the page number by a pre-determined number of pages in response to the velocity. For instance, the computerized system may be configured to change the page number by a greater amount (e.g., two pages) when it detects a relatively fast finger swipe and change the page number by a smaller amount (e.g., one page) when it detects a relatively slower finger swipe.
[0309] In certain embodiments, the computerized system may be configured to determine a first direction of the finger swipe and increment the page number by a pre-determined number of pages in response to the first direction. For instance, the computerized system may be configured to increment the page number by a pre-determined number of pages when the computerized system detects a finger swipe in the 'right' or 'forward' direction. Similarly, the computerized system may be configured to decrement the page number by a pre-determined number of pages when the computerized system detects a finger swipe in the 'left' or 'backward' direction.
[0310] Figure 40 depicts simplified exemplary illustrations, 4000 and 4002, that indicate the manner in which a computerized system (e.g., the handheld computerized device 100) may interpret a single finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with an embodiment of the present invention. In the illustrated embodiment, the handheld computerized device 100 may be configured to detect a single finger swipe of the user's finger on the touchpad 200 to control a property (e.g., to change a page number of a virtual page) of an object (e.g., an electronic book) displayed on the display screen 102. In the example shown in 4000, the user's thumb 4004 is shown resting on the edge of the device 100 or on the display screen 102 while the user's other fingers are shown located behind the device 100 and resting on the device's rear touchpad 200 (not shown in FIG. 40). The display screen 102 shows a first page, e.g., page 1 from an electronic book. As further illustrated in the example shown in 4000, the user interacts with the electronic book by extending a finger, 4006, on the touchpad 200 to perform a single finger swipe to turn page 1 of the electronic book. The handheld computerized device 100 detects the single finger swipe as a command to change the page number to page 2 of the book. The device 100 then displays page 2 to the user as shown in the illustrated example, 4002.
[0311] Figure 41 depicts simplified exemplary illustrations, 4100 and 4102, that indicate the manner in which a handheld computerized device may interpret a multiple finger swipe from a user operating a touchpad on the back side of the computerized device, in accordance with another embodiment of the present invention. In an embodiment, the handheld computerized device 100 may be configured to detect a multiple finger swipe of a plurality of the user's fingers on the touchpad 200 to control a property (e.g., to change the page number by a pre-determined number of pages) of an object (e.g., an electronic book) displayed on the display screen 102. In the example shown in 4100, the user's thumb 4104 is shown resting on the edge of the device 100 or on the display screen 102 while the user's other fingers are shown located behind the device 100, and resting on the device's rear touchpad 200 (not shown in FIG. 41). The display screen 102 shows a first page, e.g., page 1 from the electronic book. As further illustrated in the example shown in 4100, the user interacts with the book by extending two fingers, 4106 and 4108, on the touchpad 200 to perform a two finger swipe across the touchpad. The handheld computerized device 100 detects the two finger swipe and interprets the two finger swipe as a command to change the page number from page 1 to page 3 of the book. The device 100 then displays page 3 to the user as shown in the illustrated example, 4102.
[0312] In certain embodiments, the computerized system may be configured to detect an interaction of a first finger of the user on a first touchpad of the computerized system with a second finger of the user. In an embodiment, the second finger may be located on a second touchpad of the computerized system. The second touchpad may be located in a location that is different from the first touchpad. In other embodiments, the second finger may also be located on the display screen. In some embodiments, the first touchpad, the second touchpad and the display screen may all be located in different locations and need not be physically connected to each other. For instance, the display screen may be in nearly any location, such as on a regular monitor, TV screen, projector screen, or on a virtual heads-up eyeglass display worn by the user (e.g., a device similar to Google Glass).
[0313] In certain embodiments, the computerized system may be configured to identify a first position of the first finger on the first touchpad and identify a second position of the second finger on the second touchpad. The computerized system may then be configured to detect a selection of an object by the user. For instance, the object may correspond to a finger controlled device such as a watch dial, a camera, a robotic arm, a two-dimensional (2-D) object, a three-dimensional (3-D) object or other device displayed on the display screen. In some examples, the user may select the object displayed on the display screen using the first finger. In other examples, the user may select the object using the second finger on the second touchpad. In certain examples, the user may also select the object using both the first finger on the first touchpad as well as the second finger on the second touchpad.
[0314] In some embodiments, the computerized system may then be configured to detect a movement of the first position of the first finger relative to the second position of the second finger and cause a property of an object displayed on the display screen to be controlled and/or altered in response to the detected movement. In an alternate embodiment, the computerized system may also be configured to detect a movement of the second position of the second finger on the second touchpad relative to the first position of the first finger on the first touchpad and cause a property of an object displayed on the display screen to be controlled and/or altered in response to the detected movement. For instance, the computerized system may be configured to rotate the displayed object about an axis of rotation, change a display characteristic (e.g., color) of the object, alter the size (e.g., enlarge or diminish) of the object, move the object and the like, based on the detected movement.
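A minimal sketch of this relative-movement interpretation is given below; the shared coordinate space, the gain factor, and the rotate_about_axis call are illustrative assumptions and not part of the disclosure:

```python
# Illustrative sketch: deriving a rotation command from the relative motion
# of one contact on the rear touchpad and one on the display screen.

def rotation_from_contacts(p1_old, p1_new, p2_old, p2_new, gain=0.5):
    """Return a rotation angle (degrees) about the object's axis, taken
    from how far the two contacts moved in opposite horizontal directions.

    Each argument is an (x, y) tuple in a shared coordinate space; mapping
    the rear touchpad into that space is omitted from this sketch.
    """
    dx1 = p1_new[0] - p1_old[0]
    dx2 = p2_new[0] - p2_old[0]
    # Opposite horizontal motion (one finger left, the other right) reads
    # as a twist; same-direction motion would instead be a drag.
    twist = dx1 - dx2
    return gain * twist

angle = rotation_from_contacts((10, 40), (30, 40), (80, 40), (60, 40))
# displayed_object.rotate_about_axis(angle)  # hypothetical rendering call
```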
[0315] Figure 42 depicts exemplary illustrations, 4200, 4202 and 4204, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touchpads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention. In an embodiment, the user may interact with an object (e.g., 4206) displayed on the display screen 102 by placing a first finger 4208 on the first touchpad (e.g., 200) located on the back side of the device 100 and a second finger 4210 on the display screen 102 (e.g., the front touch screen of the device 100). The user may then move the first finger 4208 relative to the second finger 4210 to alter a desired property (e.g., an axis of rotation, a color, a size of the object and the like). For example, the user may move the first finger 4208 relative to the second finger 4210 to rotate the object about an axis of rotation, change the color of the object, enlarge or diminish the object, control an angle at which the object is displayed and the like. In response to the detected movement, the computerized system may then be configured to cause a particular property (e.g., an axis of rotation, a color, a size of the object and the like) of the object to be controlled and/or altered as discussed below.
[0316] For instance, in the exemplary illustration shown in 4200, the user's first finger 4208 may be located on the touchpad 200 and the user's second finger may be located on the display screen 102. In this example, the user may move the first finger 4208 in a first direction (e.g., right) and the second finger 4210 in a second direction (e.g., left). In this case, the computerized system may interpret the movement 4212 as a twist about the axis (not shown) of the object and cause the object to rotate about its axis.
[0317] In the exemplary illustration shown in 4202, the user's first finger 4208 and the user's second finger 4210 are located on different touchpads or touch surfaces. For example, the user's first finger may be located on a first touchpad, 4216, and the user's second finger may be located on a second touchpad, 4218, located in a different location from the first touchpad 4216. In other examples, the user's first finger may be located on the second touchpad 4218 and the user's second finger may be located on the first touchpad 4216. In the example shown in 4202, the user may move the first finger 4208 in a downward direction, and the second finger 4210 in an upward direction. In response to the detected movement, the computerized system may cause the object to rotate about its axis, 4220.
[0318] In the exemplary illustration shown in 4204, the user's first finger 4208 is located on the touchpad 200 and the user's second finger 4210 is located on the display screen (e.g., 102). In this instance, when the user moves the first finger 4208 in one direction (e.g., left) and the second finger 4210 in another direction (e.g., right), the computerized system may interpret this movement as a twist around the object's axis 4222 and cause the object to rotate about its axis, 4222.
[0319] While the above examples shown in 4200, 4202 and 4204 illustrate a rotation of the object about an axis of the object in response to a detected movement and/or interaction between a first finger and a second finger of the user, it should be appreciated that the device 100 may be configured to cause various other properties of the displayed object to be controlled in response to a detected movement, in other embodiments. For example, the device 100 may be configured to alter a display characteristic (e.g., color) of the object in response to the detected movement. In other examples, the device 100 may be configured to alter the size (e.g., enlarge or diminish) of the object, move the object and the like in response to the detected movement.
[0320] In certain embodiments, the computerized system may be configured to identify a first position of the first finger of the user on the first touchpad and identify a second position of the second finger on the second touchpad. The computerized system may then be configured to detect a selection of a point of rotation of an object displayed on the display screen. For instance, the object may correspond to a virtual joystick, a virtual hand or mechanical claw displayed on the display screen. In some examples, the user may select the object displayed on the display screen using the first finger. In other examples, the user may also select the object using the second finger on the second touchpad that is located at a different location from the first touchpad. In certain examples, the user may also select the object using both the first finger on the first touchpad as well as the second finger on the second touchpad.
[0321] The computerized system may then be configured to detect a movement of the second finger on the second touchpad relative to the first position of the first finger on the first touchpad. In an alternate embodiment, the computerized system may also be configured to detect a movement of the first position of the first finger on the first touchpad relative to the second position of the second finger on the second touchpad. In certain embodiments, the computerized system may then be configured to cause a property of the object to be controlled based on the detected movement. For instance, the computerized system may cause the object to move around the stationary point of rotation based on the detected movement. In other examples, the computerized system may push the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
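A minimal sketch of such a joystick mapping follows, assuming a pixel coordinate space, an arbitrary full-deflection radius of 100 pixels, and a hypothetical push_handle call (none of which appear in the disclosure):

```python
# Illustrative sketch: driving a virtual joystick handle from the movement
# of the second finger relative to the stationary first finger (the anchor).

import math

def joystick_command(anchor, finger):
    """Map the vector from the stationary finger (anchor) to the moving
    finger into a joystick direction (radians) and magnitude (0..1)."""
    dx = finger[0] - anchor[0]
    dy = finger[1] - anchor[1]
    direction = math.atan2(dy, dx)
    magnitude = min(1.0, math.hypot(dx, dy) / 100.0)  # 100 px = full deflection (assumed)
    return direction, magnitude

direction, magnitude = joystick_command(anchor=(200, 300), finger=(240, 260))
# virtual_joystick.push_handle(direction, magnitude)  # hypothetical call
```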
[0322] Figure 43 depicts exemplary illustrations, 4300 and 4302, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touchpads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention. In an embodiment, the user may interact with an object (e.g., virtual joystick 4312) displayed on the display screen 102 by placing a first finger 4304 in a first position 4308 on the touchpad 200 located on the back side of the device 100 and a second finger 4306 in a second position 4310 on the display screen 102 of the device 100. In an embodiment, the user may hold the first finger 4304 stationary, while moving the second finger 4306 relative to the first finger to alter a desired property (e.g., handle 4314) of the displayed object (i.e., virtual joystick 4312). However, in an alternate embodiment, the user may move the first finger relative to the second finger while holding the second finger stationary to alter the desired property (e.g., handle 4314) of the displayed object (i.e., virtual joystick 4312). For instance, the user may move the moving finger (i.e., the first finger or the second finger) in an upward, downward, left, right or circular direction 4316 around the first position or the second position to alter the desired property (e.g., point of rotation 4316 of the handle 4314) of the displayed object (i.e., virtual joystick 4312). In certain embodiments, the computerized system may be configured to push the handle 4314 of the virtual joystick 4312 in the direction and magnitude as defined by the direction and magnitude of the movement (e.g., 4316) of the first finger in relation to the second finger.
[0323] In the exemplary illustration shown in 4302, the computerized system may be configured to select a stationary point of rotation (i.e., center 4316) of the virtual joystick 4312 and its handle 4314 based on a second position of the user's second finger 4306 on the display screen 102. The computerized device may further be configured to interpret the movement of the user's first finger 4304 to issue a command to the virtual joystick 4312 to control the operation of the virtual joystick (e.g., to move the virtual joystick's handle 4314 around the joystick handle's stationary point of rotation 4316).
[0324] In accordance with at least some embodiments, the computerized system may be configured to identify a first contact position of the first finger of the user and detect a selection of an object displayed on the display screen using the first finger. In some examples, the first finger may be located on the touchpad 200 located on the back of the device 100. In some embodiments, the computerized system may then be configured to detect a change in a characteristic of the first contact position and cause at least one property of the object to be controlled based on the change in the characteristic of the first contact position. In some examples, the characteristic of the first contact position may include an area, size or shape of the contact position. As an example, the computerized system may be configured to detect a movement of the first finger away from the first contact position, a change in the angle of the first finger in the first contact position, or a change (increase or decrease) in a touch area of the first contact position. In response to the detected movement, the computerized system may then be configured to apply corresponding pressure and/or a corresponding momentum to the displayed object.

[0325] In certain embodiments, the computerized system may be configured to identify a first contact position of the first finger of the user on a first touchpad and a second contact position of a second finger of the user on a second touchpad, located in a location that is different from the first touchpad. The computerized system may then be configured to detect a selection of an object displayed on the display screen using both the first finger and the second finger and detect a change in a characteristic of the first contact position and the second contact position. In some embodiments, the computerized system may then be configured to cause at least one property of the object to be controlled based on the change in the characteristic of the first contact position and the second contact position.
[0326] Figure 44 depicts exemplary illustrations, 4400, 4402 and 4404, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures from a user using multiple touchpads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention. In an embodiment, the user may interact with an object (e.g., virtual bow and arrow 4418 or a virtual squeeze ball 4424) displayed on the display screen 102 by placing a first finger 4406 on the touchpad 200 located on the back side of the device 100. The computerized system may then be configured to detect a selection of the object (e.g., 4418 or 4424) by the user's finger on the touchpad and identify a first contact area 4408 of the first finger 4406. In some embodiments, the computerized system may then be configured to detect a change in a characteristic of the contact area 4408 and cause a property of the displayed object to be controlled based on the detected change in the contact area.
[0327] For instance, as shown in the examples, 4400, 4402 and 4404, the computerized system may be configured to detect an increase (e.g., 50%, 100% or the like) in the contact or touch area of the first contact position 4408, for example, when the user presses the touchpad. The computerized system may then be configured to determine a corresponding force and/or pressure or load to be applied to the displayed object based on the detected increase in the touch area. In some examples, the computerized system may be configured to apply a first, relatively smaller force 4414 to the string of the virtual bow and arrow 4418 based on the first contact position 4408 of the user's finger 4406 and a second, relatively larger force 4416 to the string of the virtual bow and arrow 4418, based on detecting an increase 4410 to the touch area of the first contact position 4408. Similarly, the computerized system may be configured to apply a first, relatively smaller pressure 4420 to the virtual squeeze ball 4424 based on the first contact position 4408 of the user's finger and a second, relatively larger pressure 4422 to the virtual squeeze ball 4424, based on detecting an increase 4410 to the touch area of the first contact position 4408. Additionally, in some examples, the computerized system may be configured to reduce the pressure (4416, 4422) applied to the virtual object (4418, 4424) when the system detects a decrease in the touch area 4410 of the first contact position. In some examples, the computerized system may also be configured to detect a movement of the first finger away from the first contact position, a change in the angle of the first finger in the first contact position and the like to alter a property of the displayed object such as a change in the object's momentum, a force applied to the object or a change to other parameters of the object.
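A minimal sketch of mapping the change in touch area to an applied force is shown below; the linear gain, the baseline calibration, and the apply_force call are assumptions for illustration only:

```python
# Illustrative sketch: translating growth of the finger's contact area into
# a force or pressure applied to the displayed object.

def force_from_touch_area(baseline_area: float, current_area: float,
                          base_force: float = 1.0, gain: float = 2.0) -> float:
    """Scale the applied force with the relative growth of the contact area.

    A 100% larger contact area yields base_force * (1 + gain); a shrinking
    area reduces the force, never below zero.
    """
    growth = (current_area - baseline_area) / baseline_area
    return max(0.0, base_force * (1.0 + gain * growth))

light_pull = force_from_touch_area(baseline_area=60.0, current_area=60.0)   # 1.0
hard_pull = force_from_touch_area(baseline_area=60.0, current_area=120.0)   # 3.0
# bow_string.apply_force(hard_pull)  # hypothetical call on the displayed object
```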
[0328] In accordance with at least some embodiments, the computerized system may be configured to detect a movement of a first finger of the user from a first position to a second position on the first touchpad. Based on the detected movement, the computerized system may further be configured to reposition an object displayed in the display screen and enable, for the user, an interaction with the re-positioned object. In some examples, the computerized system may be configured to reposition the object in accordance with a direction of the movement of the first finger from the first position to the second position. In other examples, the computerized system may be configured to reposition the object in accordance with an amount of movement of the first finger from the first position to the second position.
[0329] For instance, a user may wish to operate the handheld computerized device using a single hand. In such a situation, the user may use a thumb (often in front of the device) and another finger, such as an index finger (often on the back of the device), to grip or otherwise hold the device. The remaining fingers of the user may be positioned on the back of the device, and somewhat free to move (e.g., over the touchpad 200 located on the back of the handheld device 100), but may typically be constrained by the user's hand geometry to reside near the same edge of the device where the user's thumb and index finger are gripping the device.
[0330] In certain situations, the user may not be able to reach an object displayed on the display screen from the side of the device using the user's three remaining free (non-gripping) fingers. In such a situation, the user may drag or scoop (i.e., reach out and then pull in) one or more of the remaining free (non-gripping) fingers to reach the displayed object. In certain embodiments, the computerized system may be configured to detect this dragging or scooping movement of the user's finger and/or fingers towards the location of the object displayed on the display screen. In some embodiments, the computerized system may then be configured to reposition the object to a location on the display screen that is within a pre-determined distance from the position of the user's gripping finger.
[0331] Figure 45 depicts exemplary illustrations, 4500 and 4502, of the manner in which a computerized system (e.g., the handheld computerized device 100) may detect finger gestures of a user from multiple touchpads and/or the display screen of the computerized system to control an object displayed on the display screen, in accordance with an embodiment of the present invention. In an embodiment, the user may interact with an object (e.g., a set of virtual keys 4504) displayed in a virtual control region on the display screen 102 by gripping the device using a thumb 4510 and an index finger 4512 and using the remaining free fingers to interact with the object. For instance, the user may drag or scoop (i.e., reach out and then pull in) the middle (non-gripping) finger 4506 to reach the set of virtual keys 4504. In an embodiment, the computerized system may be configured to detect the movement of the finger 4506 towards the set of virtual keys 4504 and re-position the set of virtual keys 4504 to a location that is closer (i.e., is within a pre-determined distance) to the index finger 4512 or the thumb 4510 of the user on the touchpad. Once the location of the virtual keys is brought to within striking range of the free (non-gripping) finger 4506, the user may strike the virtual keys 4504 using the free (non-gripping) finger 4506, and thus interact with the object. Additionally, the user may move the virtual keys 4504 within striking range of the non-gripping finger 4506 by moving the finger 4506 over the touchpad 200, in a direction 4508 either towards or away from the position of the set of virtual keys 4504.
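A minimal sketch of this scoop-and-reposition behavior is shown below; the screen-coordinate convention, the reach radius, the 0.3 pull-in fraction, and the helper names are hypothetical and chosen only to illustrate the idea:

```python
# Illustrative sketch: detecting a "scoop" of a free finger toward a distant
# control and pulling the control to within reach of the gripping finger.

def maybe_reposition(control_pos, finger_start, finger_end, grip_pos,
                     reach_radius=150.0):
    """If the free finger moved toward the control but cannot reach it,
    return a new control position closer to the gripping finger; otherwise
    return the control position unchanged. Positions are (x, y) tuples in
    screen coordinates (assumed)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    moved_toward = dist(finger_end, control_pos) < dist(finger_start, control_pos)
    out_of_reach = dist(finger_end, control_pos) > reach_radius
    if moved_toward and out_of_reach:
        # Place the control a fixed fraction of the way out from the grip point.
        return (grip_pos[0] + 0.3 * (control_pos[0] - grip_pos[0]),
                grip_pos[1] + 0.3 * (control_pos[1] - grip_pos[1]))
    return control_pos

new_pos = maybe_reposition(control_pos=(600, 80), finger_start=(250, 400),
                           finger_end=(320, 350), grip_pos=(80, 500))
```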
[0332] Figure 46 depicts a simplified exemplary flowchart of a method 4600 for controlling a control region on a display screen of a computerized system, in accordance with an embodiment of the present invention. The method includes obtaining, using the computerized system, first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, at 4602. In an example, the first data is not associated with an image of a finger of the user from an image sensor. The method then includes transmitting the first data from the first touchpad to the computerized device at 4604. In an embodiment, the first touchpad may be located in a location that is different from the location of the display screen. In some examples, at 4606, the method may include analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model. At 4608, the method may include detecting an interaction of a portion of the hand of the user on the first touchpad with the object displayed on the display screen. In some examples, at 4610, the method may include causing, by the computerized system, at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
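Steps 4602 through 4610 might be sketched as a single processing pass as follows; every object interface used here (touchpad.read_positions, system.hand_model, and so on) is hypothetical and stands in for whatever a concrete implementation provides:

```python
# Illustrative skeleton only: the steps 4602-4610 of method 4600 expressed
# as one processing pass over a frame of touchpad data.

def handle_touch_frame(system, touchpad, display):
    first_data = touchpad.read_positions()                   # 4602: obtain first data
    system.receive(first_data)                               # 4604: transmit to the system
    fingers = system.hand_model.assign_fingers(first_data)   # 4606: apply the hand model
    interaction = system.detect_interaction(                 # 4608: detect interaction
        fingers, display.visible_objects())
    if interaction is not None:                              # 4610: control the object
        interaction.target.apply(interaction.command)
```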
[0333] In some embodiments, the method may include detecting a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen. In an example, the object may correspond to a multipage application displayed on the display screen and the property of the object may correspond to a page number in the multipage application. In an embodiment, the method may include changing the page number of the multipage application in response to the finger swipe.
[0334] In some embodiments, the method may include detecting a finger swipe of a plurality of fingers of the user on the first touchpad when the user interacts with the object on the display screen. In an example, the method may include changing the page number by a pre-determined number of pages in response to the finger swipe of the plurality of fingers. In some examples, the method may include detecting a velocity of the finger swipe and changing the page number by the pre-determined number of pages in response to the velocity. In some examples, the method may include determining a first direction of the finger swipe and incrementing the page number by a pre-determined number of pages in response to the first direction. In some examples, the method may include determining a second direction of the finger swipe and decrementing the page number by a pre-determined number of pages in response to the second direction. In an example, the first direction may be different from the second direction.
[0335] In accordance with some embodiments, the method may include detecting an interaction of at least a second finger of the user with the first finger. In some examples, the second finger may be located on a second touchpad. In some examples, the second touchpad may be located in a location different from the first touchpad. In some examples, the second finger may also be located on the display screen. In some embodiments, the method may include identifying a first position of the first finger on the first touchpad, detecting a selection of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the first position of the first finger relative to the second position of the second finger. In some examples, the method may include detecting the selection of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some embodiments, the method may include rotating the object based on the detected movement, altering an axis of rotation of the object based on the detected movement, altering the size of the object based on the detected movement, altering a display characteristic of the object based on the detected movement or moving the object based on the detected movement.
[0336] In accordance with at least some embodiments, the method may include identifying a first position of the first finger on the display screen, detecting a selection of a point of rotation of the object, identifying a second position of the second finger on the second touchpad and detecting a movement of the second position of the second finger relative to the first position of the first finger. In some examples, the method may include detecting the selection of the point of rotation of the object by the first finger on the first touchpad or the second finger on the second touchpad. In some examples, the method may include moving the object around the point of rotation of the object based on the detected movement. In some examples, the object may correspond to a virtual joystick and the method may include pushing the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
[0337] In accordance with at least some embodiments, the method may include detecting a selection of the object, identifying a first contact position of the first finger, detecting a change in a characteristic of the first contact position and causing the at least one property of the object to be controlled based on the change in the characteristic. In an embodiment, the method may include detecting a movement of the first finger away from the first contact position, detecting a change in the angle of the first finger in the first contact position, detecting an increase in a touch area of the first contact position and the like. In an example, the characteristic may comprise at least one of the area, the size or the shape of the first contact position. In an embodiment, the method may include applying a corresponding pressure or a load to the displayed object based on the change in the characteristic of the first contact position.
[0338] In accordance with at least some embodiments, the method may include detecting a movement of the first finger from a first position to a second position on the first touchpad, repositioning the object in the display screen in accordance with a direction of the movement of the first finger or an amount of movement of the first finger from the first position to the second position, and enabling, for the user, an interaction with the re-positioned object. In an example, the object may correspond to a set of virtual keys in a virtual control region in the display screen.
[0339] The above embodiments of the present invention are illustrative and not limiting.
Various alternatives and equivalents are possible. Although the invention has been described with reference to a handheld computerized device by way of an example, it is understood that the invention is not limited by the type of computerized device or system wherever the device or system may benefit by differentiating between a user's touch on a touchpad for command input and a user's touch on a touchpad for merely holding the device by the touchpad. Although the invention has been described with reference to certain user fingers touching the touchpad by way of an example, it is understood that the invention is not limited by which user fingers are touching the touchpad. Although the invention has been described with reference to a touchpad located on the back of a handheld device including a display at the front of the device by way of an example, it is understood that the invention is not limited by where the touchpad is located. Although the invention has been described with reference to a capacitive touchpad used for data entry by way of an example, it is understood that the invention is not limited by the type of input device. Although the invention has been described with reference to a sequence of strong, weak, strong or medium, small, large force applied by a user's finger used for data entry by way of examples, it is understood that the invention is not limited by those two sequences of forces applied. Other additions, subtractions, or modifications are obvious in view of the present disclosure and are intended to fall within the scope of the appended claims.

[0340] Trademarks: iPAD™ and iPhone™ are trademarks of Apple Inc., Cupertino, California. Surface™ is a trademark of Microsoft Corporation.

Claims

WHAT IS CLAIMED IS:
1. A computer-implemented method for controlling a control region on a display screen of a computerized system, the method comprising:
obtaining, using the computerized system, first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor;
transmitting the first data from the first touchpad to the computerized system, the first touchpad being located in a location that is different from the location of the display screen;
analyzing the first data in accordance with a model of a human hand and assigning the first data to at least one of a plurality of fingers of the model;
detecting, by the computerized system, an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen; and
causing, by the computerized system, at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
2. The computer-implemented method of claim 1, wherein detecting an interaction of at least the portion of the hand comprises detecting a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen.
3. The computer-implemented method of claim 2, wherein the object corresponds to a multipage application displayed on the display screen and the at least one property of the object corresponds to a page number in the multipage application.
4. The computer-implemented method of claim 3, wherein causing the at least one property of the object to be controlled comprises changing the page number of the multipage application in response to the finger swipe.
5. The computer-implemented method of claim 1, wherein detecting an interaction of at least the portion of the hand further comprises detecting a finger swipe of a plurality of fingers of the user on the first touchpad when the user interacts with the at least one object on the display screen.
6. The computer-implemented method of claim 5, wherein causing the at least one property of the object to be controlled comprises changing the page number by a pre-determined number of pages in response to the finger swipe of the plurality of fingers.
7. The computer-implemented method of claim 6, further comprising detecting a velocity of the finger swipe, wherein causing the at least one property of the object to be controlled comprises changing the page number by the pre-determined number of pages in response to the velocity.
8. The computer-implemented method of claim 3, further comprising:
determining a first direction of the finger swipe; and
incrementing the page number by a pre-determined number of pages in response to the first direction.
9. The computer-implemented method of claim 8, further comprising:
determining a second direction of the finger swipe; and
decrementing the page number by a pre-determined number of pages in response to the second direction, the second direction being different from the first direction.
10. The computer-implemented method of claim 2, wherein detecting an interaction of at least the portion of the hand of the user on the first touchpad further comprises:
detecting an interaction of at least a second finger of the user with the first finger.
11. The computer-implemented method of claim 10, wherein the second finger is located on a second touchpad, the second touchpad being located in a location different from the first touchpad.
12. The computer-implemented method of claim 11, wherein the second finger is located on the display screen.
13. The computer-implemented method of claim 10, further comprising: identifying a first position of the first finger on the first touchpad; identifying a second position of the second finger on the second touchpad;
detecting a selection of the object; and
detecting a movement of the first position of the first finger relative to the second position of the second finger.
14. The computer-implemented method of claim 13, further comprising detecting the selection of the object using the first finger on the first touchpad.
15. The computer-implemented method of claim 13, further comprising detecting the selection of the object using the second finger of the user on the second touchpad.
16. The computer-implemented method of claim 13, wherein causing the at least one property of the object to be controlled comprises rotating the object based on the detected movement.
17. The computer-implemented method of claim 13, wherein causing the at least one property of the object to be controlled comprises altering an axis of rotation of the object based on the detected movement.
18. The computer-implemented method of claim 13, wherein causing the at least one property of the object to be controlled comprises altering the size of the object based on the detected movement.
19. The computer-implemented method of claim 13, wherein causing the at least one property of the object to be controlled comprises altering a display characteristic of the object based on the detected movement.
20. The computer-implemented method of claim 13, wherein causing the at least one property of the object to be controlled comprises moving the object based on the detected movement.
21. The computer-implemented method of claim 10, further comprising: identifying a first position of the first finger on the first touchpad; identifying a second position of the second finger on the second touchpad;
detecting a selection of a point of rotation of the object; and
detecting a movement of the second position of the second finger relative to the first position of the first finger.
22. The computer-implemented method of claim 21, further comprising detecting the selection of the point of rotation of the object using the first finger on the first touchpad.
23. The computer-implemented method of claim 21, further comprising detecting the selection of the point of rotation of the object using the second finger on the second touchpad.
24. The computer-implemented method of claim 21, wherein causing the at least one property of the object to be controlled comprises moving the object around the point of rotation of the object based on the detected movement.
25. The computer-implemented method of claim 21, wherein the object comprises a virtual joystick and wherein causing the at least one property of the object to be controlled comprises pushing the handle of the virtual joystick in a direction and magnitude defined by a corresponding direction and magnitude of the movement of the second position of the second finger relative to the first position of the first finger.
26. The computer-implemented method of claim 1, further comprising:
identifying a first contact position of a first finger;
detecting a selection of the object; detecting a change in a characteristic of the first contact position; and
causing the at least one property of the object to be controlled based on the change in the characteristic.
27. The computer-implemented method of claim 26, wherein detecting the change in the characteristic of the first contact position comprises detecting a movement of the first finger away from the first contact position.
28. The computer-implemented method of claim 26, wherein detecting the change in the characteristic of the first contact position comprises detecting a change in the angle of the first finger in the first contact position.
29. The computer-implemented method of claim 26, wherein detecting the change in the characteristic of the first contact position comprises detecting a change in a touch area of the first contact position.
30. The computer-implemented method of claim 26, wherein the characteristic comprises at least one of the area or the size of the first contact position.
31. The computer-implemented method of claim 26, wherein the characteristic comprises the shape of the first contact position.
32. The computer-implemented method of claim 26, wherein causing the at least one property of the object to be controlled comprises applying a corresponding pressure to the displayed object based on the change in the characteristic of the first contact position.
33. The computer-implemented method of claim 26, wherein causing the at least one property of the object to be controlled comprises applying a corresponding momentum to the displayed object based on the change in the characteristic of the first contact position.
34. The computer-implemented method of claim 1, further comprising:
detecting a movement of a first finger from a first position to a second position on the first touchpad; repositioning the object in the display screen in accordance with a direction of the movement of the first finger from the first position to the second position; and
enabling, for the user, an interaction with the re-positioned object.
35. The computer-implemented method of claim 1, wherein detecting an interaction of the first finger of the user further comprises:
detecting a movement of a first finger from a first position to a second position on the first touchpad;
repositioning the object in the display screen in accordance with an amount of movement of the first finger from the first position to the second position; and
enabling, for the user, an interaction with the re-positioned object.
36. The computer-implemented method of claim 1, wherein the object corresponds to a set of virtual keys in a virtual control region in the display screen.
37. A non-transitory computer-readable storage medium comprising instructions which when executed by a computer cause the computer to:
obtain first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor;
transmit the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen;
analyze the first data in accordance with a model of a human hand and assign the first data to at least one of a plurality of fingers of the model;
detect an interaction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen; and
cause at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
38. The computer-readable media of claim 37, wherein the instructions to detect an interaction of at least the portion of the hand comprise instructions to detect a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen.
39. The computer-readable media of claim 38, wherein the instructions to detect an interaction of at least the portion of the hand of the user on the first touchpad further comprise instructions to detect an interaction of at least a second finger of the user with the first finger, wherein the second finger is located on a second touchpad, the second touchpad being located in a location different from the first touchpad.
40. The computer-readable media of claim 39, further comprising instructions to:
identify a first position of the first finger on the first touchpad;
identify a second position of the second finger on the second touchpad;
detect a selection of the object; and
detect a movement of the first position of the first finger relative to the second position of the second finger.
41. The computer-readable media of claim 40, wherein the instructions to cause the at least one property of the object to be controlled comprise instructions to rotate the object based on the detected movement.
42. The computer-readable media of claim 40, wherein the instructions to cause the at least one property of the object to be controlled comprise instructions to alter the size of the object based on the detected movement.
43. The computer-readable media of claim 40, wherein the instructions to cause the at least one property of the object to be controlled comprise instructions to alter a display characteristic of the object based on the detected movement.
44. The computer-readable media of claim 37, further comprising instructions to:
identify a first contact position of a first finger;
detect a selection of the object; detect a change in a characteristic of the first contact position; and cause the at least one property of the object to be controlled based on the change in the characteristic.
45. The computer-readable media of claim 44, wherein the instructions to detect the change in the characteristic of the first contact position further comprise instructions to detect a change in a touch area of the first contact position.
46. The computer-readable media of claim 44, wherein the characteristic comprises at least one of the area, the size or the shape of the first contact position.
47. The computer-readable media of claim 44, wherein the instructions to cause the at least one property of the object to be controlled further comprise instructions to apply a corresponding pressure to the displayed object based on the change in the characteristic of the first contact position.
48. The computer-readable media of claim 37, further comprising instructions to:
detect a movement of the first finger from a first position to a second position on the first touchpad;
reposition the object in the display screen in accordance with a direction of the movement of the first finger from the first position to the second position; and
enable, for the user, an interaction with the re-positioned object.
49. A system for controlling a control region on a display screen of a computerized system configured to:
obtain first data from a first touchpad, the first data being associated with a position of a portion of the hand of a user when the user operates the computerized system using the first touchpad, the first data not being associated with an image of a finger of the user from an image sensor;
transmit the first data from the first touchpad to the computerized device, the first touchpad being located in a location that is different from the location of the display screen; analyze the first data in accordance with a model of a human hand and assign the first data to at least one of a plurality of fingers of the model;
detect an intera ction of at least the portion of the hand of the user on the first touchpad with at least one object displayed on the display screen; and
cause at least one property of the object to be controlled in accordance with the interaction of the portion of the hand of the user on the first touchpad.
50. The system of claim 49, further configured to detect a finger swipe of a first finger of the user on the first touchpad when the user interacts with the at least one object on the display screen.
51. The system of claim 49, further configured to detect an interaction of at least a second finger of the user with the first finger, wherein the second finger is located on a second touchpad, the second touchpad being located in a location different from the first touchpad.
52. The system of claim 51 further configured to:
identify a first position of the first finger on the first touchpad;
identify a second position of the second finger on the second touchpad; detect a selection of the object; and
detect a movement of the first position of the first finger relative to the second position of the second finger.
53. The system of claim 51 further configured to rotate the object based on the detected movement.
54. The system of claim 51 further configured to:
identify a first contact position of a first finger;
detect a selection of the object;
detect a change in a characteristic of the first contact position; and cause the at least one property of the object to be controlled based on the change in the characteristic.
55. The system of claim 51 further configured to apply a corresponding pressure to the displayed object based on the change in the characteristic of the first contact position.
PCT/US2014/070112 2013-12-14 2014-12-12 Method for detecting user gestures from alternative touchpads of a handheld computerized device WO2015089451A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361916168P 2013-12-14 2013-12-14
US61/916,168 2013-12-14
US14/568,492 US9678662B2 (en) 2010-04-23 2014-12-12 Method for detecting user gestures from alternative touchpads of a handheld computerized device
US14/568,492 2014-12-12

Publications (1)

Publication Number Publication Date
WO2015089451A1 true WO2015089451A1 (en) 2015-06-18

Family

ID=53371886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/070112 WO2015089451A1 (en) 2013-12-14 2014-12-12 Method for detecting user gestures from alternative touchpads of a handheld computerized device

Country Status (1)

Country Link
WO (1) WO2015089451A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030043114A1 (en) * 2001-09-04 2003-03-06 Miika Silfverberg Zooming and panning content on a display screen
US20060084482A1 (en) * 2004-10-15 2006-04-20 Nokia Corporation Electronic hand-held device with a back cover keypad and a related method
US20100110025A1 (en) * 2008-07-12 2010-05-06 Lim Seung E Control of computer window systems and applications using high dimensional touchpad user interface
US20100045703A1 (en) * 2008-08-22 2010-02-25 Google Inc. User Interface Gestures For Moving a Virtual Camera On A Mobile Device
US20100156813A1 (en) * 2008-12-22 2010-06-24 Palm, Inc. Touch-Sensitive Display Screen With Absolute And Relative Input Modes
US20110242138A1 (en) * 2010-03-31 2011-10-06 Tribble Guy L Device, Method, and Graphical User Interface with Concurrent Virtual Keyboards
US20110261058A1 (en) * 2010-04-23 2011-10-27 Tong Luo Method for user input from the back panel of a handheld computerized device
US20120105464A1 (en) * 2010-10-27 2012-05-03 Google Inc. Animated page turning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WIGDOR ET AL.: "LucidTouch: A See-Through Mobile Device", SCIENTIFIC PAPER, 2007, Retrieved from the Internet <URL:http://www.cliftonforlines.com/papers/2007_wigdor_lucidtouch.pdf> *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3995188A4 (en) * 2020-01-21 2022-11-09 Tencent Technology (Shenzhen) Company Limited Display method and apparatus for interactive interface, and storage medium and electronic apparatus
JP7348396B2 (en) 2020-01-21 2023-09-20 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Interactive interface display method, device, computer program and electronic device

Similar Documents

Publication Publication Date Title
US9678662B2 (en) Method for detecting user gestures from alternative touchpads of a handheld computerized device
US9891821B2 (en) Method for controlling a control region of a computerized device from a touchpad
US9891820B2 (en) Method for controlling a virtual keyboard from a touchpad of a computerized device
US20160364138A1 (en) Front touchscreen and back touchpad operated user interface employing semi-persistent button groups
US20170017393A1 (en) Method for controlling interactive objects from a touchpad of a computerized device
US9529523B2 (en) Method using a finger above a touchpad for controlling a computerized system
US9542032B2 (en) Method using a predicted finger location above a touchpad for controlling a computerized system
US9477874B2 (en) Method using a touchpad for controlling a computerized system with epidermal print information
US9311724B2 (en) Method for user input from alternative touchpads of a handheld computerized device
US20150363038A1 (en) Method for orienting a hand on a touchpad of a computerized system
US9430147B2 (en) Method for user input from alternative touchpads of a computerized system
US8384683B2 (en) Method for user input from the back panel of a handheld computerized device
US9857868B2 (en) Method and system for ergonomic touch-free interface
WO2018053357A1 (en) Touch sensitive keyboard
WO2012048380A1 (en) Virtual keyboard
EP2767888A2 (en) Method for user input from alternative touchpads of a handheld computerized device
JP2009527041A (en) System and method for entering data into a computing system
US20140253486A1 (en) Method Using a Finger Above a Touchpad During a Time Window for Controlling a Computerized System
US9639195B2 (en) Method using finger force upon a touchpad for controlling a computerized system
Sax et al. Liquid Keyboard: An ergonomic, adaptive QWERTY keyboard for touchscreens and surfaces
EP2474890A1 (en) Virtual keyboard configuration putting fingers in rest positions on a multitouch screen, calibrating key positions thereof
Rekimoto Organic interaction technologies: from stone to skin
WO2015178893A1 (en) Method using finger force upon a touchpad for controlling a computerized system
WO2015042444A1 (en) Method for controlling a control region of a computerized device from a touchpad
WO2015013662A1 (en) Method for controlling a virtual keyboard from a touchpad of a computerized device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14870043

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.10.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14870043

Country of ref document: EP

Kind code of ref document: A1