WO2007093984A2 - A system and method of inputting data into a computing system - Google Patents

A system and method of inputting data into a computing system

Info

Publication number
WO2007093984A2
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual
keyboard
hand
gesture
Prior art date
Application number
PCT/IL2007/000174
Other languages
French (fr)
Other versions
WO2007093984A3 (en)
Inventor
Harel Cohen
Giora Bar-Sakai
Original Assignee
Ftk Technologies Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ftk Technologies Ltd. filed Critical Ftk Technologies Ltd.
Priority to EP07706117A priority Critical patent/EP1999547A4/en
Priority to JP2008554905A priority patent/JP2009527041A/en
Publication of WO2007093984A2 publication Critical patent/WO2007093984A2/en
Publication of WO2007093984A3 publication Critical patent/WO2007093984A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
            • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
            • G06F 3/018 Input/output arrangements for oriental characters
        • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
            • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
                • G06F 3/0233 Character input methods
                    • G06F 3/0237 Character input methods using prediction or retrieval techniques
        • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
            • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                • G06F 3/042 Digitisers characterised by opto-electronic transducing means
                    • G06F 3/0425 Digitisers using a single imaging device, such as a video camera, for tracking the absolute position of a single object or a plurality of objects with respect to an imaged reference surface, e.g. a video camera imaging a display, projection screen, table or wall surface on which a computer-generated image is displayed or projected
                        • G06F 3/0426 Digitisers tracking fingers with respect to a virtual keyboard projected or printed on the surface
        • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
            • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                    • G06F 3/04886 Interaction techniques by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
                • G06F 3/0489 Interaction techniques using dedicated keyboard keys or combinations thereof
                    • G06F 3/04895 Guidance during keyboard input operation, e.g. prompting

Definitions

  • "Gesture", as used herein, may include movement(s) and/or signal(s) and/or indication(s) and/or sign(s) and/or instruction(s) and/or request(s), and the like, made by body part(s) of a person operating a keyboard.
  • By "command" is meant herein a gesture, or a series or combination of gestures, used to instruct, request or order a computer to change the meaning or interpretation (assigning, or reassigning, a symbol) of selected keys, or to change the meaning or interpretation of the entire keyboard layout, in accordance with the gesture, or the series or combination of gestures.
  • Embodiments as described herein may ameliorate human-computer interaction (HCI) problems associated with, in particular but not only, languages that have many symbols. Such embodiments may enhance the speed of data inputting, offer a complete solution to language data entry tasks, substantially reduce the number of typing errors and improve the usability of language word processors by presenting a user-friendly data input environment.
  • Data inputting system 100 may include at least one image acquisition, or capturing, device, such as image acquisition device 110, which may be, for example, a digital camera, video camera, PC camera, Webcam, and so on, which may be located on a computer display monitor 120, for example.
  • image acquisition device 110 may be located at different locations, provided that the position(s), location(s), movement(s) and gesture(s) of the user's hand(s), or other parts of the user's body for that matter, are clearly visible to, that is, appear in the field of view (FOV) of, image acquisition device 110.
  • Data inputting system 100 may further include a controller (not shown) associated with, or functionally coupled to, image acquisition device 110 and adapted to set a map of a key(s) or a map of the entire keyboard layout, based on a signal that is generated and outputted by image acquisition device 110 to the controller, which signal represents an image(s) in the FOV of image acquisition device 110 relating to, and including, a gesture(s) or movement(s).
  • the mapping of a key(s) may include changing the symbolic meaning assigned to the key(s) in accordance with movements or gestures made by, or associated with, a user such as the user whose (real) hands only are shown, at 131 and 132, resting on physical keyboard 130.
  • the controller may be an integral part of, or embedded or incorporated into, a computer (PC, laptop and the like) that receives input signals from a keyboard such as keyboard 130 and operates a display screen such as display screen 120.
  • the user may move his hands 131 and/or 132 from one position to another, in respect of, or relative to, physical keyboard 130, while signals relating to images of hands 131 and 132, which are acquired by image acquisition device 110, are constantly, or intermittently, forwarded to data inputting system 100 for processing and interpretation.
  • Data inputting system 100 may process and interpret the signal relating to the acquired images to identify gesture(s) and/or movement(s) made by the user by his hand(s) or other body part(s), and execute commands in accordance, or in connection, with the gesture(s) and/or movement(s).
  • Physical keyboard 130 may be a standard keyboard (with symbols marked thereon), blank keyboard (a keyboard with no markings on the keys), paper keyboard (a drawing of a keyboard with any number of keys, for example), touch pad, keypad, imaginary keyboard (flat naked surfaces such as tables and boards), and so on.
  • Data inputting system 100 may also utilize Word application(s) suitable for processing the language(s) being used (for example English, Hindi and German).
  • the controller of data inputting system 100 may utilize digital signal processing ("DSP") techniques, for processing images captured by image acquisition device 110, and simulation techniques for displaying a corresponding virtual keyboard, such as virtual keyboard 140, on a computer screen such as computer screen 120.
  • the number, size and spacing of the keys on virtual keyboard 140 may substantially resemble those of physical keyboard 130, to facilitate the user's orientation. According to other aspects, the number, size or spacing of the keys on virtual keyboard 140 may differ from those of physical keyboard 130.
  • the controller of data inputting system 100 may cause the symbol(s), or meaning, assigned to a key(s), and/or the symbols or meaning assigned to the entire layout of virtual keyboard 140, to change according to a corresponding user's gesture or movement, which may be identified, recognized or interpreted by the controller of data inputting system 100 from the acquired, or captured image(s).
  • the controller of data inputting system 100 may utilize simulation techniques for creating and handling virtual hand(s) and cause virtual hand(s) to appear and move on the display screen in accordance with the user's (real, physical) hand(s) position, location and movement.
  • Virtual hands 121 and 122 are shown in Fig. 1A, reflecting the user's hands 131 and 132, respectively.
  • Virtual keyboard 140 and/or virtual hands 121 and/or 122 may be likewise or differently scaled to facilitate ease of data inputting.
  • a user may place his hands, shown as 131 and 132, and make a gesture, or a series or combination of gestures, in the FOV of image acquisition device 110, which is/are associated with the requested language.
  • the gesture, or series or combination of gestures may then be recognized or interpreted by the controller of data inputting system 100 as being associated with the requested language.
  • the controller of data inputting system 100 may assign symbol(s) constituting the requested language to selected keys on the virtual keyboard 140 and display the layout of virtual keyboard 140 with the currently requested assigned symbol(s).
  • at this point, the controller of data inputting system 100 has set a map of the keyboard layout which corresponds to the requested language. Once the requested language has been set by the controller of data inputting system 100 responsive to the user's command, the user may enter data into data inputting system 100 by observing keys on virtual keyboard 140 and moving his hand (131 or 132), or certain finger(s) thereof, across physical keyboard 130.
  • the user may move his hand, or certain finger(s) thereof, until the respective virtual hand (121 or 122), or corresponding virtual finger(s) thereof, reaches the vicinity of the next key on virtual keyboard 140 to be depressed, and a finger of virtual hand 121 or 122 overlaps that key. Then, the user may depress the key on physical keyboard 130 underneath the finger corresponding to, or associated with, the virtual finger overlapping the requested key on virtual keyboard 140. The above-described procedure may be repeated as many times as required for inputting additional symbols. Should the user wish to change to, or to set, a different language, the user may pose a gesture, or series or combination of gestures, that corresponds to the different language.
  • Every time the controller of data inputting system 100 is requested to set a different map of keys, that is, to set a different language, the controller may enable the corresponding WORD application/processor. For example, if the controller of data inputting system 100 is requested to change from French to English, then the controller may disable the French WORD application/processor and enable the English application/processor.
  • Physical keyboard 130 is functionally coupled to the controller of data inputting system 100, or to a computer within which the controller of data inputting system 100 resides, for forwarding to the controller signals representative of default symbols or functions associated with the keys in physical keyboard 130. Nevertheless, the controller of data inputting system 100 is adapted, or configured, to interpret signals forwarded to it from physical keyboard 130 according to a current mapping setting.
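  By way of illustration only, the mapping-dependent interpretation described above can be pictured as a small lookup table that the controller swaps whenever a gesture command selects a new language. The Python sketch below is a hypothetical reading, not part of the disclosure; the scancode names and the Devanagari assignments are invented for illustration.

      # Hypothetical sketch: the controller keeps a mutable table from raw
      # key signals (scancodes) to the symbols currently assigned to them.
      current_map = {"KEY_Q": "q", "KEY_W": "w"}          # default English layout
      HINDI_MAP = {"KEY_Q": "\u0914", "KEY_W": "\u0948"}  # illustrative Devanagari

      def set_layout(new_map: dict) -> None:
          """Called when a recognized gesture requests a different language."""
          current_map.clear()
          current_map.update(new_map)

      def interpret_scancode(scancode: str) -> str:
          """Interpret a raw signal from the physical keyboard according to
          the current mapping; unmapped keys pass through unchanged."""
          return current_map.get(scancode, scancode)

      # e.g. set_layout(HINDI_MAP); interpret_scancode("KEY_Q") -> 'औ'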
  • Data inputting system 100 has several advantages over prior art solutions. For example, a user inputting data into the system does not have to shift gaze, back and forth, between the physical keyboard (keyboard 130, for example) and the screen displaying the resulting typing (display screen 120, for example). Instead, the user need only gaze at the virtual keyboard (virtual keyboard 140, for example), and see virtual hands (virtual hands 121 and 122, for example) positioned and moving in correlation with the position(s) and movement(s) of his (real) hands (hands 131 and 132, for example).
  • a symbol or function may be assigned to a key depending on the language, mode or function that is requested by a user (by performing corresponding movement or gesture), so that a given key, when depressed by the user after it has been assigned the new language, mode or function, will be interpreted by the controller of the data inputting system 100 in a different way.
  • the controller of data inputting system 100 may change the appearance of virtual layouts (such as virtual layout 140) responsive to commands issued, or posed, by the user.
  • the controller may change the keyboard architecture or arrangement of keys, for example by changing the number of keys, size, spacing and/or placing of keys on the virtual keyboard, depending on a desired application(s).
  • a layout of a virtual keyboard may be changed according to a real time simulation of the user's hands and the positioning and movement of the user hands over a physical keyboard, whether the keyboard is real (with actual labels marked on respective keys), blank or a paper keyboard.
  • Another advantage of data inputting system 100 is that the same physical keyboard (for example physical keyboard 130) may be used to enter as many sets of symbols (each of which belonging to a different language) as the number of available WORD applications/processors.
  • the controller of data inputting system 100 may locate, at any given moment and in real time, the position and/or movement of the user's hands and fingers, and mimic them by displaying virtual fingers in the appropriate position over the keys of virtual keyboard 140. This allows the user to view his/her hand positioning and movements on monitor 120, thereby giving the user confidence in his/her finger placements above any key on the physical keyboard 130, regardless of the selected language, at any given moment, before pressing down the key and without having to look down at the physical keyboard (keyboard 130, for example).
  • the controller may cause the virtual hands (hands 121 and 122, for example) to mimic the movement and shape of the finger in respect of the virtual keyboard.
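  The disclosure does not specify how this mirroring is computed. As a minimal sketch, assuming the physical keyboard's bounding box has already been located in the camera image, a detected fingertip can be projected onto the on-screen virtual keyboard with a normalize-and-scale transform; all function and parameter names below are hypothetical.

      from typing import Tuple

      def map_fingertip(pt_cam: Tuple[float, float],
                        kb_origin: Tuple[float, float],
                        kb_size: Tuple[float, float],
                        vkb_origin: Tuple[int, int],
                        vkb_size: Tuple[int, int]) -> Tuple[int, int]:
          """Map a fingertip seen in camera coordinates onto the on-screen
          virtual keyboard, so a virtual finger can be drawn over the
          corresponding virtual key."""
          # Normalize relative to the physical keyboard's bounding box.
          nx = (pt_cam[0] - kb_origin[0]) / kb_size[0]
          ny = (pt_cam[1] - kb_origin[1]) / kb_size[1]
          # Clamp so the virtual finger never leaves the virtual keyboard.
          nx, ny = min(max(nx, 0.0), 1.0), min(max(ny, 0.0), 1.0)
          # Scale into the virtual keyboard's screen rectangle.
          return (int(vkb_origin[0] + nx * vkb_size[0]),
                  int(vkb_origin[1] + ny * vkb_size[1]))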
  • data inputting system 100 may enable processing images of one or two hands, as well as other body movements.
  • data inputting system 100 may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry.
  • an additional image acquisition device such as image acquisition device 110 may be used in cases where different parts of a user's body cannot be placed in the FOV of a single image acquisition device. In such cases, each image acquisition device may be spatially located to acquire images associated with different body parts.
  • Virtual keyboard 150, which is shown displayed on computer screen 155, can be adjusted, adapted or modified by the controller of the data inputting system to be as large or small as required or desired. The controller can also change the location of virtual keyboard 150 relative to screen 155.
  • Virtual keyboard 150 is shown displaying a current Indian language keyboard setup, or layout, as may be defined by the language/script set (out of 50 options, for example) being employed, or by a macro defined for a function key or for the configured keyboard.
  • as a physical finger is moved from one key to another on a physical keyboard (such as physical keyboard 130 of Fig. 1A), a corresponding graphical change may be made in respect of virtual keyboard 150, which results in a movement of the hand(s) (shown as 160 in Fig. 1B) from one location to another on virtual keyboard 150 that mimics, or reflects, the movement of the user's hands to the appropriate (desired) physical key.
  • the controller of the data inputting system may change the appearance of user's hands 160 according to a direct command issued by the user, or according to the result of such a command.
  • a data inputting system, such as data inputting system 100 of Fig. 1A, may allow a user to change the labeling and/or layout of a virtual keyboard such as virtual keyboard 150. If a user wants to write a document using an Indian language (Hindi with Devanagari script, in the case represented in Fig. 1B), s/he may pose the gesture associated with that language, as described below.
  • a user may change keyboard modes or functions, for example, s/he may change between languages, characters, letters, graphics of the keys on the virtual keyboard and so on, by indicating suitable gestures with his/her hands/fingers. Additionally, the appearance and/or the degree of transparency of the virtual hand(s) may change according to the actual keyboard being used. For example, virtual hand 172 is shown less transparent than virtual hand 173, as they each relate to a different keyboard layout (to keyboards 170 and 171, respectively). Virtual keyboard 175 is shown having only six selected symbols (collectively designated 174). Virtual keyboard 175 may be shown as semi-transparent.
  • Fig. 1D depicts examples of signals, or gestures, that a user may pose to an image acquisition device such as image acquisition device 110, to command a data inputting system, such as data inputting system 100, to change languages or modes, enter data, change functions and so on.
  • the user may use an object and/or his/her left hand to create a selected signal/gesture that may be captured by the image acquisition device, causing virtual keyboard keys to be desirably mapped.
  • Fig. 1D depicts ten exemplary hand gestures, each of which is assigned a unique hand gesture number, for example. Each hand gesture number may be associated with a specific command or action to be executed or taken by the controller of the data inputting system.
  • hand gesture number 5, shown at 182 and associated with hand gesture description 181, may indicate to, command or signal the controller of the data inputting system to change the layout, or mapping, of a virtual keyboard (for example the layout of virtual keyboard 150 of Fig. 1B) from one language to another. Thereafter, the user may use the changed virtual layout to enter (type) characters or symbols of the other language by moving his hand(s)/finger(s) over a physical keyboard, so as to create corresponding virtual hand(s) that move in correlation with the changed virtual keyboard.
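  One plausible realization of this gesture-number-to-command association is a dispatch table. The sketch below is an assumption for illustration only; the particular gesture numbers, layout names, the controller object and its methods are invented, not taken from the disclosure.

      # Hypothetical gesture-number dispatch table.
      GESTURE_COMMANDS = {
          3: ("set_layout", "hindi_devanagari"),
          5: ("set_layout", "english_qwerty"),
          7: ("set_mode", "mouse_navigation"),
      }

      def execute_gesture(gesture_number: int, controller) -> None:
          """Apply the command bound to a recognized gesture number; the
          controller object and its methods are assumed for illustration."""
          action, arg = GESTURE_COMMANDS.get(gesture_number, (None, None))
          if action == "set_layout":
              controller.set_virtual_keyboard_layout(arg)  # relabel virtual keys
          elif action == "set_mode":
              controller.set_mode(arg)
          # Unrecognized gesture numbers are simply ignored.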
  • identifying signals, commands, instructions and the like by a data inputting system such as data inputting system 100 may be implemented by first identifying or recognizing the hand gesture or signal (by an image acquisition device such as image acquisition device 110 of Fig. 1A), then translating the hand gesture into the corresponding hand gesture number, and using the hand gesture number as described hereinbefore.
  • Data inputting system 100 of Fig. 1A may be instructed by a user (by displaying corresponding gestures or movements to image acquisition device 110) to receive signals manually, automatically or following a selected command, for example, after depressing the "Reset" button on the physical keyboard.
  • Any number and type of hand gestures, signals and/or movements and/or other suitable signals and/or gestures and/or movements made by body parts and/or objects and so on, may be used as commands to the controller of the data inputting system.
  • left and/or right hand positions and/or movements may be captured, as may facial movements, head movements, finger movements, shoulder movements or other suitable movements which a user may use to indicate a command.
  • the data inputting system may allow a user to change, in a minimal number of keystrokes or other actions, the layout, mode, functions, and so on, of keys in a virtual keyboard such as virtual keyboard 150 of Fig. 1B.
  • the user may make a gesture that is associated with a chosen layout, and then subsequently type one or more keys in order to enter the required data.
  • a data entry which may normally require several key entries in order to change layouts, keys and so on, and arrive at the required layout, may be done by applying a combination of a gesture and typing of the selected key.
  • the user may enter a command using a gesture or signal, for example, to change the keyboard key labels and/or layout on the virtual keyboard.
  • This change in the virtual keyboard may cause the required characters to be displayed on the virtual keyboard, such that a minimal number of keystrokes are required to enter selected keys. Therefore, only one keystroke may be required to enter any selected character from a set of characters of a language with many distinct characters.
  • Other actions and/or combinations of actions may be implemented as well.
  • the data inputting system may include an evaluation and predictability application for helping the controller of the data inputting system determine anticipated keys and/or keyboard layout (map) that may be subsequently required or desired by the user.
  • the predictability may be based on the evaluation of user's previous command(s), for example, commands previously issued by using hand gesture(s), movement(s), mouse movement(s), key entry and so on.
  • the currently used language Word application may interpret a combination of two or more specific keys to be equivalent to entry of selected characters.
  • the predictability application may, for example after hitting the first of the combination keys, automatically update other relevant keys to complete the possible functions resulting from combinations of the first key with various other keys.
  • For example, once the user hits a first key, say "A", the virtual keyboard may be immediately changed to display all the relevant commands or keys that may be entered in combination with "A". In this way, the user does not need to remember or use a physical table to discover key combinations; rather, the relevant combinations may be dynamically updated on the virtual keyboard in real time.
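  A minimal Python sketch of this predictive relabeling, under stated assumptions: COMBOS is a hypothetical table of two-key sequences accepted by the current word processor, and the Devanagari outputs are purely illustrative rather than an actual transliteration scheme.

      # After the first key of a combination is struck, show on the virtual
      # keyboard only the keys that can complete a known combination.
      COMBOS = {
          ("a", "a"): "\u0906",  # illustrative: "aa" -> Devanagari AA
          ("a", "i"): "\u0910",  # illustrative: "ai" -> Devanagari AI
          ("s", "h"): "\u0936",  # illustrative: "sh" -> Devanagari SHA
      }

      def relabel_after(first_key: str) -> dict:
          """Return {next_key: resulting_symbol} for every combination that
          starts with first_key; the controller can paint these symbols onto
          the relevant virtual keys in real time."""
          return {nxt: sym for (fst, nxt), sym in COMBOS.items() if fst == first_key}

      # relabel_after("a") -> {"a": "आ", "i": "ऐ"}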
  • Referring to Fig. 1E, table 190 depicts some Indian-language symbols and the respective English letters that are to be inputted to obtain those symbols.
  • Indian-language characters may be obtained using a conventional method according to which a single English letter, or a combination of two, three, four or five English letters or signs, has to be typed (entered, or keyed in).
  • character 193 is obtainable by entering the letter "s” (194)
  • character 195 is obtainable by entering a combination of letters "s/t/r” (196). Accordingly, five keystrokes are required to obtain character 195.
  • Table 191 depicts a way of obtaining the same Indian-language characters (shown at 197) by entering only one English character (a one-strike implementation) in combination with corresponding hand gestures.
  • character 193 is obtained by entering the character "s” (198), as before (194), and without using any hand gesture, because using one character (194 or 198) is simple enough.
  • to obtain character 195, only one character may be entered (for example the character "s") in combination with a corresponding hand gesture 199 (Hand Gesture 3, in this example).
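  The one-strike scheme of table 191 can be pictured as a lookup keyed on the (English character, hand gesture) pair. The sketch below is hypothetical: gesture number 0 stands for "none", and the Devanagari outputs are illustrative only.

      from typing import Optional

      # (typed character, active hand gesture) -> resulting character
      ONE_STRIKE = {
          ("s", 0): "\u0938",  # illustrative: plain "s" -> Devanagari SA
          ("s", 3): "\u0937",  # illustrative: "s" + Hand Gesture 3 -> SSA
      }

      def one_strike(key: str, gesture: int) -> Optional[str]:
          """Resolve a single keystroke plus the currently posed gesture to a
          character; returns None for pairs that have no assignment."""
          return ONE_STRIKE.get((key, gesture))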
  • In Fig. 1F, several examples of mapping are schematically illustrated and described, which correspond to the Indian-language characters shown in Fig. 1E.
  • Fig. 1F will be described in association with Fig. 1E.
  • the initial or default English character "S" is symbolically shown (at 184) assigned the Indian-language character 183, since the character "S" was entered, according to this example, without any hand gesture ("Hand Gesture" equal "none", at 185 in both figures).
  • the initial or default English character "S" is symbolically shown (at 186) assigned the Indian-language character 187, since the character "S" was entered, according to this example, with a hand gesture ("Hand Gesture" equal "3", at 188 in both figures).
  • Referring to Fig. 2A, a series of operations or processes is schematically illustrated that may be implemented to operate the data entry system.
  • a user may configure or initialize the data inputting system's software on his/her computer. Once the software is executed and functioning, a calibration screen may be shown, indicating that the system is beginning or has begun operations.
  • the user may place a real (physical) keyboard in the view of the camera.
  • the data inputting system may employ a keyboard identification function to identify the keyboard position and keys, for example, and may subsequently notify the user that the keyboard has been identified and that the data inputting system is ready to operate.
  • the user may place his/her hand(s) in the field of vision of the image acquiring device.
  • the data inputting system may capture and process the images of the hand(s), after which the data inputting system may notify the user that hand(s) identification has been completed.
  • the user may operate a word processing application, according to the actual language being used.
  • the data inputting system may display a virtual keyboard on the data inputting system's display, with a default set of keys, according to the selected word processing application.
  • the user may type commands into the real keyboard, while looking at his/her virtual hand(s) correlated movements on the virtual keyboard, to enter data in a selected language, as required.
  • the virtual keyboard may depict virtual fingers actively moving to depress selected keys, thereby indicating actual entry of commands.
  • the user may make a selected signal by using one or two hands (or other body parts).
  • the signal(s) may be selected from a set of pre-configured signals or actions.
  • the data inputting system may capture and process the hand signal, and enter the required user command(s) or data.
  • the keys and/or layout on the virtual keyboard may be changed in accordance with the user's command(s), for example by entering a function key, combination of keys, mouse action or command, combination of key entry(ies) and mouse action(s) and so on. Any combination of the above steps may be implemented. Further, other steps or series of steps may be used instead of and/or in addition to steps specified hereinbefore.
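  Gathered into code, the steps above suggest an acquisition-and-rendering loop along the following lines. This is only a sketch: it assumes an OpenCV-style camera interface, and the keyboard identification, hand detection and gesture classification helpers are stubs standing in for functionality the disclosure leaves unspecified.

      import cv2  # assumed frame source; any camera API would do

      def identify_keyboard(frame):
          """Stub keyboard identification: locate the physical keyboard's
          bounding box in the frame (here, trivially, the whole frame)."""
          h, w = frame.shape[:2]
          return (0, 0, w, h)

      def detect_hands(frame, keyboard_box):
          """Stub hand detection: fingertip positions relative to the keyboard."""
          return []

      def classify_gesture(hands):
          """Stub gesture classification: a gesture number, or None."""
          return None

      cap = cv2.VideoCapture(0)
      ok, frame = cap.read()
      keyboard_box = identify_keyboard(frame) if ok else None
      while ok:
          hands = detect_hands(frame, keyboard_box)
          gesture = classify_gesture(hands)
          if gesture is not None:
              pass  # dispatch to a command table, e.g. as sketched earlier
          # ... draw the virtual keyboard and virtual hands here ...
          ok, frame = cap.read()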
  • the method may enable processing of one or two hands, as well as other body movements.
  • the method may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry.
  • Referring to Fig. 2B, image acquisition device 290 may capture a gesture made, or posed, by hand 292 of a user (not shown).
  • the captured gesture may be identified, for example, as gesture number 3 in Fig. 1D.
  • the virtual keyboard (291) may be changed to display layout number 3 (for example), which corresponds to gesture number 3.
  • a Word Processor may change to mode 3 of operation (change language).
  • the user may depress a key(s) on physical keyboard 293.
  • the controller (not shown) of data inputting system 201 may simulate and display the user's finger(s) (shown at 294) hitting a corresponding key(s) on virtual keyboard 291, while it substantially correlates position(s) and movement(s) of virtual hands 294 to position(s) and movement(s) of physical hands 292 of the user using the data inputting system 201.
  • Computer 286 is functionally coupled to physical keyboard 293, from which it may receive signals that represent depressed keys, and to display screen 295, to which it forwards, among other things, an image of the virtual hands and virtual keyboards.
  • a virtual hand may simulate, or mimic, mouse-like navigation; a user may enter data into a Computer (304) and/or operate graphical applications by using a virtual mouse.
  • Computer 304 includes the controller (not shown) of data inputting system 306.
  • image acquisition device 301 may capture a user's movement or body part, for example a hand (shown at 302), which may move in order to implement mouse-type navigation.
  • the direction of movement of the hand, for example in an X-Y plane, is observed and forwarded to Computer 304.
  • the movement or gesture(s) of the user may be captured by image acquisition device 301.
  • captured image(s) of gesture(s) may be processed to enter user's command(s) and/or data.
  • the commands and/or data and so on may be entered into Computer 304, where they may be executed accordingly, for example, by navigating on display 303, changing modes and/or functions, entering specific commands and so on.
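  A minimal sketch of such mouse-type navigation follows, assuming an upstream tracker supplies a hand centroid per captured frame; the class name and the GAIN constant are arbitrary illustrative choices, not taken from the disclosure.

      GAIN = 4.0  # arbitrary hand-to-cursor scaling factor

      class VirtualMouse:
          """Turn frame-to-frame hand displacement in the camera's X-Y plane
          into cursor movement on a screen of the given size."""

          def __init__(self, screen_w: int, screen_h: int):
              self.x, self.y = screen_w / 2, screen_h / 2
              self.w, self.h = screen_w, screen_h
              self.prev = None  # last observed hand centroid

          def update(self, centroid):
              """Feed the current hand centroid; returns the new cursor position."""
              if self.prev is not None:
                  self.x += (centroid[0] - self.prev[0]) * GAIN
                  self.y += (centroid[1] - self.prev[1]) * GAIN
                  self.x = min(max(self.x, 0), self.w - 1)
                  self.y = min(max(self.y, 0), self.h - 1)
              self.prev = centroid
              return (int(self.x), int(self.y))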
  • Computer 286 of Fig. 2B or Computer 304 of Fig. 3 may be any suitable conventional computer provided that it includes, in addition to its normal hardware and software components, virtual reality enabling hardware and software applications required for analyzing acquired images to determine movements and gestures made, or posed, by a user, and for generating, and generally handling, a virtual keyboard and virtual hand(s).

Abstract

A system and method are provided to enable data entry into a computing system. The system may include a controller functionally coupled to an image acquisition device and adapted to set a map of an input key, or an entire keyboard layout, based on acquired image(s) captured by the image acquisition device. The system may capture images of user movements and/or gestures in a selected field of view, and may process these images to identify and execute commands in accordance with the movements.

Description

A SYSTEM AND METHOD OF INPUTTING DATA INTO A COMPUTING
SYSTEM
FIELD OF THE DISCLOSURE
[001] The present disclosure generally relates to the field of data inputting devices. More specifically, the present disclosure relates to a method for facilitating multilingual data inputting and to a system utilizing the multilingual data inputting method.
BACKGROUND
[002] The inputting of information into electronic systems such as personal computers (PCs), mobile phones, palm computers, aircraft computers, and the like, using data input devices with keys, for example alphanumeric keyboards, touch pads or touch screens, collectively referred to herein as "keyboard", has not changed significantly ever since the PC was invented.
[003] Almost every computer comes equipped with a keyboard as the main form of interaction between a user and a computer. Keyboards are designed for the input of data such as text and other types of characters (collectively referred to hereinafter as "symbols" or "key labels"), and also for controlling the operation of the computer. Physically, computer keyboards are an arrangement of rectangular or near-rectangular buttons, or "keys". Keyboards typically have one or more symbols engraved, printed or otherwise marked on each key; in most cases, each press of a key corresponds to a single symbol being entered into the computer and, in many cases, displayed on the computer's display screen. However, producing some symbols requires pressing and holding several keys simultaneously, or in sequence. Other keys can produce actions when pressed, and further actions may be available by simultaneously pressing more than one action key.
[004] There exist a large number of different keyboard layouts (arrangements of keys and assignments of symbols to keys). The need for different keyboard layouts arises because different people may need to use different sets of symbols, typically because they write and read in different languages. Depending on the application, the number of keys on a keyboard generally varies from the standard 101 keys, to the 104-key Windows keyboards, all the way up to 130 keys with some programmable keys. There are also compact variants that have fewer than 90 keys; they are normally found in laptops or in desktop computers with space constraints. The most common modern-day key arrangement on English-language computer and typewriter keyboards is called the QWERTY design, after the first six letters in the keyboard's top row of letters.
[005] Most of the information or data is typically typed, or keyed in, using a keyboard with fixed key functions. Particularly with PCs, dual-language keyboards normally have second-language alphabet symbols marked on the keyboard keys together with the English alphabet, or a replaceable layout that allows changing the language, with the switching between the dual-language functions of each key generally achieved by software instructions. On a standard keyboard, a user may usually view up to three symbols imposed on each depressible key, defining different languages and options provided by different software and system operation methods. A computer mouse may also serve independently for choosing menu-like options and for graphic command inputs.
[006] Standard keyboards suffer from a number of disadvantages and limitations. For example, standard keyboards normally contain function keys, for example with symbols F1 to F12, which hide functions that are defined in separate instructions. Often, the user has to learn and memorize these hidden functions or retrieve their meaning from a lookup table, a "Help" directory, or other sources. Moreover, such keyboards are limited in the number of keys, and therefore in key functions. Normally, a nonprofessional typist has to follow the typing action by frequently shifting his gaze between the keyboard placed on a desk and the PC monitor screen, which is normally placed in front of and higher on the desk. In particular with dual-language keyboards, frequent eye shifting, and the frequent non-feedback use of the "Alt+Shift" and "Caps Lock" functions, leads to typing errors. With the advent of wide Internet usage, PC users are required to use a traditional keyboard for more and more complicated input commands, or to remember more "hidden" functions of each key.
[007] US Patent No. 6,611,253, by the same inventor as the present disclosure, describes a method and system for a virtual input environment, and the creation of an input unit with a changeable keys display. However, US 6,611,253 does not teach using hand gestures to control the layout of the virtual keyboard, nor does it teach that the appearance of the virtual hands may depend on the virtual keyboard layout being used. In addition, US 6,611,253 does not evaluate previous commands to predict a current, or future, command or user request.
[008] For some languages, as stated above, for example Russian and Hebrew, hardware keyboards generally have second-language alphabet symbols etched into the keyboard keys together with the English alphabet. In some countries where the spoken languages have large alphabets and/or many characters (in excess of 50 characters, for example), the keyboards generally include only English letters, as all the letters of the other language cannot be displayed on the physical keyboard. This situation often makes it very difficult for users of such languages to perform data entry tasks in those languages.
[009] One example of a country where this problem is exacerbated is India, which is a multilingual country with 22 constitutional languages and 10 different scripts. Eighteen constitutional Indian languages are mentioned as follows, with their scripts within parentheses: Hindi (Devanagari), Konkani (Devanagari), Marathi (Devanagari), Nepali (Devanagari), Sanskrit (Devanagari), Sindhi (Devanagari/Urdu), Kashmiri (Devanagari/Urdu); Assamese (Assamese), Manipuri (Manipuri), Bangla (Bengali), Oriya (Oriya), Gujarati (Gujarati), Punjabi (Gurumukhi), Telugu (Telugu), Kannada (Kannada), Tamil (Tamil), Malayalam (Malayalam) and Urdu (Urdu). Indian scripts typically have 12-15 vowels, 35-40 consonants and a few diacritical marks. Besides this, for each vowel there is a corresponding modifier symbol and for each consonant there is a corresponding pure consonant form (called a half-letter). This makes the total set of symbols required to enter such languages larger than what a normal keyboard can accommodate. In India, in order to provide solutions for the lack of suitable keyboards, different Indian-language word processors are distributed with hardcopy "maps" indicating the Indian letter hiding behind each English key. Approximately 50 hardcopy maps are available among different distributors of Indian word processors; however, hardware manufacturers do not generally supply keyboards with Indian-language layouts. Over 95 percent of the Indian population is thereby generally deprived of the benefits of English-based Information Technology.
SUMMARY
[010] The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods, which are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more of the above-described problems have been reduced or eliminated, while other embodiments are directed to other advantages or improvements.
[011] There is provided, in accordance with some embodiments of the present disclosure, systems and methods of inputting data into a computing system. The system disclosed herein may include a controller adapted to set and display a map of an input key(s), or an entire (or partial) layout of a keyboard, based on a signal associated with an acquired image; and an image acquisition device functionally coupled to the controller and adapted to provide the controller with a signal relating to, or associated with, the acquired image. The system may capture and identify, recognize or interpret one or more gestures, for example gestures generated or performed by a user's hands, fingers, or other body parts, and execute commands in accordance with the gestures. Accordingly, by "identify, recognize or interpret" is meant associating (by the controller) a given gesture, or a combination of gestures, with a specific command. The system may include a monitor, or display screen, to display a virtual keyboard and virtual hands that simulate the position(s) and/or movement(s) of the user's physical hands, optionally in real time.
[012] In some embodiments the meaning of key(s), or key label(s), on the virtual keyboard may be dynamically updated (changed) according to the user's commands. Alternatively, the entire layout of the virtual keyboard, or only part of it, may be dynamically updated (changed) according to the user's commands. For example, the virtual keyboard may be dynamically updated (changed) according to the user's hand location(s) and/or movement(s).
[013] In some embodiments the system may include an evaluation and predictability software application (that is, the controller may be adapted) to evaluate, predict or otherwise determine, based on a user's previous command, the anticipated key(s) subsequently required on the virtual keyboard and/or the anticipated layout subsequently required on the virtual keyboard. In some embodiments hand movements or other gestures may implement mouse-type navigation.
[014] As part of the present disclosure a method is provided for inputting data into a computing system. In some embodiments the method may include acquiring image(s) of parts of a user's body and of a physical keyboard, and setting and displaying a mapping of key(s) based on the acquired image(s). The method may further include processing and interpreting signals relating to, or associated with, the acquired image(s), to enable inputting of selected commands and/or symbols according to the signals. The method may further include using a keyboard identification function to identify keys of a physical keyboard placed in the field of view (FOV) of the image acquisition device; processing the images of at least one user hand to determine the hand's position and/or movement relative to the physical keyboard; and displaying the position(s) and/or movement(s) of at least one hand on a virtual keyboard on a corresponding display screen, for example on a computer display, or on a computer display object.
[015] In some embodiments the method may include dynamically updating key labels on the virtual keyboard in response to the images processed, and/or dynamically updating the entire keyboard layout of the virtual keyboard, or portions of it, in response to the images processed.
[016] In some embodiments the method may include hand movements that are intended to implement mouse-type navigation using a virtual mouse, and/or other body movements that may be interpreted as user input commands and/or data. The term "other body movements" may refer, for example, to hand movements, head movements, eye movements, mouth movements or other types of movements that may indicate user commands and/or data entry.
[017] In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the figures and by study of the following detailed description.
BRIEF DESCRIPTION OF THE FIGURES
[018] Exemplary embodiments are illustrated in the referenced figures. It is intended that the embodiments and figures disclosed herein be considered illustrative, rather than restrictive. The disclosure, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying figures, in which:
[019] Figs. 1A and 1B graphically exemplify a virtual keyboard according to some embodiments of the present disclosure;
[020] Fig. 1C is a graphical example of a keyboard with a limited number of Hindi characters, which may be utilized as a physical and/or virtual keyboard, according to some embodiments;
[021] Fig. 1D shows an exemplary set of graphical views of various finger-based signals or gestures which may be used to indicate commands and/or data for input, according to some embodiments;
[022] Figs. 1E and 1F show examples of mapping an input key based on a signal associated with an acquired image, and examples of maps of an input key so produced, according to some embodiments of the present disclosure;
[023] Fig. 2A shows an exemplary flowchart for operating a data entry system according to some embodiments of the present disclosure;
[024] Fig. 2B schematically illustrates a general layout and functionality of a data entry system according to some embodiments of the present disclosure; and
[025] Fig. 3 schematically illustrates a general layout and functionality of a data entry system according to other embodiments of the present disclosure.
[026] It will be appreciated that for simplicity and clarity of illustration, elements shown in the drawings have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the drawings to indicate corresponding or analogous elements throughout the several views.
DETAILED DESCRIPTION
[027] While a number of exemplary aspects and embodiments have been discussed above, those of skill in the art will recognize certain modifications, permutations, additions and sub-combinations thereof. It is therefore intended that the following appended claims, and claims hereafter introduced, be construed as including all such modifications, permutations, additions and sub-combinations as are within their true spirit and scope.
[028] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "calculating", "determining", or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
[029] The platforms, processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose computing systems and networking equipment may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description below. In addition, embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present disclosure as described herein.
[030] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of various embodiments. However, it will be understood by those skilled in the art that the described embodiments may be practiced without these specific details.
[031] The term "gesture" as used herein may include at least movement(s) and/or signal(s) and/or indication(s) and/or sign(s) and/or instruction(s) and/or request(s), and the like, made by body part(s) of a person operating a keyboard. By "command" is meant herein using a gesture, or a series or combination of gestures, to instruct, request or order a computer to change the meaning or interpretation (assigning, or reassigning, a symbol) of selected keys, or to change the meaning or interpretation of the entire keyboard layout in accordance with the gesture, or the series or combination of gestures. In this respect, the meaning or interpretation of a key (the way a computer interprets the key when depressed) may be, at a given instant, either the symbol physically marked on the key (the initial, or default, symbol), or a different symbol assigned (or reassigned) to the key by the computer in accordance with a specific command.

[032] Embodiments as described herein may facilitate solving human computer interaction (HCI) problems associated with, in particular but not only, languages that have many symbols. Such embodiments may enhance the speed of data inputting, offer a complete solution to language data entry tasks, substantially reduce the number of typing errors and improve the usability of language word processors by presenting a user-friendly data input environment.
[033] Referring now to Fig. 1A, an exemplary data inputting system in accordance with some embodiments of the present disclosure is partially depicted. Data inputting system 100 may include at least one image acquisition, or capturing, device, such as image acquisition device 110, which may be, for example, a digital camera, video camera, PC camera, Webcam, and so on, and which may be located on a computer display monitor 120, for example. Of course, image acquisition device 110 may be located at different locations, provided that position(s), location(s), movement(s) and gesture(s) of a user's hand(s), or other parts of the user's body for that matter, are clearly visible to (that is, appear in the FOV of) image acquisition device 110. Data inputting system 100 may further include a controller (not shown) associated with, or functionally coupled to, image acquisition device 110 and adapted to set a map of a key(s), or a map of the entire keyboard layout, based on a signal that is generated and outputted by image acquisition device 110 to the controller, which signal represents an image(s) in the FOV of image acquisition device 110 relating to, and including, a gesture(s) or movement(s). By "setting a map" is generally meant herein a gesture(s)-dependent assignment of a specific symbol(s) to a specific key (or assignment of a specific set of symbols to respective specific keys). More specifically, the mapping of a key(s) may include changing the symbolic meaning assigned to the key(s) in accordance with movements or gestures made by, or associated with, a user, such as the user whose (real) hands only are shown, at 131 and 132, resting on physical keyboard 130. The controller may be an integral part of, or embedded or incorporated into, or affiliated with, a computer (PC, laptop and the like) that receives input signals from a keyboard such as keyboard 130 and operates a display screen such as display screen 120.

[034] The user (not shown) may move his hands 131 and/or 132 from one position to another, in respect of, or relative to, physical keyboard 130, while signals relating to images of hands 131 and 132, which are acquired by image acquisition device 110, are constantly, or intermittently, forwarded to data inputting system 100 for processing and interpretation. Data inputting system 100 may process and interpret the signals relating to the acquired images to identify gesture(s) and/or movement(s) made by the user with his hand(s) or other body part(s), and execute commands in accordance, or in connection, with the gesture(s) and/or movement(s). Physical keyboard 130 may be a standard keyboard (with symbols marked thereon), a blank keyboard (a keyboard with no markings on the keys), a paper keyboard (a drawing of a keyboard with any number of keys, for example), a touch pad, a keypad, an imaginary keyboard (a flat naked surface such as a table or board), and so on. Data inputting system 100 may also utilize a Word application(s) suitable for processing the language(s) being used (for example English, Hindi and German).
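By way of illustration only, the gesture-dependent "setting a map" described above can be thought of as swapping the symbol table through which key signals are interpreted. The following Python sketch is a minimal, hypothetical rendering of that idea; the disclosure prescribes no particular data structure or programming language, and all identifiers (KeyMap, LAYOUTS, set_map) are invented for this example:

    # Illustrative sketch only: a gesture-dependent key map. All names here
    # (LAYOUTS, KeyMap, set_map) are hypothetical, not taken from the patent.

    class KeyMap:
        """Holds the current assignment of symbols to physical key codes."""

        # Hypothetical layouts: physical key code -> symbol currently assigned.
        LAYOUTS = {
            "english": {"KEY_S": "s", "KEY_K": "k", "KEY_D": "d"},
            "hindi":   {"KEY_S": "\u0938", "KEY_K": "\u0915", "KEY_D": "\u094D"},
        }

        def __init__(self, default="english"):
            self.current = dict(self.LAYOUTS[default])

        def set_map(self, layout_name):
            # "Setting a map": gesture-dependent assignment of a set of
            # symbols to the respective keys.
            self.current = dict(self.LAYOUTS[layout_name])

        def interpret(self, key_code):
            # The same physical key yields different symbols depending on
            # the mapping currently set by the controller.
            return self.current.get(key_code, "")

    keymap = KeyMap()
    print(keymap.interpret("KEY_S"))   # 's' under the default mapping
    keymap.set_map("hindi")
    print(keymap.interpret("KEY_S"))   # the Hindi symbol now assigned to the key

Under this sketch, a recognized gesture simply selects which layout dictionary is consulted when a physical key signal arrives.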
[035] In some embodiments the controller of data inputting system 100 may utilize digital signal processing ("DSP") techniques for processing images captured by image acquisition device 110, and simulation techniques for displaying a corresponding virtual keyboard, such as virtual keyboard 140, on a computer screen such as computer screen 120. In some aspects of these embodiments the number, size and spacing of the keys on virtual keyboard 140 may substantially resemble those of physical keyboard 130, to facilitate the user's orientation. According to other aspects, the number, size or spacing of the keys on virtual keyboard 140 may differ from those of physical keyboard 130. The controller of data inputting system 100 may cause the symbol(s), or meaning, assigned to a key(s), and/or the symbols or meaning assigned to the entire layout of virtual keyboard 140, to change according to a corresponding user's gesture or movement, which may be identified, recognized or interpreted by the controller of data inputting system 100 from the acquired, or captured, image(s).
[036] In some embodiments of the present disclosure the controller of data inputting system 100 may utilize simulation techniques for creating and handling virtual hand(s) and cause the virtual hand(s) to appear and move on the display screen in accordance with the user's (real, physical) hand(s) position, location and movement. For example, virtual hands 121 and 122 are shown in Fig. 1A reflecting the user's hands 131 and 132, respectively. Virtual keyboard 140 and/or virtual hands 121 and/or 122 may be likewise or differently scaled to facilitate ease of data inputting.
[037] In order to input data in a requested language by utilizing data inputting system 100, a user may place his hands, shown as 131 and 132, in the FOV of image acquisition device 110 and make a gesture, or a series or combination of gestures, which is/are associated with the requested language. The gesture, or series or combination of gestures, may then be recognized or interpreted by the controller of data inputting system 100 as being associated with the requested language. Responsive to the recognition or interpretation of the gesture, or series or combination of gestures, the controller of data inputting system 100 may assign symbol(s) constituting the requested language to selected keys on virtual keyboard 140 and display the layout of virtual keyboard 140 with the currently requested assigned symbol(s). It may be said that the controller of data inputting system 100 has set a map of the keyboard layout which corresponds to the requested language. Once the requested language has been set by the controller of data inputting system 100 responsive to the user's command, the user may enter data into data inputting system 100 by observing keys on virtual keyboard 140 and moving his hand (131 or 132), or certain finger(s) thereof, across physical keyboard 130.
[038] The user may move his hand, or certain finger(s) thereof, until the respective virtual hand (121 or 122), or corresponding virtual finger(s) thereof, reaches the vicinity of the next key on virtual keyboard 140 to be depressed, and a finger of virtual hand 121 or 122 overlaps that key. Then, the user may depress the key on physical keyboard 130 underneath the finger corresponding to, or associated with, the virtual finger overlapping the requested key on virtual keyboard 140. The above-described procedure may be repeated as many times as required for inputting additional symbols. Should the user wish to change to, or to set, a different language, the user may pose a gesture, or series or combination of gestures, that corresponds to the different language. Every time the controller of data inputting system 100 is requested to set a different map of keys, that is, to set a different language, the controller may enable the corresponding WORD application/processor. For example, if the controller of data inputting system 100 is requested to change from French to English, then the controller may disable the French WORD application/processor and enable the English WORD application/processor. Physical keyboard 130 is functionally coupled to the controller of data inputting system 100, or to a computer within which the controller of data inputting system 100 resides, for forwarding to the controller signals representative of the default symbols or functions associated with the keys of physical keyboard 130. Nevertheless, the controller of data inputting system 100 is adapted, or configured, to interpret signals forwarded to it from physical keyboard 130 according to the current mapping setting.
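The overlap between a virtual finger and a virtual key, described above, amounts to a hit test. Purely as an illustrative sketch, assuming hypothetical rectangular key geometry (the disclosure does not specify how keys are modeled), such a test might look as follows:

    # Illustrative hit test under an assumed geometry: each virtual key is an
    # axis-aligned rectangle, and a virtual fingertip "overlaps" a key when
    # its (x, y) position falls inside that rectangle.

    from dataclasses import dataclass

    @dataclass
    class VirtualKey:
        label: str
        x: float    # left edge, in virtual-keyboard coordinates
        y: float    # top edge
        w: float    # width
        h: float    # height

        def contains(self, px, py):
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def key_under_fingertip(keys, px, py):
        """Return the virtual key the fingertip overlaps, if any."""
        for key in keys:
            if key.contains(px, py):
                return key
        return None

    keys = [VirtualKey("KA", 0, 0, 40, 40), VirtualKey("HALANT", 40, 0, 40, 40)]
    hit = key_under_fingertip(keys, 52.0, 17.0)
    print(hit.label if hit else "no key")   # -> HALANT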
[039] Data inputting system 100 has several advantages over prior art solutions. For example, a user inputting data to the system does not have to shift gaze, back and forth, between the physical keyboard (keyboard 130, for example) and the screen displaying the resulting typing (display screen 120, for example). Instead, the user may gaze only at the virtual keyboard (virtual keyboard 140, for example), and see virtual hands (virtual hands 121 and 122, for example) positioned and moving in correlation with the position(s) and movement(s) of his (real) hands (hands 131 and 132, for example).
[040] In general, a symbol or function may be assigned to a key depending on the language, mode or function that is requested by a user (by performing a corresponding movement or gesture), so that a given key, when depressed by the user after it has been assigned the new language, mode or function, will be interpreted by the controller of data inputting system 100 in a different way. The controller of data inputting system 100 may change the appearance of virtual layouts (such as virtual layout 140) responsive to commands issued, or posed, by the user. For example, the controller may change the keyboard architecture or arrangement of keys, for example by changing the number, size, spacing and/or placing of keys on the virtual keyboard, depending on a desired application(s). According to some embodiments a layout of a virtual keyboard may be changed according to a real time simulation of the user's hands and the positioning and movement of the user's hands over a physical keyboard, whether the keyboard is real (with actual labels marked on respective keys), blank or a paper keyboard. Another advantage of data inputting system 100 is that the same physical keyboard (for example physical keyboard 130) may be used to enter as many sets of symbols (each belonging to a different language) as the number of available WORD applications/processors.
[041] In some embodiments, by capturing images of the user's hand(s) positions (as described hereinbefore) together with keyboard key entries (depressible buttons), the controller of data inputting system 100 may locate, at any given moment and in real time, the position and/or movement of the user's hands and fingers, and mimic them by displaying virtual fingers in the appropriate position over the keys of virtual keyboard 140. This allows the user to view his/her hand positioning and movements on monitor 120, thereby giving the user confidence in his/her finger placements above any key on physical keyboard 130, regardless of the selected language, at any given moment, before pressing down the key and without having to look down at the physical keyboard (keyboard 130, for example). When a real finger (for example a finger of hand 131) moves across the physical keyboard and depresses a key on, or touches an area of, a physical keyboard such as physical keyboard 130, the controller may cause the virtual hands (hands 121 and 122, for example) to mimic the movement and shape of the finger in respect of the virtual keyboard.
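As a hedged illustration of the real-time mimicry described above, the sketch below maps a fingertip detected in the camera image onto the on-screen virtual keyboard, assuming the physical keyboard's bounding box in the image has already been found (the DSP details are left open by the disclosure); all names and numbers here are illustrative:

    # Minimal coordinate-mapping sketch: a fingertip found at camera
    # coordinates is re-expressed over the on-screen virtual keyboard.

    def camera_to_virtual(finger_xy, kb_box_cam, kb_box_virtual):
        """Map a fingertip from camera space to virtual-keyboard space.

        kb_box_cam / kb_box_virtual: (x, y, width, height) rectangles of the
        physical keyboard in the camera image and of the virtual keyboard on
        the display, respectively.
        """
        fx, fy = finger_xy
        cx, cy, cw, ch = kb_box_cam
        vx, vy, vw, vh = kb_box_virtual
        # Normalize relative to the physical keyboard, then rescale; this is
        # how the virtual hand can "mimic" the real hand in real time.
        u = (fx - cx) / cw
        v = (fy - cy) / ch
        return (vx + u * vw, vy + v * vh)

    # Example: fingertip at (420, 310) in a frame whose keyboard occupies
    # (100, 200, 440, 160); virtual keyboard drawn at (50, 600, 880, 320).
    print(camera_to_virtual((420, 310), (100, 200, 440, 160), (50, 600, 880, 320)))
    # -> (690.0, 820.0)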
[042] In other embodiments data inputting system 100 may enable processing images of one or two hands, as well as other body movements. For example, in some embodiments data inputting system 100 may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry. If required, an additional image acquisition device such as image acquisition device 110 may be used in cases where different parts of a user's body cannot be placed in the FOV of a single image acquisition device. In such cases, each image acquisition device may be spatially located to acquire images associated with a different body part.
[043] Referring now to Fig. 1B, an exemplary virtual keyboard and virtual hands are shown according to some embodiments of the present disclosure. Virtual keyboard 150, which is shown displayed on computer screen 155, can be adjusted, adapted or modified by the controller of the data inputting system to be as large or as small as required or desired. The controller can also change the location of virtual keyboard 150 relative to screen 155. Virtual keyboard 150 is shown displaying a current Indian language keyboard setup, or layout, as may be defined by the language/script set (out of 50 options, for example) being employed, or by a macro defined for a function key or for the configured keyboard. When a physical finger is moved from one key to another on a physical keyboard (such as physical keyboard 130 of Fig. 1A) and the latter key is depressed, a corresponding graphical change may be made in respect of virtual keyboard 150, which results in a movement of the hand(s) (shown as 160 in Fig. 1B) from one location to another on virtual keyboard 150 that mimics, or reflects, the movement of the user's hands to the appropriate (desired) physical key. In some embodiments the controller of the data inputting system may change the appearance of the user's hands 160 according to a direct command issued by the user, or according to the result of such a command. By "according to a direct command issued by the user" is meant (in respect of the latter embodiments) depressing a key on the physical keyboard that functions to change the appearance of the virtual hand(s), whereas by "according to the result of such a command" is meant (in respect of the latter embodiments) issuing a command to the controller to change language (by posing a gesture or movement, as described hereinbefore), whereupon the controller concurrently changes the appearance of the virtual hands according to the language being used. Changing the appearance of the virtual hands may include, for example, making the virtual hand(s) transparent or partially transparent, to allow the user to view substantially the entire area of virtual keyboard 150. Transparent virtual hands 160 are shown in Fig. 1B superimposed on virtual keyboard 150. Text 151 is shown consisting of Hindi symbols, in accordance with the Hindi language currently assigned to virtual keyboard layout 150.
[044] Referring now to Fig. 1C, three examples of virtual hands superimposed on different virtual keyboards are shown according to some embodiments of the present disclosure. A data inputting system, such as data inputting system 100 of Fig. 1A, may allow a user to change the labeling and/or layout of a virtual keyboard such as virtual keyboard 150. If a user wants to write a document using an Indian language (the Hindi language with Devanagari script in the case represented in Fig. 1C), then, by using the "KA"+"HALANT" combination (the "K" + "D" keys within Windows), s/he may shift from a first layout to a second layout (for example, from keyboard layout 170 to keyboard layout 171), and the appearance of certain, or all, of the keys on the virtual keyboard may change accordingly, to reflect the relevant keys and/or layout or map for the language and/or mode and/or script that is to be used (the "HALANT form of KA" in the case represented in Fig. 1C, for example).
[045] In other embodiments a user may change keyboard modes or functions, for example, s/he may change between languages, characters, letters, graphics of the keys on the virtual keyboard and so on, by making suitable gestures with his/her hands/fingers. Additionally, the appearance and/or the extent of transparency of the virtual hand(s) may change according to the actual keyboard being used. For example, virtual hand 172 is shown less transparent than virtual hand 173, as they each relate to a different keyboard layout (to keyboards 170 and 171, respectively). Virtual keyboard 175 is shown having only six selected symbols (collectively designated 174). Virtual keyboard 175 may be shown as semi-transparent.
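The partial transparency of the virtual hands described above is, in essence, alpha compositing. As a toy illustration only (the disclosure mandates no imaging library or opacity value; the 0.4 below is arbitrary), a single pixel of a semi-transparent hand layer might be blended over the virtual keyboard like this:

    # Toy alpha-blend sketch (pure Python, no imaging library): composite a
    # semi-transparent hand pixel over a keyboard pixel so the key remains
    # visible underneath.

    def blend_pixel(hand_rgb, keyboard_rgb, hand_alpha=0.4):
        """Composite one hand pixel over one keyboard pixel."""
        return tuple(
            round(hand_alpha * h + (1.0 - hand_alpha) * k)
            for h, k in zip(hand_rgb, keyboard_rgb)
        )

    # Skin-toned hand pixel over a dark key cap: the key still shows through.
    print(blend_pixel((224, 172, 105), (40, 40, 40)))   # -> (114, 93, 66)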
[046] Referring now to Fig. 1D, it depicts examples of signals, or gestures, that a user may pose to an image acquisition device, such as image acquisition device 110, to command a data input system, such as data inputting system 100, to change languages or modes, enter data, change functions and so on. For example, the user may use an object and/or his/her left hand to create a selected signal/gesture that may be captured by the image acquisition device, causing virtual keyboard keys to be desirably mapped. Fig. 1D depicts ten exemplary hand gestures, each of which is assigned a unique hand gesture number, for example. Each hand gesture number may be associated with a specific command or action to be executed or taken by the controller of the data inputting system. For example, hand gesture number 5, shown at 182, which is associated with hand gesture description 181, may indicate to, command or signal the controller of the data inputting system to change the layout, or mapping, of a virtual keyboard (for example the layout of virtual keyboard 150 of Fig. 1B) from one language to another. Thereafter, the user may use the changed virtual layout to enter (type) characters or symbols of the other language by moving his hand(s)/finger(s) over a physical keyboard, so as to create corresponding virtual hand(s) that move in correlation with the changed virtual keyboard. Accordingly, identifying signals, commands, instructions and the like by a data input system such as data inputting system 100 may be implemented by first identifying or recognizing the hand gesture or signal (by an image acquiring device such as image acquiring device 110 of Fig. 1A) and, then, by mapping the hand gesture to its corresponding hand gesture number and using the hand gesture number as described hereinbefore.
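A hand-gesture-number table of the kind Fig. 1D implies is naturally expressed as a dispatch table. The following Python sketch is hypothetical; which gesture number triggers which command (here, gesture 5 switching the layout) is an assumption made for illustration, not a binding taken from the disclosure:

    # Sketch of a gesture-number dispatch table; the bindings are invented.

    def change_layout(system, layout):
        system["layout"] = layout

    def toggle_hand_transparency(system, _):
        system["transparent_hands"] = not system["transparent_hands"]

    # hand gesture number -> (command, argument)
    GESTURE_COMMANDS = {
        5: (change_layout, "hindi"),           # e.g. gesture 5 switches language
        7: (toggle_hand_transparency, None),
    }

    def on_gesture_recognized(system, gesture_number):
        command = GESTURE_COMMANDS.get(gesture_number)
        if command is None:
            return                             # unrecognized gesture: ignore
        handler, arg = command
        handler(system, arg)

    system = {"layout": "english", "transparent_hands": False}
    on_gesture_recognized(system, 5)
    print(system["layout"])                    # -> hindi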
[047] Data inputting system 100 of Fig. 1A may be instructed by a user (by displaying corresponding gestures or movements to image acquiring device 110) to receive signals manually, automatically or following a selected command, for example, after depressing the "Reset" button on the physical keyboard. Any number and type of hand gestures, signals and/or movements, and/or other suitable signals and/or gestures and/or movements made by body parts and/or objects and so on, may be used as commands to the controller of the data inputting system. For example, left and/or right hand positions and/or movements may be captured, as may facial movements, head movements, finger movements, shoulder movements or other suitable movements which a user may use to indicate a command. In this way, the data inputting system may allow a user to change, in a minimal number of keystrokes or other actions, the layout, mode, functions, and so on, of keys in a virtual keyboard such as virtual keyboard 150 of Fig. 1B.
[048] In one example the user may make a gesture that is associated with a chosen layout, and then subsequently type one or more keys in order to enter the required data. Such a data entry, which may normally require several key entries in order to change layouts, keys and so on, and arrive at the required layout, may thus be done by applying a combination of a gesture and the typing of a selected key. For example, when the user desires to enter Hindi characters, where there are more characters than there are keys on the physical keyboard, the user may enter a command using a gesture or signal, for example, to change the keyboard key labels and/or layout on the virtual keyboard. This change in the virtual keyboard may cause the required characters to be displayed on the virtual keyboard, such that a minimal number of keystrokes is required to enter selected keys. Therefore, only one keystroke may be required to enter any selected character from a set of characters of a language with many distinct characters. Other actions and/or combinations of actions may be implemented as well.
[049] According to some embodiments, the data inputting system may include an evaluation and predictability application for helping the controller of the data inputting system determine the anticipated keys and/or keyboard layout (map) that may be subsequently required or desired by the user. The prediction may be based on the evaluation of the user's previous command(s), for example, commands previously issued by using hand gesture(s), movement(s), mouse movement(s), key entry and so on. For example, if a language with many characters is currently used, the currently used language's Word application may interpret a combination of two or more specific keys to be equivalent to entry of selected characters. The predictability application may, for example after the first of the combination keys is hit, automatically update other relevant keys to complete the possible functions resulting from combinations of the first key with various other keys. For example, if striking "A" and then a variety of other keys enters a selection of commands, then the "A" acts as a function key of sorts. When the user enters "A", the virtual keyboard may be immediately changed to display all the relevant commands or keys that may be entered in combination with "A". In this way, the user does not need to remember or use a physical table to discover key combinations; rather, the relevant combinations may be dynamically updated on the virtual keyboard in real time.
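As a sketch of this evaluation-and-predictability idea, assuming an invented combination table (the disclosure names no concrete key combinations), the virtual key labels could be recomputed the moment the first key of a combination is struck:

    # Hedged sketch: after the first key of a known combination is struck,
    # relabel the virtual keyboard with every symbol reachable by a second
    # keystroke. The combination table is invented for illustration.

    # (first_key, second_key) -> resulting character
    COMBINATIONS = {
        ("A", "1"): "combined-symbol-1",
        ("A", "2"): "combined-symbol-2",
        ("B", "1"): "combined-symbol-3",
    }

    def predicted_labels(first_key):
        """Labels to display on the virtual keyboard after `first_key`."""
        return {
            second: symbol
            for (first, second), symbol in COMBINATIONS.items()
            if first == first_key
        }

    # After the user strikes "A", keys "1" and "2" are relabeled in real time
    # with the character each completion would produce.
    print(predicted_labels("A"))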
[050] Referring now to Fig. 1E, two comparative method tables are exemplified according to some embodiments of the present disclosure. Table 190 depicts some Indian Languages symbols and the respective English letters that are to be inputted to obtain those symbols. According to exemplary table 190, Indian Languages characters may be represented, used or obtained by a conventional method according to which a single English letter or sign, or a combination of two, three, four or five English letters or signs, has to be typed (entered, or keyed in). For example, character 193 is obtainable by entering the letter "s" (194), and character 195 is obtainable by entering the combination of letters "s/t/r" (196). Accordingly, five keystrokes are required to obtain character 195.

[051] Table 191 depicts a way of obtaining the same Indian Languages characters (shown at 197) by entering only one English character (a one-strike implementation) in combination with corresponding hand gestures. For example, character 193 is obtained by entering the character "s" (198), as before (194), and without using any hand gesture, because using one character (194 or 198) is simple enough. However, instead of using five keystrokes (196) to obtain character 195, only one character may be entered (for example the character "s") in combination with a corresponding hand gesture 199 (Hand Gesture 3, in this example).
[052] Referring now to Fig. 1F, several examples of mapping are schematically illustrated and described, which correspond to the Indian Languages characters shown in Fig. 1E. Fig. 1F will be described in association with Fig. 1E. In one example, the initial, or default, English character "S" is symbolically shown (at 184) assigned the Indian Languages character 183, since the character "S" was entered, according to this example, without any hand gesture ("Hand Gesture" equal to "none", at 185 in both figures). In another example, the initial, or default, English character "S" is symbolically shown (at 186) assigned the Indian Languages character 187, since the character "S" was entered, according to this example, together with a hand gesture (at 188 in both figures).
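The comparison of Figs. 1E and 1F reduces, in code terms, to indexing characters by a (key, gesture) pair instead of by a multi-keystroke sequence. The sketch below is illustrative only; the placeholder strings stand in for the actual characters of Fig. 1E, which are not reproduced here:

    # Illustrative one-strike lookup corresponding to Figs. 1E/1F: the same
    # physical "s" key yields different characters depending on the
    # accompanying hand gesture. Placeholder strings stand in for the glyphs.

    ONE_STRIKE_TABLE = {
        ("s", None): "CHARACTER_193",   # plain "s": one keystroke, no gesture
        ("s", 3):    "CHARACTER_195",   # "s" + Hand Gesture 3 replaces "s/t/r"
    }

    def enter_character(key, gesture_number=None):
        return ONE_STRIKE_TABLE.get((key, gesture_number))

    print(enter_character("s"))        # -> CHARACTER_193
    print(enter_character("s", 3))     # -> CHARACTER_195 (was five keystrokes)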
[053] Referring now to Fig. 2A, a series of operations, or processes, is schematically illustrated, which may be implemented to operate the data entry system. At block 200, a user may configure or initialize the data inputting system's software on his/her computer. Once the software is executed and functioning, a calibration screen may be shown, indicating that the system is beginning or has begun operations. At block 205 the user may place a real (physical) keyboard in the view of the camera. At block 210 the data inputting system may employ a keyboard identification function to identify the keyboard position and keys, for example, and may subsequently notify the user that the keyboard has been identified and that the data inputting system is ready to operate.
[054] At block 215 the user may place his/her hand(s) in the field of vision of the image acquiring device. At block 220 the data inputting system may capture and process the images of the hand(s), after which the data inputting system may notify the user that hand(s) identification has been completed. At block 225 the user may operate a word processing application, according to the actual language being used. At block 230 the data inputting system may display a virtual keyboard on the data inputting system's display, with a default set of keys, according to the selected word processing application. The user may type commands into the real keyboard, while looking at his/her virtual hand(s) correlated movements on the virtual keyboard, to enter data in a selected language, as required. The virtual keyboard may depict virtual fingers actively moving to depress selected keys, thereby indicating actual entry of commands.
[055] At block 235, if the user wishes to change keyboard modes, languages, input commands and so on, the user may make a selected signal by using one or two hands (or other body parts). The signal(s) may be selected from a set of pre-configured signals or actions. At block 240 the data inputting system may capture and process the hand signal, and enter the required user command(s) or data. At block 245 the keys and/or layout on the virtual keyboard may be changed in accordance with the user's command(s), for example by entering a function key, a combination of keys, a mouse action or command, a combination of key entry(ies) and mouse action(s) and so on. Any combination of the above steps may be implemented. Further, other steps, or series of steps, may be used instead of, and/or in addition to, the steps specified hereinbefore.
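Restating the Fig. 2A flowchart as a runnable toy loop may clarify the sequencing; every function below is a stub standing in for a flowchart block (the disclosure names the stages, not their algorithms), and the gesture-to-layout binding is invented:

    # Toy restatement of the Fig. 2A sequence; all stages are stubbed.

    def identify_keyboard(frame):          # blocks 205-210
        return "keyboard located in FOV"

    def identify_hands(frame):             # blocks 215-220
        return ["left hand", "right hand"]

    def recognize_gesture(frame):          # block 235
        return frame.get("gesture")        # None when no signal is posed

    def layout_for(gesture):               # blocks 240-245
        return {5: "hindi"}.get(gesture, "english")

    def run(frames):
        keyboard = identify_keyboard(frames[0])
        hands = identify_hands(frames[0])
        layout = "english"                 # blocks 225-230: default key set
        for frame in frames:               # capture loop
            gesture = recognize_gesture(frame)
            if gesture is not None:
                layout = layout_for(gesture)
            print(f"drawing '{layout}' virtual keyboard with "
                  f"{len(hands)} virtual hands over the {keyboard}")

    run([{"gesture": None}, {"gesture": 5}, {"gesture": None}])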
[056] In other embodiments the method may enable processing of one or two hands, as well as other body movements. For example, in some embodiments the method may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry.

[057] Referring now to Fig. 2B, a general layout and functionality of a data inputting system according to some embodiments is schematically illustrated and described. At block 260 image acquiring device 290 may capture the gesture made, or posed, by hand 292 of a user (not shown). At block 265 the captured gesture may be identified, for example, as gesture number 3 of Fig. 1D. At block 270 virtual keyboard 291 may be changed to display layout number 3 (for example), which corresponds to gesture number 3. Accordingly, at block 275 a Word Processor may change to mode 3 of operation (change language). At block 280 the user may depress a key(s) on physical keyboard 293. At block 285 the controller (not shown) of data inputting system 201 may simulate and display the user's finger(s) (shown at 294) hitting a corresponding key(s) on virtual keyboard 291, while it substantially correlates the position(s) and movement(s) of virtual hands 294 to the position(s) and movement(s) of physical hands 292 of the user using data inputting system 201. Other steps, or series of steps, may be used. Computer 286 is functionally coupled to physical keyboard 293, from which it may receive signals that represent depressed keys, and to display screen 295, to which it forwards, among other things, an image of the virtual hands and virtual keyboard.
[058] Referring now to Fig. 3, a mouse-like implementation is shown and described according to some embodiments of the present disclosure. As part of the present disclosure, a virtual hand may simulate, or mimic, mouse-like navigation; a user may enter data into a Computer (304) and/or operate graphical applications by using a virtual mouse. Computer 304 includes the controller (not shown) of data inputting system 306. At block 300, image acquisition device 301 may capture a user's movement or body part, for example a hand (shown at 302), which may move in order to implement mouse-type navigation. At block 305 the direction of movement of the hand, for example in an X-Y plane, is observed and forwarded to Computer 304. Additionally or alternatively, at block 310 the movement or gesture(s) of the user may be captured by image acquisition device 301. At block 315 captured image(s) of gesture(s) may be processed to enter the user's command(s) and/or data. At block 320 the commands and/or data and so on may be entered into Computer 304, where they may be executed accordingly, for example, by navigating on display 303, changing modes and/or functions, entering specific commands and so on.

[059] The computer's screen (display screen 295 of Fig. 2B or 303 of Fig. 3, for example), physical keyboard (physical keyboard 293 of Fig. 2B, for example) and image acquisition device (image acquisition device 290 of Fig. 2B or 301 of Fig. 3, for example) may be any suitable conventional display screen, physical keyboard or image acquisition device. Computer 286 of Fig. 2B or Computer 304 of Fig. 3 may be any suitable conventional computer, provided that it includes, in addition to its normal hardware and software components, the virtual reality enabling hardware and software applications required for analyzing acquired images to determine movements and gestures made, or posed, by a user, and for generating, and generally handling, a virtual keyboard and virtual hand(s).
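The mouse-type navigation of Fig. 3 can be illustrated as integrating hand displacement in the camera's X-Y plane into cursor displacement. The gain factor below is an arbitrary assumption; the disclosure does not specify how hand motion scales to screen motion:

    # Toy sketch of mouse-type navigation: successive hand positions in the
    # camera's X-Y plane are turned into on-screen cursor displacements.

    def cursor_positions(hand_track, start=(400, 300), gain=2.0):
        """Integrate hand movement into on-screen cursor positions."""
        cx, cy = start
        out = [(cx, cy)]
        for (x0, y0), (x1, y1) in zip(hand_track, hand_track[1:]):
            cx += gain * (x1 - x0)      # scale hand motion up to screen motion
            cy += gain * (y1 - y0)
            out.append((cx, cy))
        return out

    # Hand drifting right and slightly down across three captured frames:
    print(cursor_positions([(100, 100), (110, 102), (125, 105)]))
    # -> [(400, 300), (420.0, 304.0), (450.0, 310.0)]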
[060] The foregoing description of various embodiments has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. It should be appreciated by persons skilled in the art that many modifications, variations, substitutions, changes and equivalents are possible in light of the above teaching. It is therefore intended that the appended claims be interpreted to include all modifications, permutations, additions and sub-combinations as are within their true spirit and scope.

CLAIMS

What is claimed is:
1. A data inputting system, comprising: an image acquisition device for generating an output signal associated with an acquired image(s) of parts of a user body and a physical keyboard; and a controller adapted to receive said output signal and, based on acquired image(s), to set and display a map of an input key(s).
2. The system of claim 1, wherein the map of an input key(s) is part of a virtual keyboard.
3. The system of claim 2, wherein part(s) of the user body is a user hand(s) and the controller further displays a virtual hand(s) that simulates said user hand(s) positioned or moving relative to the physical keyboard.
4. The system of claim 1, wherein said controller is further adapted to interpret acquired image(s) relating to a user gesture as a unique command.
5. The system of claim 4, wherein said controller is further adapted to change a key(s) on said virtual keyboard according to a user gesture(s).
6. The system of claim 4, wherein said controller is further adapted to change the layout of the virtual keyboard according to a user gesture(s).
7. The system of claim 3, wherein the controller is further adapted to change the appearance of the virtual hand(s).
8. The system of claim 1, wherein the physical keyboard is selected from the group of keyboards consisting of blank keyboards, paper keyboards, touch pads, key pads, imaginary keyboards and flat naked surfaces.
9. The system of claim 4, wherein said controller is further adapted to determine and display, based on a user's previous command(s), anticipated keys subsequently required on the virtual keyboard.
10. The system of claim 4, wherein said controller is further adapted to determine and display, based on a user's previous command(s), anticipated keyboard layout subsequently required on the virtual keyboard.
11. The system of claim 4, wherein the gesture is performed by a user's hand.
12. The system of claim 4, wherein a gesture is one or more user movements selected from the group consisting of movements made to implement mouse-type navigation, and movements made to implement commands.
13. The system of claim 4, wherein said gesture is performed by one or more parts of a user's body selected from a group including hand movements, head movements, eye movements, mouth movements or other movements to indicate user commands.
14. The system of claim 3, wherein the virtual hand(s) is transparently displayed superimposed on the virtual keyboard.
15. The system of claim 3, wherein the appearance of the virtual hands depends on the virtual keyboard layout being used.
16. A system according to claim 4, wherein the controller is further adapted to predict next key(s) label(s) and next virtual keyboard layout by evaluating prior command(s).
17. A system according to claim 4, wherein the gesture is used in combination with a depressed key in the physical keyboard to set and display a corresponding mapping.
18. A method of inputting data into a computing system, comprising: acquiring image(s) of parts of a user body and a physical keyboard and setting and displaying a mapping of an input key(s) on a virtual keyboard based on the acquired image(s).
19. The method of claim 18, wherein said image(s) relates to a gesture made by the user to be interpreted as a command.
20. The method of claim 19, wherein the gesture relates to movement(s) selected from a group consisting of hand movements, head movements, eye movements and mouth movements.
21. The method of claim 19, wherein key label(s) changes on the virtual keyboard responsive to the gesture.
22. The method of claim 19, wherein the layout of the virtual keyboard changes responsive to the gesture.
23. The method of claim 18, further comprising acquiring the user hand(s) and displaying a virtual hand(s) that simulates said user hand(s) positioned or moving relative to the physical keyboard.
24. The method of claim 23, wherein a character is entered into the computing system using a single keystroke of the physical keyboard.
25. The method of claim 21, wherein hand movement(s) simulate a virtual mouse for implementing mouse-type navigation.
26. The method of claim 23, wherein the virtual hand(s) is transparently displayed superimposed on the virtual keyboard.
27. The method of claim 23, wherein the appearance of the virtual hands depends on the virtual keyboard layout being used.
28. The method of claim 19, further comprising evaluating prior command(s) to predict next key(s) label(s) and next virtual keyboard layout.
29. The method according to claim 19, wherein the gesture is used in combination with a depressed key in the physical keyboard to set and display a corresponding mapping.
30. A method of inputting data into a computing system, comprising: displaying a virtual hand(s) in correlation with a real user hand(s) and superimposed on a virtual keyboard representative of a physical keyboard; and setting a map of a key(s) on the virtual keyboard responsive to a gesture being made by a user's hand.
31. The method of claim 30, wherein the layout of the virtual keyboard changes responsive to a gesture made by a real user's hand.
32. The method of claim 30, wherein the virtual hand(s) is transparently displayed superimposed on the virtual keyboard.
33. The method of claim 30, wherein the appearance of the virtual hand(s) depends on the virtual keyboard layout being used.
34. The method of claim 30, wherein the gesture is used in combination with a depressed key on the physical keyboard.
PCT/IL2007/000174 2006-02-16 2007-02-08 A system and method of inputting data into a computing system WO2007093984A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP07706117A EP1999547A4 (en) 2006-02-16 2007-02-08 A system and method of inputting data into a computing system
JP2008554905A JP2009527041A (en) 2006-02-16 2007-02-08 System and method for entering data into a computing system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN421DE2006 2006-02-16
IN421/DEL/2006 2006-02-16

Publications (2)

Publication Number Publication Date
WO2007093984A2 true WO2007093984A2 (en) 2007-08-23
WO2007093984A3 WO2007093984A3 (en) 2009-04-23

Family

ID=38371891

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2007/000174 WO2007093984A2 (en) 2006-02-16 2007-02-08 A system and method of inputting data into a computing system

Country Status (5)

Country Link
EP (1) EP1999547A4 (en)
JP (1) JP2009527041A (en)
KR (1) KR20080106265A (en)
CN (1) CN101589425A (en)
WO (1) WO2007093984A2 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
CN102214009A (en) * 2010-04-08 2011-10-12 深圳市闪联信息技术有限公司 Method and system for implementing keyboard input
CN102289283A (en) * 2010-06-16 2011-12-21 微软公司 Status change of adaptive device
US20130187893A1 (en) * 2010-10-05 2013-07-25 Hewlett-Packard Development Company Entering a command
KR101772979B1 (en) * 2011-04-06 2017-08-31 엘지전자 주식회사 Mobile terminal and control method thereof
US8850349B2 (en) * 2012-04-06 2014-09-30 Google Inc. Smart user-customized graphical keyboard
US10664657B2 (en) 2012-12-27 2020-05-26 Touchtype Limited System and method for inputting images or labels into electronic devices
GB201223450D0 (en) 2012-12-27 2013-02-13 Touchtype Ltd Search and corresponding method
GB201322037D0 (en) * 2013-12-12 2014-01-29 Touchtype Ltd System and method for inputting images/labels into electronic devices
KR102040288B1 (en) * 2013-02-27 2019-11-04 삼성전자주식회사 Display apparatus
KR101489069B1 (en) * 2013-05-30 2015-02-04 허윤 Method for inputting data based on motion and apparatus for using the same
KR102166330B1 (en) 2013-08-23 2020-10-15 삼성메디슨 주식회사 Method and apparatus for providing user interface of medical diagnostic apparatus
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
CN104978016A (en) * 2014-04-14 2015-10-14 宏碁股份有限公司 Electronic device with virtual input function
CN105224069B (en) * 2014-07-03 2019-03-19 王登高 A kind of augmented reality dummy keyboard input method and the device using this method
CN104199550B (en) * 2014-08-29 2017-05-17 福州瑞芯微电子股份有限公司 Virtual keyboard operation device, system and method
JP2016177658A (en) * 2015-03-20 2016-10-06 カシオ計算機株式会社 Virtual input device, input method, and program
KR102447858B1 (en) * 2015-04-07 2022-09-28 인텔 코포레이션 avatar keyboard
CN106488160A (en) * 2015-08-24 2017-03-08 中兴通讯股份有限公司 A kind of method for displaying projection, device and electronic equipment
CN110007774B (en) * 2019-03-27 2022-01-14 联想(北京)有限公司 Keyboard device and electronic equipment
CN110414225B (en) * 2019-07-24 2023-05-26 广东魅视科技股份有限公司 System and method for preventing HID keyboard attack
CN112684901A (en) * 2019-10-18 2021-04-20 王光达 Screen key position identification display method and single-hand chord mobile keyboard thereof
CN114167997B (en) * 2022-02-15 2022-05-17 北京所思信息科技有限责任公司 Model display method, device, equipment and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6433774B1 (en) * 1998-12-04 2002-08-13 Intel Corporation Virtualization of interactive computer input
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
WO2001093182A1 (en) * 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
US7042442B1 (en) * 2000-06-27 2006-05-09 International Business Machines Corporation Virtual invisible keyboard
KR20030072591A (en) * 2001-01-08 2003-09-15 브이케이비 인코포레이티드 A data input device
JP4099117B2 (en) * 2003-07-22 2008-06-11 シャープ株式会社 Virtual keyboard system
IL161002A0 (en) * 2004-03-22 2004-08-31 Itay Katz Virtual video keyboard system
US20070063979A1 (en) * 2005-09-19 2007-03-22 Available For Licensing Systems and methods to provide input/output for a portable data processing device
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP1999547A4 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103442201A (en) * 2007-09-24 2013-12-11 高通股份有限公司 Enhanced interface for voice and video communications
CN101874404B (en) * 2007-09-24 2013-09-18 高通股份有限公司 Enhanced interface for voice and video communications
CN103442201B (en) * 2007-09-24 2018-01-02 高通股份有限公司 Enhancing interface for voice and video communication
WO2009047464A3 (en) * 2007-09-27 2009-06-11 Airbus System and method for accessing a personal computer device onboard an aircraft and aircraft equipped with such system
US8321611B2 (en) 2007-09-27 2012-11-27 Airbus System and method for accessing a personal computer device onboard an aircraft and aircraft equipped with such system
FR2921634A1 (en) * 2007-09-27 2009-04-03 Airbus Sas SYSTEM AND METHOD FOR ACCESSING PERSONAL COMPUTER EQUIPMENT ON BOARD AN AIRCRAFT, AND AIRCRAFT COMPRISING SUCH A SYSTEM.
WO2009047464A2 (en) * 2007-09-27 2009-04-16 Airbus System and method for accessing a personal computer device onboard an aircraft and aircraft equipped with such system
US20090146957A1 (en) * 2007-12-10 2009-06-11 Samsung Electronics Co., Ltd. Apparatus and method for providing adaptive on-screen keyboard
US8345008B2 (en) * 2007-12-10 2013-01-01 Samsung Electronics Co., Ltd. Apparatus and method for providing adaptive on-screen keyboard
CN102405453A (en) * 2009-04-20 2012-04-04 微软公司 Context-based state change for an adaptive input device
CN102405453B (en) * 2009-04-20 2013-11-06 微软公司 Context-based state change for an adaptive input device
GB2470653B (en) * 2009-05-26 2015-04-29 Zienon L L C Enabling data entry based on differentiated input objects
EP2480951A1 (en) * 2009-09-21 2012-08-01 Extreme Reality Ltd. Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
EP2480951A4 (en) * 2009-09-21 2014-04-30 Extreme Reality Ltd Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US9442654B2 (en) 2010-01-06 2016-09-13 Apple Inc. Apparatus and method for conditionally enabling or disabling soft buttons
US9128614B2 (en) 2010-11-05 2015-09-08 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
US9146673B2 (en) 2010-11-05 2015-09-29 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
US9141285B2 (en) 2010-11-05 2015-09-22 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
US9436381B2 (en) 2011-01-24 2016-09-06 Apple Inc. Device, method, and graphical user interface for navigating and annotating an electronic document
US9092132B2 (en) 2011-01-24 2015-07-28 Apple Inc. Device, method, and graphical user interface with a dynamic gesture disambiguation threshold
US10365819B2 (en) 2011-01-24 2019-07-30 Apple Inc. Device, method, and graphical user interface for displaying a character input user interface
US10042549B2 (en) 2011-01-24 2018-08-07 Apple Inc. Device, method, and graphical user interface with a dynamic gesture disambiguation threshold
WO2012144666A1 (en) * 2011-04-19 2012-10-26 Lg Electronics Inc. Display device and control method therof
US9746928B2 (en) 2011-04-19 2017-08-29 Lg Electronics Inc. Display device and control method thereof
US9448714B2 (en) 2011-09-27 2016-09-20 Elo Touch Solutions, Inc. Touch and non touch based interaction of a user with a device
EP2575006A1 (en) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Touch and non touch based interaction of a user with a device
EP2575007A1 (en) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Scaling of gesture based input
WO2013101206A1 (en) * 2011-12-30 2013-07-04 Intel Corporation Interactive drawing recognition
US9430035B2 (en) 2011-12-30 2016-08-30 Intel Corporation Interactive drawing recognition
WO2014200874A1 (en) * 2013-06-13 2014-12-18 Microsoft Corporation Generation of text by way of a touchless interface
WO2020111626A1 (en) * 2018-11-28 2020-06-04 Samsung Electronics Co., Ltd. Electronic device and key input method therefor
US11188227B2 (en) 2018-11-28 2021-11-30 Samsung Electronics Co., Ltd Electronic device and key input method therefor
US11617953B2 (en) 2020-10-09 2023-04-04 Contact Control Interfaces, Llc. Virtual object interaction scripts

Also Published As

Publication number Publication date
KR20080106265A (en) 2008-12-04
JP2009527041A (en) 2009-07-23
CN101589425A (en) 2009-11-25
WO2007093984A3 (en) 2009-04-23
EP1999547A4 (en) 2011-10-12
EP1999547A2 (en) 2008-12-10

Similar Documents

Publication Publication Date Title
EP1999547A2 (en) A system and method of inputting data into a computing system
US5157384A (en) Advanced user interface
US6600480B2 (en) Virtual reality keyboard system and method
US9891820B2 (en) Method for controlling a virtual keyboard from a touchpad of a computerized device
US6388657B1 (en) Virtual reality keyboard system and method
EP1383034B1 (en) Touch-type key input apparatus
US9529523B2 (en) Method using a finger above a touchpad for controlling a computerized system
EP0769175B9 (en) Multiple pen stroke character set and handwriting recognition system
US20160364138A1 (en) Front touchscreen and back touchpad operated user interface employing semi-persistent button groups
US20060119588A1 (en) Apparatus and method of processing information input using a touchpad
US20020190946A1 (en) Pointing method
US9542032B2 (en) Method using a predicted finger location above a touchpad for controlling a computerized system
WO2004010276A1 (en) Information display input device and information display input method, and information processing device
AU2005203634A1 (en) Integrated keypad system
JP2013515295A (en) Data input system and method
JP2006524955A (en) Unambiguous text input method for touch screen and reduced keyboard
JPH05508500A (en) User interface with pseudo devices
WO2007121673A1 (en) Method and device for improving inputting speed of characters
JP2009110092A (en) Input processor
JP2007510999A (en) Character conversion of data input panel
KR20050048758A (en) Inputting method and appartus of character using virtual button on touch screen or touch pad
JP2003196007A (en) Character input device
Hirche et al. Adaptive interface for text input on large-scale interactive surfaces
CN101551701A (en) Multidimensional control method and device, optimal or relatively favorable display input method and device
WO2015013662A1 (en) Method for controlling a virtual keyboard from a touchpad of a computerized device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780013631.5

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2008554905

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 1020087022648

Country of ref document: KR

Ref document number: 2007706117

Country of ref document: EP