WO2015016894A1 - Gesture detection - Google Patents

Gesture detection

Info

Publication number
WO2015016894A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
sensor
keyboard
input device
computer
Prior art date
Application number
PCT/US2013/052972
Other languages
French (fr)
Inventor
Inderjit Singh Chohan
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2013/052972
Publication of WO2015016894A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means

Abstract

An example method for interpreting a gesture in accordance with aspects of the present disclosure includes detecting, by a sensor, a gesture performed over an input device housing the sensor, identifying an action associated with the gesture, and performing the identified action. The gesture includes a 3D motion.

Description

GESTURE DETECTION
BACKGROUND
[0001] Electronic devices, including personal computers and mobile phones, may have keyboards for receiving input from a user. Keyboards are typically used to type text and numbers into a word processor, text editor or other programs. Keyboards may also be used for computer gaming, either with regular keyboards or by using keyboards with special gaming features, which can expedite frequently used keystroke combinations.
[0002] Keyboards have characters engraved or printed on the keys, and the user enters information into their computers by pressing the keys on a keyboard. Each press of a key corresponds to a single written symbol and can produce actions or computer commands.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] Example implementations are described in the following detailed description and in reference to the drawings, in which:
[0004] Fig. 1 illustrates an example system in accordance with an implementation;
[0005] Fig. 2 illustrates example components of an example system in accordance with an implementation; and
[0006] Fig. 3 illustrates an example process flow diagram in accordance with an implementation.
DETAILED DESCRIPTION
[0007] Various implementations described herein are directed to an interface device that incorporates sensors. More specifically, and as described in greater detail below, various aspects of the present disclosure are directed to a manner by which a sensor in a keyboard detects gestures performed above the keyboard.
[0008] Aspects of the present disclosure described herein perform actions associated with gestures in response to detecting the gestures over a keyboard of an electronic device such as a computer. According to various aspects of the present disclosure, the approach described herein allows a user to communicate with the computer without touching the keyboard of the computer. Moreover, aspects of the present disclosure described herein detect and interpret 3D motions of an object (e.g., a stylus) or user (e.g., the user's hands or fingers). Among other things, this approach may allow communication through gestures including 3D motions, increase performance speed (e.g., typing speed, browsing speed), and lead to a more flexible and enjoyable experience for the user.
[0009] In one example in accordance with the present disclosure, a method for interpreting a gesture is provided. The method comprises detecting, by a sensor, a gesture performed over an input device housing the sensor, identifying an action associated with the gesture, and performing the identified action. The gesture comprises a 3D motion.
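For illustration only, the three steps of this method (detect, identify, perform) can be summarized in a short routine. The Python sketch below is not part of the disclosure; the sensor call read_gesture() and the action table are hypothetical names chosen for the example.

```python
# Minimal sketch of the claimed flow: detect a gesture, identify the
# associated action, then perform it. All names are illustrative.
from typing import Callable, Dict, Optional


def interpret_gesture(sensor, actions: Dict[str, Callable[[], None]]) -> Optional[str]:
    """Detect a 3D gesture over the input device and perform its associated action."""
    gesture_id = sensor.read_gesture()   # detecting, by the sensor (hypothetical call)
    action = actions.get(gesture_id)     # identifying the action associated with the gesture
    if action is not None:
        action()                         # performing the identified action
    return gesture_id
```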
[00010] In another example in accordance with the present disclosure, a system is provided. The system comprises a sensor to detect a gesture in an area on the input device, and a controller, communicatively coupled to the sensor, to provide input device data to an output device. The input device data is generated in response to the gesture and indicates an action associated with the gesture. The output device performs an action in response to receiving the input device data.
[00011] In a further example in accordance with the present disclosure, a non-transitory computer-readable medium is provided. The non-transitory computer-readable medium comprises instructions that when executed cause a device to (i) detect, by a sensor, a gesture performed over an input device housing the sensor, and (ii) identify an action associated with the gesture. The gesture comprises a 3D motion.
[00012] Fig. 1 illustrates an example system 100 in accordance with an implementation. The system 100 comprises a computer 110 with a user interface 120 and a keyboard 130, each of which is described in greater detail below. It should be readily apparent that the system 100 depicted in Fig. 1 represents a generalized illustration and that other components may be added or existing components may be removed, modified, or rearranged without departing from a scope of the present disclosure. For example, while the system 100 illustrated in Fig. 1 includes only one user interface 120, the system may actually comprise a plurality of user interfaces, and only one has been shown and described for simplicity.
[00013] The computer 110 may be a user device. It should be noted that the computer 110 is intended to be representative of a broad category of data processors. The computer 110 may include a processor and memory and help translate input received by the keyboard 130. In one implementation, the computer 110 may include any type of processor, memory or display. Additionally, the elements of the computer 110 may communicate via a bus, network or other wired or wireless interconnection.
[00014] As non-limiting examples, the computer 110 may be configured as any type of personal computer, portable computer, workstation, personal digital assistant, video game player, retail point of sale device, communication device (including wireless phones and messaging devices), media device, including recorders and players (including televisions, cable boxes, music players, and video players) or other device capable of accepting input from a user and of processing information. Alternatively or in addition, the computer 110 may be a data input or output device, such as a remote control or display device, that may communicate with a computer or media system (e.g., remote control for television) using a suitable wired or wireless technique.
[00015] In some implementations, a user 150 may interact with the system 100 by controlling the keyboard 130, which may be an input device for the computer 110. The user may perform various gestures on or above the keyboard 130. Such gestures may involve, but are not limited to, touching, pressing, waving, or placing an object in proximity. Other gestures may include pointing, reaching, grabbing, picking an object up, and/or putting an object down. In another implementation, the system 100 may have a series of movements that are defined to the computer 110 as various commands or acts. Accordingly, when the user 150 performs one of the movements, the computer 110 performs the act associated with that movement.
[00016] In one implementation, the user interface 120 may be a display of the computer 110. The user interface 120 may refer to the graphical, textual and/or auditory information a computer program may present to the user 150, and the control sequences (such as keystrokes with the computer keyboard) the user 150 may employ to control the program. In one example system, the user interface 120 may present various pages that represent applications available to the user 150. The user interface 120 may facilitate interactions between the user 150 and computer systems by inviting and responding to user input and translating tasks and results to a language or image that the user 150 can understand. In another implementation, the computer 110 may receive input from a plurality of input devices, such as a keyboard, mouse, touch device or verbal command.
[00017] The keyboard 130 may be arranged as a QWERTY keyboard and may require activation (e.g., pressing) of an individual key to produce a data character or function, or the simultaneous activation of two or more keys to produce a secondary data character or function. The keyboard 130 may have keys which can be moved downward by pushing down on the touch surface of each key in order to indicate that the user intends to enter the character represented by the key. In one implementation, the keyboard 130 may be a component of an interface device allowing a user to interact with a computer, a telephone system, or other electronic information system. An interface device may include some form or forms of output interface, such as a display screen or audio signals, and some form or forms of input interface, such as buttons to push, a keyboard (e.g., the keyboard 130), a voice receiver, or a handwriting tablet.
[00018] The keyboard 130 may include at least one motion sensor 160, which may be a 3D gesture sensor. The 3D gesture sensor 160 may be capable of detecting the interaction of the user 150 with the computer 110 through any type of gesture over one of the keys (e.g., the spacebar key) on the keyboard. The sensor 160 may detect any gestures by a stylus, fingers of a user, and/or other input object on or close to the keyboard 130. The sensor 160 may track the input object down to a very small distance (e.g., 1/100th of a millimeter). In response to the detection, the computer 110 may implement an action associated with the detected gesture. Such actions may include controlling an object shown on a display, opening or closing an application, or enlarging or minimizing an object shown on the display.
[00019] The keyboard 130 may comprise at least one sensitive key, such as the spacebar key. The sensitive spacebar key includes one or more sensors (e.g., the sensor 160) that detect motion over an observed area (e.g., over the keyboard) using motion- and presence-sensing technologies such as optical or infrared sensing. In one implementation, the sensor may be a global shutter image sensor implemented in the spacebar key. The image sensor 160 with a global shutter may be used for reducing motion blur artifacts in a captured image. In another implementation, the keyboard 130 may comprise at least one elevated key, which may include a sensor and may be placed above, for example, one of the function keys.
[00020] In another implementation, the keyboard 130 may include a sensor array, which may act as a group of sensors deployed in a certain geometric pattern. The sensor array may be configured to fit within the area of the surface of a key on the keyboard 130. In a further implementation, a sensor array could be housed in a separate part, rather than being mounted in a keyboard key. In some implementations, the sensor may be located on an elevated key on the keyboard. For example, the elevated key may be positioned right above a gap between two keys (e.g., the g key and the h key). In other implementations, the elevated key may be placed above the function keys on the keyboard. For example, the elevated key may be above the gap between the f9 and f10 keys.
[00021] In a further implementation, the keyboard 130 may include control circuitry to convert gesture detection into key codes (e.g., scan codes) and send the codes down a serial cable (e.g., a keyboard cord) to the computer 110. In another implementation, the keyboard 130 may process the data collected by the sensor, and output the processed data in a form that may represent motion.
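By way of illustration only, such control circuitry might map each recognized gesture to a key code and write it to a serial link, as in the Python sketch below. The gesture names, scan-code values, and port name are assumptions chosen for the example (the sketch relies on the pyserial package); none of them are taken from the disclosure.

```python
# Hypothetical mapping of recognized gestures to keyboard scan codes,
# sent down a serial cable as described for the control circuitry.
import serial  # pyserial; an assumption, not part of the disclosure

GESTURE_TO_SCANCODE = {
    "swipe_down": 0x51,  # placeholder code, e.g. treated as "arrow down"
    "swipe_up": 0x52,    # placeholder code, e.g. treated as "arrow up"
}


def send_gesture_as_scancode(gesture: str, port: str = "/dev/ttyS0") -> bool:
    """Convert a recognized gesture to a key code and send it to the computer."""
    code = GESTURE_TO_SCANCODE.get(gesture)
    if code is None:
        return False
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(bytes([code]))  # send the code down the keyboard cord
    return True
```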
[00022] Depending on the implementation, security features/tools may be implemented in various ways such as by a firewall, one time passwords, encryption programs, digital certificates, user application security, etc. Various combinations of these and/or other security features may be used. In one implementation, these security approaches may be layered to provide a highly secure environment in which a user can interact with the user interface 120 and/or the digitally displayed keyboard 140. For example, the security features may require a user to log in before activating the sensors in the keyboard 130 or interpreting the gestures that may be detected by the sensors in the keyboard 130. In other implementations, the security features may require the user to log in in order to determine whether the user has permission to set or edit the settings associated with the gesture interpretation.
[00023] Fig. 2 illustrates example components of the system 100 in accordance with an implementation. It should be readily apparent that the computer 110 illustrated in Fig. 2 represents a generalized depiction and that other components may be added or existing components may be removed, modified, or rearranged without departing from a scope of the present disclosure. The computer 110 comprises a processor 210, a computer readable medium 220, and a keyboard controller 250, each of which is described in greater detail below. The components of the computer 110 may be connected via buses. For example, the processor 210 and the computer readable medium 220 may be connected via a bus 230a, and the processor 210 and the keyboard controller 250 may be connected via a bus 230b. The computer readable medium 220 may comprise various databases containing, for example, user profile data 221. The keyboard 130 may comprise the sensor device 240. Moreover, the keyboard 130 may have a processor, a switch and other components that may be added, removed, modified, or rearranged without departing from a scope of the present disclosure. It should be readily apparent that while the keyboard 130 illustrated in Fig. 2 includes only one sensor device 240, the system may actually comprise a plurality of sensor devices, and only one has been shown and described for simplicity.
[00024] The processor 210 may comprise at least one high-speed data processor adequate to execute program components for executing user and/or system-generated requests. The processor 210 may retrieve and execute instructions stored in the computer readable medium 220. The processor 210 may be, for example, a central processing unit (CPU), a semiconductor-based microprocessor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) configured to retrieve and execute instructions, other electronic circuitry suitable for the retrieval and execution of instructions stored on a computer readable storage medium, or a combination thereof. The processor 210 may fetch, decode, and execute instructions stored on the storage medium 220 to operate the computer 110 in accordance with the above-described examples. The computer readable medium 220 may be a non-transitory computer-readable medium that stores machine readable instructions, codes, data, and/or other information.
[00025] In certain implementations, the computer readable medium 220 may be integrated with the processor 210, while in other implementations, the computer readable medium 220 and the processor 210 may be discrete units.
[00026] Further, the computer readable medium 220 may participate in providing instructions to the processor 210 for execution. The computer readable medium 220 may be one or more of a non-volatile memory, a volatile memory, and/or one or more storage devices. Examples of non-volatile memory include, but are not limited to, electronically erasable programmable read only memory (EEPROM) and read only memory (ROM). Examples of volatile memory include, but are not limited to, static random access memory (SRAM) and dynamic random access memory (DRAM). Examples of storage devices include, but are not limited to, hard disk drives, compact disc drives, digital versatile disc drives, optical devices, and flash memory devices.
[00027] In one implementation, the computer readable medium 220 may have a user profile database. The user database may store user profile data 221 such as user authentication data, user interface data, and profile management data and/or the like. In one implementation, user authentication data may comprise (i) group membership information (e.g., finance, management), (ii) authorization information (e.g., unauthorized, authorized, forbid/blocked, guest, or quarantined), and/or (iii) security keys (e.g., 1a2b3c4d).
[00028] In one implementation, the computer readable storage medium (media) may have instructions stored thereon/in which can be used to program the computer to perform any of the processes of the embodiments described herein. The detection instructions 222 can cause the processor 210 to detect gestures that may be performed in an observed area over the keyboard 130. For example, the observed area may comprise at least a rectangular area 6 inches wide and 4 inches long above the space bar. The gestures may include any bodily motion or state. Such bodily motion or state may be performed by any part of the body of the user, or it can be performed by using an object or tool. Further, the detection instructions can cause the processor to collect data related to the position of the object (e.g., a user's hand or fingers, a stylus, a remote control) performing the gesture. For example, the position may include the starting point and end point of the gesture movement.
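As a rough sketch of that detection step, the collected position data could be reduced to the start and end points of the movement within the observed area, as below. The sample format and the modelling of the observed area as a simple 6 by 4 inch rectangle are assumptions made for illustration.

```python
# Sketch: reduce a stream of (x, y, z) position samples to a gesture's
# start and end points within the observed area above the keyboard.
from typing import List, Optional, Tuple

Point = Tuple[float, float, float]  # x, y in inches over the keyboard; z is height

OBSERVED_WIDTH = 6.0  # inches, per the example observed area
OBSERVED_DEPTH = 4.0  # inches


def in_observed_area(p: Point) -> bool:
    x, y, _ = p
    return 0.0 <= x <= OBSERVED_WIDTH and 0.0 <= y <= OBSERVED_DEPTH


def gesture_endpoints(samples: List[Point]) -> Optional[Tuple[Point, Point]]:
    """Return the (start, end) points of the movement, or None if nothing was observed."""
    inside = [p for p in samples if in_observed_area(p)]
    if not inside:
        return None
    return inside[0], inside[-1]
```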
[00029] The gesture pattern matching instructions 224 can cause the processor 210 to process the gestures captured by the sensor device 240 from the keyboard 130. In one implementation, this process may involve interpreting the 3D motion. In other implementations, this process may include the identification and recognition of posture, gait, proxemics, and human behaviors. The gesture pattern matching instructions 224 may receive the location of the gesture including its starting point and endpoint. The gesture pattern matching module may be in communication with the memory, and can use the gesture data to determine if the data matches one or more gesture recognition templates. For example, the gesture data can be compared against each gesture recognition template to determine if a recognized gesture has occurred, and can determine the actuated key or action corresponding to the recognized gesture. Further, the action instructions 226 can cause the processor 210 to respond to the detection of the gesture based on the identified action corresponding to the recognized gesture.
[00030] The keyboard controller 250 may serve as an interface between the keyboard 130 and the computer 110. In one implementation, the keyboard controller 250 may receive data from the keyboard 130, process the data, and pass the processed data to the processor 210. For example, the data may be related to the 3D motion of the user 150's hands and fingers (e.g., pointing, waving, reaching, grabbing, and picking an object up and/or putting it down). In one implementation, and as shown in Fig. 2, the keyboard controller 250 may be implemented in the computer 110. In another implementation, the keyboard controller 250 may be implemented in the keyboard 130. Furthermore, in some implementations, the keyboard controller 250 may be implemented using embedded controllers that may run specialized keyboard controller software.
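The template comparison described in paragraph [00029] could be sketched as follows, with a gesture represented by its starting point and endpoint and each template carrying an associated action and a tolerance. The template structure, tolerance value, and action names are illustrative assumptions rather than details of the disclosure.

```python
# Sketch: match a gesture (start/end points) against stored gesture
# recognition templates and return the corresponding action, if any.
import math
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Point = Tuple[float, float, float]


@dataclass
class GestureTemplate:
    name: str
    start: Point
    end: Point
    action: str             # e.g. "scroll_down" (illustrative)
    tolerance: float = 1.0  # inches; illustrative value


def match_gesture(start: Point, end: Point,
                  templates: Sequence[GestureTemplate]) -> Optional[str]:
    """Return the action of the first template within tolerance, else None."""
    for t in templates:
        if (math.dist(start, t.start) <= t.tolerance
                and math.dist(end, t.end) <= t.tolerance):
            return t.action
    return None  # no match; the caller may report the gesture as not recognized
```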
[00031] The keyboard controller 250 may comprise all or part of one or more integrated circuits, firmware code, and/or software code that may receive electrical signals from the sensor device 240 and communicate with the computer 110. In one implementation, the keyboard controller 250 may be located with or near the sensor device 240. In other embodiments, some elements of the keyboard controller 250 may be with the sensor device 240 and other elements of the keyboard controller 250 may reside in the computer 110.
[00032] The sensor device 240 may be coupled to the computer 110 through the keyboard controller 250 via a bus 260 of the computer 110. In another implementation, the sensor device 240 can be connected to the computer 110 through any type of interface or connection, including I2C, SPI, PS/2, Universal Serial Bus (USB), Bluetooth, RF, IrDA, keyboard scan lines or any other type of wired or wireless connection, to list several non-limiting examples.
[00033] The sensor device 240 may use a variety of techniques for detecting a gesture. In one implementation, the sensor device 240 may include electrodes or other structures that are adapted to detect presence, and the sensor device 240 may include motion sensors. The motion sensors may be based on technologies such as, but not limited to, infrared, surface-acoustic wave, resistive, or optical sensors. In one implementation, the sensor device 240 may be used to detect a stylus, finger or other input object(s). The sensor device 240 may output keyboard information responsive to the detected gesture and object presence. For example, the sensor device 240 may output object positional information in response to the detected presence. Accordingly, the sensor device 240 may be sensitive to the position of one or more input objects, such as a stylus, finger and/or other input object placed above the keys of the keyboard 130. Further, the sensor device 240 may provide indicia (including keyboard information) of the detected object to the computer 110. The computer 110 may process the indicia to generate the appropriate response (e.g., scroll down, insert text, move images).
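For illustration, the indicia passed from the sensor device to the computer might be bundled as a small record and dispatched to a response, as sketched below. The field names and the response table are assumptions made for the example and do not come from the disclosure.

```python
# Sketch: indicia reported by the sensor device and a simple dispatch of the
# recognized action to a response on the computer side.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class Indicia:
    position: Tuple[float, float, float]  # object position above the keys
    action: str                           # action associated with the detected gesture


def handle_indicia(indicia: Indicia) -> str:
    """Map the indicia to an appropriate response (names are illustrative)."""
    responses = {
        "scroll_down": "scroll the page down",
        "insert_text": "insert text at the cursor",
        "move_image": "move the selected image",
    }
    return responses.get(indicia.action, "Not Recognized")
```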
[00034] In one implementation, the sensor device 240 may provide input to the processor 210. The sensor device 240 may notify the processor 210 when a gesture is detected above a surface (e.g., a key). As discussed in more detail above, the processor 210 may be in data communication with the computer readable medium 220, which may include a combination of temporary and/or permanent storage. The computer readable medium 220 may include program memory that includes all programs and software such as an operating system, user detection software component, and any other application software programs. The computer readable medium 220 may also include data memory that may include system settings, a record of user options and preferences, and any other data required by any element of the computer 110.
[00035] As discussed above in reference to Fig. 1, the distance between the object and the surface of the key may be on the order of less than a millimeter, millimeters, centimeters, or more, and may vary significantly with the type of sensing technology used and the accuracy desired. The sensors may be made from very thin (e.g., 0.8 mm) printed circuit boards routed to the shape of the key on which they are implemented (e.g., the spacebar key). In some implementations, contact between the user (or any other object performing the gesture) and the sensor 240 may be required for operation. In such implementations, capacitive sensing may be used. Capacitive touch sensing may allow high-precision tracking of a user's finger motion.
[00036] In one implementation, the computer 110 may display an alert informing of an error in response to receiving the input device data from the sensor 240. For example, the computer 110 may display a "Not Recognized" message if the gesture detected cannot be matched to a gesture template.
[00037] Turning now to the operation of the system 100, Fig. 3 illustrates an example process flow diagram 300 in accordance with an implementation. It should be readily apparent that the processes illustrated in Fig. 3 represent generalized illustrations, and that other processes may be added or existing processes may be removed, modified, or rearranged without departing from the scope and spirit of the present disclosure. Further, it should be understood that the processes may represent executable instructions stored on memory that may cause a processor to respond, to perform actions, to change states, and/or to make decisions. Thus, the described processes may be implemented as executable instructions and/or operations provided by a memory associated with the system 100. Alternatively or in addition, the processes may represent functions and/or actions performed by functionally equivalent circuits like an analog circuit, a digital signal processor circuit, an application specific integrated circuit (ASIC), or other logic devices associated with the system 100. Furthermore, Fig. 3 is not intended to limit the implementation of the described implementations, but rather the figure illustrates functional information one skilled in the art could use to design/fabricate circuits, generate software, or use a combination of hardware and software to perform the illustrated processes.
[00038] The process 300 may begin at block 305, where the 3D gesture sensor detects a gesture from an object over the keyboard. As discussed in more detail above with respect to Fig. 1, an object may be a user's hands or fingers, a stylus or any other object that may be configured to create a gesture within the observed area above the keyboard. The user may be, e.g., a person such as an administrator of a computer and/or an automated machine capable of creating gestures. In particular, the detection process may involve, for example, detecting an interruption of the infrared light being reflected from a light source in the keyboard when the user's fingers move within the observed area (e.g., the area observed by the sensor).
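A rough illustration of that interruption-based detection is sketched below: the reflected-light reading from the sensor is watched, and a gesture event is flagged whenever the reading drops below a threshold while an object passes through the observed area. The 0-to-1 reading scale and the threshold value are assumptions made for the example.

```python
# Sketch: flag gesture events when the reflected infrared level is interrupted.
from typing import Iterable, List


def detect_interruptions(ir_readings: Iterable[float], threshold: float = 0.4) -> List[int]:
    """Return sample indices where the reflected IR level first drops below threshold."""
    events = []
    blocked = False
    for i, level in enumerate(ir_readings):
        if level < threshold and not blocked:
            events.append(i)   # object entered the observed area
            blocked = True
        elif level >= threshold:
            blocked = False    # reflection restored; ready for the next gesture
    return events
```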
[00039] At block 310, the system proceeds to identify the action associated with the gesture. In particular, as part of the process of identification of the gesture, the keyboard controller may generate an output in the form of gesture information. In one implementation, such information may be the starting and end points of the motion. Further, the gesture information may be compared to a plurality of gesture templates. In the event that the gesture information matches one of the templates and the gesture is recognized to be associated with an action or command, at block 315, the system renders the identified action or command. In one implementation, the result of the action may be displayed on the user interface of the computer. For example, if the gesture is recognized to be the action of scrolling a page down, the user may observe that the page displayed on the user interface is being scrolled down.
[00040] The present disclosure has been shown and described with reference to the foregoing exemplary implementations. It is to be understood, however, that other forms, details, and examples may be made without departing from the spirit and scope of the disclosure that is defined in the following claims. As such, all examples are deemed to be non-limiting throughout this disclosure.

Claims

WHAT IS CLAIMED IS:
1. A method for interpreting a gesture, comprising:
detecting, by a sensor, a gesture performed over an input device housing the sensor, the gesture being a 3D motion;
identifying an action associated with the gesture; and
performing the identified action.
2. The method of claim 1, wherein the input device is a keyboard.
3. The method of claim 2, wherein the sensor is located under a key of the keyboard.
4. The method of claim 3, wherein the key under which the sensor is located is a spacebar key of the keyboard.
5. The method of claim 2, wherein the sensor is located on an elevated key on the keyboard.
6. The method of claim 5, wherein the elevated key is placed above a gap between two keys of the keyboard.
7. The method of claim 1, further comprising displaying the performance of the identified action on an output device.
8. The method of claim 1, wherein detecting the gesture performed over the input device housing the sensor further comprises detecting over a specified area on the input device.
9. The method of claim 1, wherein identifying the action associated with the gesture further comprises matching data associated with the gesture to a gesture template.
10. The method of claim 9, wherein the data associated with the gesture comprises a starting point and an end point of the 3D motion.
11. A non-transitory computer-readable medium comprising instructions that when executed cause a system to:
detect, by a sensor, a gesture performed over an input device housing the sensor, the gesture being a 3D motion; and
identify an action associated with the gesture.
12. The non-transitory computer-readable medium of claim 11, wherein the sensor, in detecting the gesture performed over the input device, detects the 3D motion over a keyboard having a plurality of keys, and wherein the sensor is located under one of the plurality of keys.
13. The non-transitory computer-readable medium of claim 11, further comprising instructions that when executed cause the system to display an error message if the identified action does not apply to the system.
14. A system, comprising:
a sensor to detect a gesture above an input device; and
a controller, communicatively coupled to the sensor, to provide input device data to an output device, the input device data generated in response to the gesture and indicating an action associated with the gesture,
wherein the output device is to perform an action in response to receiving the input device data.
15. The system of claim 14, further comprising a database to store information on gesture templates that are compared to the gesture to identify the action associated with the gesture.
16. The system of claim 14, further comprising a gesture matching module to compare the input data to a plurality of gesture templates to identify the action associated with the gesture.
PCT/US2013/052972 2013-07-31 2013-07-31 Gesture detection WO2015016894A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/052972 WO2015016894A1 (en) 2013-07-31 2013-07-31 Gesture detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2013/052972 WO2015016894A1 (en) 2013-07-31 2013-07-31 Gesture detection

Publications (1)

Publication Number Publication Date
WO2015016894A1 (en) 2015-02-05

Family

ID=52432254

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/052972 WO2015016894A1 (en) 2013-07-31 2013-07-31 Gesture detection

Country Status (1)

Country Link
WO (1) WO2015016894A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090073128A1 (en) * 2007-09-19 2009-03-19 Madentec Limited Cleanable touch and tap-sensitive keyboard
US20120038496A1 (en) * 2010-08-10 2012-02-16 Cliff Edwards Gesture-enabled keyboard and associated apparatus and computer-readable storage medium
US20120176314A1 (en) * 2011-01-10 2012-07-12 Samsung Electronics Co., Ltd. Method and system for controlling mobile device by tracking the finger
US20120274567A1 (en) * 2011-04-29 2012-11-01 Bradley Neal Suggs Touch-enabled input device
US20120326961A1 (en) * 2011-06-21 2012-12-27 Empire Technology Development Llc Gesture based user interface for augmented reality

Similar Documents

Publication Publication Date Title
US11755137B2 (en) Gesture recognition devices and methods
US10061510B2 (en) Gesture multi-function on a physical keyboard
US20150100911A1 (en) Gesture responsive keyboard and interface
CN103914138A (en) Identification and use of gestures in proximity to a sensor
KR20140145579A (en) Classifying the intent of user input
US9489086B1 (en) Finger hover detection for improved typing
US20140240234A1 (en) Input Device
US20140354550A1 (en) Receiving contextual information from keyboards
JP2015022745A (en) Determining input received via tactile input device
CN104679224B (en) Input equipment and input management system
CN104077065A (en) Method for displaying virtual keyboard by touch screen terminal and touch screen terminal
US9557825B2 (en) Finger position sensing and display
US20150103010A1 (en) Keyboard with Integrated Pointing Functionality
JP2014186530A (en) Input device and portable terminal device
KR101013219B1 (en) Method and system for input controlling by using touch type
WO2015016894A1 (en) Gesture detection
JP6011114B2 (en) Character input device, character input method, and character input program
KR101706909B1 (en) Finger Input Devices
KR20140086805A (en) Electronic apparatus, method for controlling the same and computer-readable recording medium
JP5806107B2 (en) Information processing device
US20150138102A1 (en) Inputting mode switching method and system utilizing the same
KR20160103381A (en) Character input apparatus based on hand gesture and method thereof
KR101919515B1 (en) Method for inputting data in terminal having touchscreen and apparatus thereof
US20150286812A1 (en) Automatic capture and entry of access codes using a camera
CN103902198A (en) Electronic device and method used for electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13890278

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13890278

Country of ref document: EP

Kind code of ref document: A1