US20150091854A1 - Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method - Google Patents

Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method

Info

Publication number
US20150091854A1
US20150091854A1 (application US14/396,599, US201314396599A)
Authority
US
United States
Prior art keywords
control
function
detection
neutral
control object
Prior art date
Legal status
Abandoned
Application number
US14/396,599
Inventor
Didier Roziere
Christophe Blondin
Current Assignee
Quickstep Technologies LLC
Original Assignee
Fogale Nanotech SA
Priority date
Filing date
Publication date
Application filed by Fogale Nanotech SA filed Critical Fogale Nanotech SA
Assigned to FOGALE NANOTECH. Assignors: ROZIERE, Didier; BLONDIN, Christophe
Publication of US20150091854A1
Assigned to QUICKSTEP TECHNOLOGIES LLC. Assignor: FOGALE NANOTECH S.A.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/04166: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/0418: Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/044: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means
    • G06F3/0443: Digitisers characterised by capacitive transducing means using a single layer of sensing electrodes
    • G06F3/0444: Digitisers characterised by capacitive transducing means using a single conductive element covering the whole sensing surface, e.g. by sensing the electrical current flowing at the corners
    • G06F3/0446: Digitisers characterised by capacitive transducing means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041: Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F2203/04107: Shielding in digitiser, i.e. guard or shielding arrangements, mostly for capacitive touchscreens, e.g. driven shields, driven grounds
    • G06F2203/04108: Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction

Detailed description of the figures

  • FIG. 1 is a diagrammatic cross section of a human-machine interface according to the invention.
  • The interface 100 represented in FIG. 1 comprises a control surface 102 and an assembly of capacitive electrodes 104, known as measurement electrodes, intended to detect one or more control objects, with or without contact of the control object with the control surface 102.
  • The human-machine interface 100 comprises an electronic measurement module 106 connected to the capacitive measurement electrodes 104 to receive a detection signal from each of the capacitive measurement electrodes 104 and to deduce from it data relating to the detection carried out, composing a detected profile.
  • The detected profile provided by the measurement module can comprise data relating to the position of each control object detected, the number of objects detected, the distance between the objects detected, and a movement of each control object detected.
  • The interface 100 moreover comprises a software analysis module 108 analysing the detected profile provided by the measurement module 106 and providing data relating to the nature (hand, finger, stylus, etc.) of each control object, and optionally a movement associated with each control object and measured by the measurement module 106.
  • The interface 100 moreover comprises a database 110 comprising, in particular, the predetermined profiles used by the analysis module 108 to determine the nature of each control object detected.
  • FIG. 1 also represents field lines 112, 114 corresponding to measurement electrodes 104 located at the periphery of the control surface 102.
  • The peripheral field lines 112 and 114 are attracted towards the sides of the control surface 102 and deformed to a greater or lesser degree: the closer a measurement electrode 104 is to the periphery, i.e. to the border of the control surface 102, the more horizontal the field line of this electrode becomes.
  • This physical phenomenon is called the "edge effect" and is used in the present invention to detect one or more control objects which are not in contact with or opposite the control surface 102, but at the periphery close to the control surface 102, and in particular in contact with a neutral surface of a user device utilizing the interface 100.
  • The field lines of the peripheral electrodes are thus capable of detecting one or more control objects located at the periphery of the control surface 102.
  • The interface 100 also comprises a layer 116, called the guard layer, set at a guard potential VG identical to or substantially identical to the potential of each of the measurement electrodes 104.
  • The guard layer 116 is positioned under the measurement electrodes 104, so that the measurement electrodes 104 are located between the guard layer 116 and the control surface 102.
  • FIG. 2 is a diagrammatic representation of a first example of a device according to the invention comprising the human-machine interface of FIG. 1.
  • The device 200 represented in FIG. 2 comprises a display screen 202 on which the control surface 102 and the measurement electrodes 104 are arranged.
  • The control surface 102 can be formed at least in part by the display screen 202.
  • The face 204 of the device 200 comprising the display screen 202 is called the user interaction face in the following.
  • The device 200 moreover comprises, on the interaction face 204, four neutral surfaces 206-212 which comprise neither measurement electrodes nor means for detecting one or more control objects, nor electro-mechanical selection means.
  • Each neutral surface 206-212 is in the same plane as, or in a plane parallel to, the plane formed by the control surface 102.
  • The neutral surfaces 206-212 are each disposed at the periphery of the control surface 102, at each of the edges of the interaction face 204. Each neutral surface 206-212 is independent of the other neutral surfaces 206-212, and is distinct from the control surface 102.
  • When one or more control objects, such as a finger or a hand, are disposed in contact with or opposite a neutral surface 206-212, this or these control objects are detected by the measurement electrodes 104 disposed under/in the control surface 102 at the periphery of the control surface, by utilizing the edge effect created by these electrodes.
  • The neutral surfaces 206-212 can touch to form a single neutral surface completely surrounding the control surface.
  • Alternatively, the device can comprise no neutral surface on one or more of its edges.
  • FIG. 3 is a diagrammatic representation of a second example of a device according to the invention comprising the human-machine interface of FIG. 1.
  • The device 300 represented in FIG. 3 comprises a display screen 202 on which the control surface 102 and the measurement electrodes 104 are arranged.
  • The control surface 102 can be formed at least in part by the display screen 202.
  • The face 204 of the device 300 comprising the display screen 202 is called the user interaction face in the following.
  • The device 300 moreover comprises, on each of its edges or rims, a neutral surface 302-308 which comprises neither measurement electrodes nor means for detecting one or more control objects, nor electro-mechanical selection means.
  • Each neutral surface 302-308 is in a plane forming an angle which is not zero, in particular an angle of 90°, with the plane formed by the control surface 102.
  • The neutral surfaces 302-308 are each disposed at the periphery of the control surface 102, at each of the edges/rims of the device 300 connecting the interaction face with a face on the back of the device 300. Each neutral surface 302-308 is independent of the other neutral surfaces 302-308, and is distinct from the control surface 102.
  • When one or more control objects, such as a finger or a hand, are disposed in contact with or opposite a neutral surface 302-308, for example when the user grasps the device by its edges/rims, this or these control objects, for example the fingers or the hand of the user, are detected by the measurement electrodes 104 disposed under/in the control surface 102 at the periphery of the control surface 102, by utilizing the edge effect created by these electrodes 104.
  • The neutral surfaces 302-308 can touch to form a single neutral surface completely surrounding the control surface 102.
  • Alternatively, the device 300 can comprise no neutral surface on one or more of its edges/rims.
  • The embodiment in FIG. 3 has the advantage of being able to use all of the surface of the interaction face as a display screen. As a result, this embodiment makes it possible either to have a larger display screen 202 for a given size of interaction face 204, or to have a smaller device for a given size of display screen 202.
  • Of course, a user device can also be equipped both with at least one neutral surface on the device interaction face, i.e. in the same plane as the control surface and/or the display screen, and with at least one neutral surface, independent or not, on an edge/rim of the device.
  • FIG. 4 is a diagrammatic representation of an example of the method according to the invention that can be utilized in the interface of FIG. 1 and/or in the device of FIG. 2 or 3.
  • The method 400 represented in FIG. 4 comprises a step 402 during which at least one control object is detected by utilizing the edge effect created by the peripheral electrodes of the control surface.
  • During a following step, a detected profile is determined as a function of the signals provided by the peripheral electrodes of the control surface. This profile takes into account all of the control objects detected, as well as any movement(s) of any of the control objects.
  • During a step 406, the nature of the control object(s) detected is determined by comparison of the profile detected with predetermined profiles or with predetermined data.
  • During a step 408, one or more functions are triggered by commands sent to the device, and more particularly to the device's processor and/or to the elements of the device concerned with the functions triggered.
  • In particular, during step 408, at least one of the following steps is carried out, either at the same time or one after the other: a step 410 of adjusting the volume of the user device, a step 412 of adjusting the brightness or the contrast of the display screen of the user device, a step 414 of unlocking the display screen and/or the control surface, a step 416 of changing the position or the size of an object or a control button being displayed, and a step 418 of changing the orientation of the display produced on the display screen.
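  • As a purely illustrative sketch of this flow, with every function reduced to an invented stub standing in for the steps named above (none of these names come from the patent), the method 400 could be organised as a simple pipeline:

```python
# Illustrative pipeline for method 400; every function is a simplified stub
# invented for this sketch, standing in for the steps named in the text above.
def detect_edge_effect(signals, threshold=0.2):               # step 402
    # Simplification: return the indices of peripheral electrodes above threshold.
    return [i for i, s in enumerate(signals) if s >= threshold]

def build_detected_profile(detected):                          # following step
    return {"count": len(detected), "positions": detected}

def determine_nature(profile):                                 # step 406
    # Toy rule: several simultaneous detections are interpreted as a hand.
    return "hand" if profile["count"] >= 3 else "finger"

def trigger_functions(nature, profile):                        # step 408
    if nature == "hand":
        return ["unlock_screen (step 414)", "rotate_display (step 418)"]
    return ["adjust_volume (step 410)"]

signals = [0.0, 0.5, 0.6, 0.4, 0.0, 0.0]
profile = build_detected_profile(detect_edge_effect(signals))
print(trigger_functions(determine_nature(profile), profile))
```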
  • FIGS. 5-8 are diagrammatic representations of several configurations of interaction with a user device according to the invention.
  • In FIGS. 5-8, the directions right, left, up and down are defined with reference to a user looking at the control surface 102 and/or the display screen 202 in landscape mode.
  • In a first configuration, a finger is detected on/against the right-hand edge or side of the device 200, i.e. on/against the neutral surface 210 and/or the surface 306, while several fingers are detected on the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302. This detected profile is associated with the detection of a control object corresponding to a right hand and triggers a predetermined positioning of the icons and of the control buttons displayed on the display screen 202, associated with right-handed use.
  • In another configuration, a finger is detected on/against the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302, while several fingers are detected on the right-hand edge or side of the device 200, i.e. on/against the neutral surface 210 and/or the surface 306. This detected profile is associated with the detection of a control object corresponding to a left hand and triggers a predetermined positioning of the icons and of the control buttons displayed on the display screen 202, associated with left-handed use.
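  • As a hedged sketch of how such left/right discrimination might be derived from the detected profile (the rule below is an assumption drawn from the two configurations just described, not an algorithm given in the patent):

```python
# Sketch: classify the holding hand from how many contacts each lateral neutral
# surface reports. One thumb-side contact plus several finger-side contacts
# suggest which hand is holding the device (assumption for illustration only).
def holding_hand(left_count, right_count):
    if right_count == 1 and left_count > 1:
        return "right hand"      # thumb on the right edge, other fingers on the left edge
    if left_count == 1 and right_count > 1:
        return "left hand"       # thumb on the left edge, other fingers on the right edge
    return "unknown"

print(holding_hand(left_count=3, right_count=1))   # -> 'right hand'
print(holding_hand(left_count=1, right_count=4))   # -> 'left hand'
```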
  • In the configuration represented in FIG. 7, the finger placed on the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302, is detected sliding downwards. In this case, a parameter of a function is adjusted.
  • The adjusted function can be a function chosen by default, for example the volume, as represented in configuration 506: the volume is reduced if the movement is downwards as represented in FIG. 7, and increased if the movement is upwards, contrary to what is represented in FIG. 7.
  • The adjusted function can also be pre-selected by the user, such as for example the brightness or the contrast of the display screen, or can depend on the display or the menu displayed on the screen.
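  • As a minimal sketch of such movement-driven adjustment (the mapping and the step size are assumptions for illustration only, not values given in the patent):

```python
# Sketch: map a sliding movement on a lateral neutral surface to a volume change.
def adjust_volume(current, displacement_mm, step_per_mm=2):
    """Downward slide (negative displacement) lowers the volume, upward raises it."""
    new_level = current + int(displacement_mm * step_per_mm)
    return max(0, min(100, new_level))      # clamp to a 0-100 volume range

print(adjust_volume(50, -5))   # slide 5 mm downwards -> 40
print(adjust_volume(50, +3))   # slide 3 mm upwards   -> 56
```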
  • In yet another configuration, a finger is detected on/against each corner of the device 200, i.e. at each end, on the one hand, of the neutral surface 206 and/or the surface 302 and, on the other hand, of the neutral surface 210 and/or the surface 306. This detected profile is associated with the detection of the device 200 being held in landscape mode and triggers a display of the icons and control buttons being displayed on the display screen 202 in landscape mode.
  • Alternatively, this detected profile can be associated with a picture or video being taken. In this case, an application to take a picture or video is launched automatically.
  • Finally, a virtual control button 802 or a set of virtual control buttons can be displayed and associated with a neutral surface 804 located at the periphery of the display screen 202 or on the rim or the edge of the device 200.

Abstract

A method is provided for interacting with an electronic and/or computing apparatus, termed the user apparatus, the user apparatus including: a control surface furnished with a plurality of capacitive electrodes, and at least one neutral surface, arranged at the periphery of the control surface so that none of the capacitive electrodes of the control surface is disposed in the neutral surface; the method including the following steps: detection of at least one control object disposed in contact with or opposite the neutral surface of the apparatus, by utilization of the edge effect, and triggering, as a function of the detection, at least one function in the user apparatus. The invention also relates to a man-machine interface for interacting with a user apparatus and a user apparatus, implementing this method.

Description

    TECHNICAL FIELD
  • The present invention relates to a method for interacting with an electronic and/or computerized device, utilizing a capacitive control surface. It also relates to a human-machine interface and a user device utilizing this method.
  • The field of the invention is non-limitatively that of human-machine interfaces (HMI) for interacting with a user device, utilizing a capacitive detection technology. More particularly, the field of the invention is that of HMI devices comprising a control surface with capacitive technology, for interacting with a user device without making contact with said control surface.
  • STATE OF THE ART
  • Most electronic communication devices or audiovisual devices are equipped with screens or control surfaces, commonly called touch surfaces, allowing interaction with these devices. In tablet or smartphone-type devices, the control surface is transparent. It is integrated in or on the display screen of these devices in order to increase the size of the display screen.
  • The capacitive control surfaces make it possible to detect a control object, such as a finger or a stylus, in contact with the control surface or at a distance from the control surface.
  • Control surfaces using capacitive technology suffer from an undesirable phenomenon, called “edge effect”, which is generally minimized or ignored, on the one hand to avoid interference with surrounding objects and, on the other hand, to increase the detection range of the control surface. Thus, in capacitive control surfaces, the detection of the control object is only taken into account when it comes into contact with the control surface or when it is presented opposite the control surface. As a result, if the control object is presented on the periphery of the control surface without being in contact with the control surface, it is not detected or taken into account even if it is in contact with the device.
  • At the same time, devices equipped with such a capacitive control surface comprise, on the side or edge of the device, more generally on the periphery of the control surface, additional selection means for selecting certain functions such as volume control, or for switching the display screen on or off, etc. These selection means can comprise at least one electro-mechanical button or at least one capacitive sensor, which are added to the capacitive control surface, and which increase the complexity and the cost of manufacturing the device.
  • A purpose of the present invention is to overcome the aforesaid drawbacks.
  • Another purpose of the present invention is to propose a method for interacting with a device equipped with a capacitive control surface allowing the functionalities offered by the capacitive control surface to be better utilized.
  • Another purpose of the present invention is to propose a method for interacting with a device equipped with a capacitive control surface making interaction with the device simpler and quicker.
  • Finally, another purpose of the present invention is to propose a method for interacting with a device equipped with a capacitive control surface allowing the complexity and the cost of manufacturing of the device to be reduced while preserving the functionalities of the device.
  • DISCLOSURE OF THE INVENTION
  • At least one of these purposes is achieved with a method for interacting with an electronic and/or computerized device, known as user device, said user device comprising:
      • a control surface equipped with a plurality of capacitive electrodes, arranged to detect at least one control object situated at a distance that is not zero from said control surface, and
      • at least one surface, known as neutral, arranged at the periphery of said control surface so that none of said capacitive electrodes of said control surface is disposed in said neutral surface;
        said method being characterized in that it comprises the following steps:
      • detection of at least one control object disposed in contact with or opposite said neutral surface of said device, by utilizing the edge effect of at least one of said capacitive electrodes of said control surface which is located close to said neutral surface, and
      • triggering, as a function of said detection, of at least one function in said device.
  • The method according to the invention therefore allows the edge effect of a capacitive control surface to be used, on the one hand, to detect one or more control objects disposed on a neutral surface, not equipped with capacitive electrodes, and on the other hand, to trigger one or more functions carried out by the user device following such a detection.
  • According to the invention, a control object is opposite the neutral surface when it is at a distance from the neutral surface, while being visible to the neutral surface.
  • Thus, the method according to the invention makes it possible to use the edge effect of a capacitive control surface, which is an undesirable physical phenomenon for the methods and systems of the state of the art which generally minimize or ignore it, or even cancel it out using physical means such as guard electrodes or software and algorithmic means.
  • Thus, the method according to the invention makes it possible to interact with a device equipped with a capacitive control surface while utilizing all the functionalities offered by the capacitive control surface.
  • Moreover, the interaction proposed by the method according to the invention with a user device equipped with a capacitive control surface is simpler and quicker because a function can be triggered by a single detection of a control object on a neutral surface. Thus, when the control object is a finger or a hand, functions can be triggered simply by the user grasping the user device.
  • Moreover, a user device equipped with a capacitive control surface and utilizing the method of interaction according to the invention is less costly and less complex than current devices as it makes it possible to dispense with the use of additional control means such as electro-mechanical buttons or capacitive sensors in addition to the capacitive control surface.
  • In the present application, by “neutral surface” is understood a surface of the user device, in particular not equipped with capacitive electrodes, located at the periphery of said control surface, and having a border in common with the control surface or not.
  • The neutral surface can be in the same plane as the control surface, or in a plane forming an angle that is not zero with the plane of the control surface.
  • According to a particular embodiment, the neutral surface can be formed by at least one part of a rim or of an edge of the user device, and can form an angle that is not zero with the control surface. When the user device is generally rectangular in shape, a neutral surface can be formed by at least one part of two opposite edges, even at least three edges, and optionally on each of the edges of the user device.
  • According to another embodiment, the neutral surface can be arranged at the periphery of a so-called interaction face of the user device, which also comprises the control surface and, preferentially, the display screen, on at least one side of the interaction face. Advantageously, one neutral surface can be arranged on each of two opposite sides of the interaction face, or on three of its sides, or optionally on all of its sides.
  • The capacitive control surfaces use two main measurement principles.
  • The most common principle utilizes the technology known as “mutual capacitance” using electrodes in the form of rows and columns. The rows, or columns respectively, are transmitters excited by an electrical signal and the columns, or rows respectively, are receivers. The capacitance created between each row-column intersection varies as a function of the presence, or not, of an object. This technology is not very sensitive to edge effects because the field lines are very localized to the intersections of the rows and columns.
  • The second principle utilizes the technology known as “self capacitance”. In the case of this second principle, two architectures are possible for the capacitive electrodes: a first architecture using electrodes in the form of rows and columns, and a second architecture using individual matrix electrodes. In both cases, the field lines are attracted very easily towards any object connected to earth, which is more particularly the reference voltage of the object, and it becomes possible to detect an object at a distance, i.e. without there being any contact between this object and the control surface. The disadvantage of the row-column architecture is that it is impossible to detect several objects: with more than 2 objects, ghosts appear, i.e. false objects. The advantage of the matrix solution is that it is possible to detect several objects with contact and without contact.
  • The invention is more particularly appropriate for a control surface utilizing a technology known as “self capacitance” using the matrix architecture.
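  • Purely as an illustration of the ghosting problem mentioned above for the row-column architecture (this sketch and all of its names are not part of the patent), the following Python snippet shows why several simultaneous touches become ambiguous when only whole rows and whole columns report a detection:

```python
# Illustrative sketch only: with a row-column (self-capacitance) layout, each
# row and each column only reports "object present or not", so the individual
# touch positions cannot be disambiguated.
from itertools import product

def candidate_positions(active_rows, active_cols):
    """Every (row, column) intersection consistent with the active lines."""
    return sorted(product(active_rows, active_cols))

# Three real touches at (1, 2), (4, 7) and (6, 3) activate rows {1, 4, 6}
# and columns {2, 3, 7} ...
print(candidate_positions({1, 4, 6}, {2, 3, 7}))
# ... but nine candidate positions come back, six of which are "ghosts",
# i.e. false objects. A matrix of individual electrodes avoids this ambiguity.
```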
  • The method according to the invention can advantageously comprise, after the detection step and before the triggering step, a step of determining the nature of at least one control object disposed against the neutral surface, the triggering step being carried out as a function of said nature of said control object.
  • Thus, the determination step can determine whether the control object is a stylus or a finger or part of the hand, or a hand holding the device.
  • When the control object is a hand holding the device, the determination step can determine whether it is the right hand or the left hand, or even the orientation of the hand.
  • When the control object is one of the user's fingers, the determination step can determine whether it is the user's thumb or another of the user's fingers.
  • The step of determining the nature of the control object can comprise the following operations:
      • constituting a so-called detected profile, as a function of detection data relating to one or more control objects detected, in particular simultaneously, provided by the measurement electrodes of the control surface, and
      • determining the nature of the control object, by comparison of said detected profile with a plurality of predetermined profiles each corresponding to a control object of which the nature is known.
  • The detected profile can advantageously comprise data relating to:
      • the position of each of the control objects detected,
      • the number of objects detected,
      • the distance between at least two, in particular each, of the objects detected, and
      • a movement of each control object detected.
  • By “movement of a control object” is understood a displacement quantity, a displacement speed and/or a displacement acceleration of a control object over the neutral surface.
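  • Purely as a hypothetical sketch, not taken from the patent, the detected profile described above could be represented and compared with the predetermined profiles along the following lines (all names and the scoring rule are assumptions):

```python
# Hypothetical sketch of a detected profile and of a nearest-profile comparison.
from dataclasses import dataclass

@dataclass
class DetectedProfile:
    positions: list            # position of each control object detected
    count: int                 # number of objects detected
    distances: list            # distances between the objects detected
    movement: float = 0.0      # displacement of the objects, 0.0 if static

def dissimilarity(detected, reference):
    """Crude distance between a detected profile and a predetermined one."""
    score = abs(detected.count - reference.count) * 10.0
    for d, r in zip(sorted(detected.distances), sorted(reference.distances)):
        score += abs(d - r)
    return score

def nature_of(detected, predetermined):
    """Label ('right hand', 'stylus', ...) of the closest predetermined profile."""
    return min(predetermined, key=lambda label: dissimilarity(detected, predetermined[label]))

references = {
    "right hand": DetectedProfile(positions=[5, 40, 50, 60], count=4, distances=[35, 10, 10]),
    "stylus":     DetectedProfile(positions=[20],            count=1, distances=[]),
}
measured = DetectedProfile(positions=[6, 41, 52, 61], count=4, distances=[35, 11, 9])
print(nature_of(measured, references))   # -> 'right hand'
```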
  • Advantageously, the detection step can comprise, for at least one, in particular for each object disposed in contact with or opposite the neutral surface, detection of the position of said control object over said neutral surface.
  • The detection of the position of a control object can be carried out as a function of the position of the electrodes detecting the control object by means of the edge effect.
  • The detection step can advantageously comprise detection of a number of control objects disposed in contact with or opposite the neutral surface.
  • The number of objects detected can be determined as a function of the number of independent groups of capacitive electrodes detecting a control object by means of the edge effect.
  • The detection step can advantageously comprise, for at least one, in particular for each control object disposed in contact with or opposite the neutral surface, detection of the shape of said control object.
  • The shape of a control object can be determined by means of functions/algorithms taking into account the strength of the measurement signal of each capacitive electrode adjacent to the same group detecting the control object by means of the edge effect.
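  • The patent does not give an implementation, but as an illustrative sketch under assumed names, grouping the peripheral electrodes that report an edge-effect signal yields the number of objects, a signal-weighted position for each object and a rough shape/extent:

```python
# Illustrative sketch: peripheral electrodes are indexed 0..N-1 along the border
# (treated here as a simple line, without wrap-around). 'signals' holds one
# edge-effect measurement per electrode; adjacent electrodes above a threshold
# are merged into one independent group, i.e. one control object.

def group_electrodes(signals, threshold=0.2):
    groups, current = [], []
    for index, value in enumerate(signals):
        if value >= threshold:
            current.append(index)
        elif current:
            groups.append(current)
            current = []
    if current:
        groups.append(current)
    return groups                                  # len(groups) = number of objects

def group_position(group, signals):
    """Signal-weighted centroid of one group, in electrode units."""
    total = sum(signals[i] for i in group)
    return sum(i * signals[i] for i in group) / total

def group_extent(group):
    """Rough shape information: how many adjacent electrodes the object covers."""
    return len(group)

signals = [0.0, 0.1, 0.6, 0.9, 0.5, 0.0, 0.0, 0.7, 0.8, 0.1]
groups = group_electrodes(signals)
print(len(groups), [round(group_position(g, signals), 2) for g in groups])
# -> 2 objects, centred near electrodes 2.95 and 7.53
```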
  • Advantageously, the detection step can comprise detection of a distance between at least two control objects disposed in contact with or opposite the neutral surface.
  • Measurement of the distance between two control objects detected can be carried out in the following way. For each control object detected, the electrode providing the signal with the highest amplitude is identified in the group of electrodes detecting this object. Then, the distance is measured between the electrode with the highest amplitude detecting one of the objects and the electrode with the highest amplitude detecting the other one of the objects.
  • The distance measured can be a direct distance or a distance around the periphery taking into account a peripheral path around the capacitive control surface.
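  • Under the same assumptions (hypothetical names; electrode coordinates in millimetres), the distance measurement described above could be sketched as follows, returning both the direct distance and the distance along the periphery:

```python
# Sketch: distance between two detected control objects, using the strongest
# electrode of each group as described above. 'coords' gives the (x, y)
# position of each peripheral electrode in mm, 'abscissa' its curvilinear
# position along the border, and 'perimeter' the total border length in mm.
import math

def strongest_electrode(group, signals):
    return max(group, key=lambda i: signals[i])

def distances(group_a, group_b, signals, coords, abscissa, perimeter):
    a = strongest_electrode(group_a, signals)
    b = strongest_electrode(group_b, signals)
    direct = math.hypot(coords[a][0] - coords[b][0], coords[a][1] - coords[b][1])
    around = abs(abscissa[a] - abscissa[b])
    peripheral = min(around, perimeter - around)   # shorter way around the border
    return direct, peripheral

signals  = {0: 0.1, 1: 0.9, 2: 0.3, 7: 0.2, 8: 0.8}
coords   = {0: (0, 0), 1: (10, 0), 2: (20, 0), 7: (10, 60), 8: (20, 60)}
abscissa = {0: 0, 1: 10, 2: 20, 7: 150, 8: 140}
print(distances([0, 1, 2], [7, 8], signals, coords, abscissa, perimeter=260))
# -> (about 60.8 mm direct, 130 mm around the periphery)
```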
  • The detection step can advantageously comprise, for at least one, in particular for each control object disposed in contact with or opposite the neutral surface, detection/measurement of a movement of said control object.
  • According to the invention, the triggered function can comprise an adjustment function of the value of a parameter associated with a function carried out by said user device.
  • Such an adjustment function can, for example, be the adjustment of the sound, of the display brightness, of the display contrast, etc.
  • Such an adjustment function is particularly relevant when the detection comprises detection of the movement of a control object, and in particular the quantity of displacement of the control object.
  • In particular, the adjusted function can be the volume, which can be deactivated or activated as a function of the detection step.
  • According to the invention, a neutral surface can be dedicated to the adjustment of a predetermined function. For example, a neutral surface disposed on an upper part of an edge of the user device can be associated with the adjustment of the sound emitted by the user device.
  • More generally, according to the invention, a plurality of neutral surfaces, in particular independent ones, can be predefined at the periphery of the control surface.
  • At least one of these neutral surfaces can be pre-associated with/pre-dedicated to a predetermined function regardless of the display produced or the navigation within the device.
  • At least one of these neutral surfaces can be used to trigger a function or adjust a parameter, the function or the parameter in question being chosen or selected as a function of the display being produced by the user device, or of the navigation being carried out within the user device or also by the user after he has set the parameters. Thus, such a surface is not associated with a single function but can be associated with several different functions as a function of the interaction with the user device.
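  • A minimal sketch of such a mapping, entirely hypothetical since the patent does not prescribe any data structure, could associate each predefined neutral surface either with a fixed function or with a context-dependent one:

```python
# Hypothetical sketch: each neutral surface is either pre-dedicated to a fixed
# function or resolved against the current display/navigation context.
FIXED = {"top_edge": "adjust_volume"}                     # pre-dedicated surfaces

CONTEXTUAL = {                                            # context-dependent surfaces
    "right_edge": {"photo_app": "zoom", "music_app": "seek", "default": "brightness"},
}

def resolve_function(surface, context="default"):
    if surface in FIXED:
        return FIXED[surface]
    table = CONTEXTUAL.get(surface, {})
    return table.get(context, table.get("default"))

print(resolve_function("top_edge"))                  # always 'adjust_volume'
print(resolve_function("right_edge", "photo_app"))   # 'zoom' while the photo app is displayed
```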
  • The triggered function can advantageously comprise a function of launching an application, such as for example an application to take a picture or a video or for a game.
  • The triggered function can advantageously comprise a function associated with the display produced on a display screen of said user device or of another device controlled by said user device.
  • According to a preferred embodiment, the triggered function can comprise illuminating the display screen.
  • For example, when the detection step detects that the device is being grasped by a hand, this means that the user has just grasped the device in order to interact with it. In this case, the detection of the hand automatically triggers the illumination of the screen of the device.
  • According to a particularly preferred embodiment, the triggered function can comprise unlocking the display screen and/or the control surface.
  • For example, when the detection step detects that the device is being grasped by a hand, in particular the right hand, or the left hand respectively, if the user is predefined as being right-handed, or left-handed respectively, this means that the user has just grasped the device in order to interact with it. In this case, the detection of the hand, in particular the nature of the hand, automatically triggers unlocking the display screen and/or the control surface.
  • According to a particularly preferred embodiment, the triggered function can comprise repositioning at least one object, in particular a button or a symbol or also an icon associated with a function or with triggering a function, displayed on the display screen.
  • In fact, it is possible to change the display produced or the position of control buttons or of displayed objects as a function of the position of the finger(s) on the user device in order to make it easier to see the objects or the control buttons displayed and to avoid them being hidden, either partly or completely, by a user's finger.
  • It is moreover possible to change the display produced or the position of control buttons or of objects displayed as a function of the nature, left or right, of the detected hand, in order to make it easier to select an object or a control button. In fact, the selection zones that are easier or harder to reach differ depending on whether the user is left- or right-handed. Thus, it is possible to move at least one displayed object or control button away from the user's thumb, or to delete/deactivate/lock certain control objects/buttons which cannot be utilized, as a function of the position of the hand.
  • According to a particularly preferred embodiment, the triggered function can comprise adjusting the orientation of a display produced on the display screen.
  • In fact, it is possible to change the orientation of the display as a function, for example, of the position of a hand holding the device. In particular, if the detection shows that the device is being held in “landscape” mode by the user, or in “portrait” mode respectively, the display can be rotated to display the objects in “landscape” mode, or in “portrait” mode, respectively.
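  • A minimal, purely illustrative sketch of such an orientation adjustment, assuming the detection step reports which edges of the device carry fingers (the edge names below are hypothetical):
```python
# Hypothetical sketch: infer the display orientation from the edges being held.
def choose_orientation(held_edges: set) -> str:
    """Return 'landscape' if a short edge is gripped, 'portrait' otherwise."""
    if {"left_short_edge", "right_short_edge"} & held_edges:
        return "landscape"
    return "portrait"
```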
  • Moreover, the triggered function can comprise deactivating at least one control button considered to be poorly placed with respect to the detected control object. For example, one or more control buttons of a function can be deactivated as a function of the position of one or more fingers on, or holding, the device.
  • More generally, the triggered function can be chosen as a function of one or more neutral surfaces, against which or opposite which the control object(s) is/are disposed.
  • According to another aspect of the invention, a human-machine interface is proposed comprising:
      • a control surface equipped with a plurality of capacitive electrodes, arranged to detect at least one control object situated at a distance that is not zero from said control surface, and
      • means for carrying out the steps of the method according to the invention.
  • The human-machine interface according to the invention can moreover comprise a guard layer of the capacitive electrodes, set at a potential (VG), called the guard potential, which is substantially the same as or exactly the same as the potential of said measurement electrodes.
  • Advantageously, the capacitive electrodes can be arranged in a matrix structure, each capacitive electrode measuring the capacitance between said capacitive electrode and the control object in “self capacitance” as described above.
  • According to yet another aspect of the invention, an electronic and/or computerized device is proposed comprising a human-machine interface according to the invention.
  • Such a user device can in particular be a device intended to be held in the hand by the user during use.
  • Such a device can be of any shape: rectangular, square, round, etc.
  • Such a device can be a tablet, a telephone, a smartphone, a games console, a PDA, etc.
  • Of course, the device according to the invention can be any device, such as a computer, comprising a control surface associated with a display screen.
  • In fact, the device according to the invention preferentially comprises a display screen. In this case, according to a preferred embodiment, the control surface can be transparent and disposed on or in this display screen, allowing the display produced by the display screen to be seen because of the transparency.
  • DESCRIPTION OF THE FIGURES AND EMBODIMENTS
  • Other advantages and features will become apparent on examination of the detailed description of examples that are in no way limitative, and of the attached drawings, in which:
  • FIG. 1 is a diagrammatic cross section of an interface according to the invention;
  • FIG. 2 is a diagrammatic representation of a first example of a device according to the invention utilizing the interface and the method according to the invention;
  • FIG. 3 is a diagrammatic representation of a second example of a device according to the invention utilizing the interface and the method according to the invention;
  • FIG. 4 is a diagrammatic representation of an example of the method according to the invention; and
  • FIGS. 5-8 are diagrammatic representations of several configurations of interaction with a user device according to the invention.
  • It is well understood that the embodiments described hereinafter are in no way limitative. Variants of the invention can in particular be envisaged comprising only a selection of the features described below in isolation from the other described features, if this selection of features is sufficient to confer a technical advantage or to differentiate the invention with respect to the state of the prior art. This selection comprises at least one preferred functional feature without structural details, or with only one part of the structural details if this part alone is sufficient to confer a technical advantage or to differentiate the invention with respect to the state of the prior art.
  • In particular, all the described variants and embodiments can be combined if there is no objection to this combination from a technical point of view.
  • In the figures, the elements common to several figures retain the same references.
  • FIG. 1 is a diagrammatic cross section of a human-machine interface according to the invention.
  • The interface 100 represented in FIG. 1 comprises a control surface 102 and an assembly of capacitive electrodes 104, known as measurement electrodes, intended to detect one or more control objects, with or without contact of the control object with the control surface 102.
  • The human-machine interface 100 comprises an electronic measurement module 106 connected to the capacitive measurement electrodes 104 to receive from each of the capacitive measurement electrodes 104 a detection signal and to deduce from it data relating to the detection carried out and composing a detected profile.
  • The detected profile provided by the measurement module can comprise data relating to:
      • the position of each control object detected,
      • the number of control objects detected,
      • the distance between at least two, in particular each control object detected, and
      • a movement of each control object detected, i.e. a displacement distance, a displacement speed and/or an acceleration of the control object.
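  • A minimal sketch of how such a detected profile could be represented in software is given below; the field names and the Python representation are assumptions made for illustration only.
```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectedObject:
    position: Tuple[float, float]    # estimated position of the control object
    displacement: float = 0.0        # displacement distance
    speed: float = 0.0               # displacement speed
    acceleration: float = 0.0        # acceleration of the control object

@dataclass
class DetectedProfile:
    objects: List[DetectedObject] = field(default_factory=list)

    @property
    def count(self) -> int:          # number of control objects detected
        return len(self.objects)

    def distance(self, i: int, j: int) -> float:
        """Distance between two detected control objects."""
        (x1, y1), (x2, y2) = self.objects[i].position, self.objects[j].position
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
```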
  • The interface 100 moreover comprises a software analysis module 108, which analyses the detected profile provided by the measurement module 106 and provides data relating to the nature (hand, finger, stylus, etc.) of each control object and, optionally, to a movement associated with each control object and measured by the measurement module 106.
  • The interface 100 comprises moreover a database 110 comprising:
      • one or more detection algorithms used by the measurement module 106 to determine a detected profile as a function of the signals provided by the capacitive measurement electrodes 104, and
      • a set of predetermined and pre-stored profiles, consulted by the analysis module 108 to determine the nature of a control object, as a function of a detected profile provided by the measurement module 106.
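  • The comparison of a detected profile with the set of pre-stored profiles could, for instance, be a simple nearest-match search, as in the hedged sketch below; the feature choice (object count and contact size) and the reference values are illustrative assumptions only.
```python
# Hypothetical sketch: determine the nature of a control object by comparing
# a detected profile with pre-stored reference profiles.
REFERENCE_PROFILES = {
    # nature: (object count, typical contact size in mm) -- illustrative only
    "stylus": (1, 2.0),
    "finger": (1, 8.0),
    "hand":   (5, 10.0),
    "palm":   (1, 50.0),
}

def classify(count: int, contact_size: float) -> str:
    """Return the nature whose reference features are closest to the detection."""
    def score(ref):
        ref_count, ref_size = ref
        return abs(ref_count - count) * 10 + abs(ref_size - contact_size)
    return min(REFERENCE_PROFILES, key=lambda nature: score(REFERENCE_PROFILES[nature]))
```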
  • FIG. 1 also represents field lines 112, 114 corresponding to measurement electrodes 104 located at the periphery of the control surface 102. The peripheral field lines 112 and 114 are attracted towards the sides of the control surface 102 and are deformed to a greater or lesser degree. The closer a measurement electrode 104 is to the periphery, i.e. to the border of the control surface 102, the more horizontal the field lines of this electrode become. This physical phenomenon, called the "edge effect", is used in the present invention to detect one or more control objects which are neither in contact with nor opposite the control surface 102, but are located at the periphery close to the control surface 102, and in particular in contact with a neutral surface of a user device utilizing the interface 100. In fact, the field lines of the peripheral electrodes are capable of detecting one or more control objects located at the periphery of the control surface 102.
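  • Purely as an illustrative sketch of how this edge effect could be exploited in software (the threshold and the matrix layout are assumptions, not the patent's algorithm): when only the outermost rows and columns of electrodes report a significant capacitance deviation, the control object can be assumed to lie at the periphery, for example against a neutral surface.
```python
import numpy as np

# Hypothetical sketch: flag a peripheral (edge-effect) detection from a matrix
# of per-electrode capacitance deviations relative to their baseline.
def peripheral_detection(deltas: np.ndarray, threshold: float = 5.0) -> bool:
    """True if only the outermost electrodes see a significant signal."""
    border = np.concatenate([deltas[0, :], deltas[-1, :], deltas[:, 0], deltas[:, -1]])
    inner = deltas[1:-1, 1:-1]
    return border.max() > threshold and (inner.size == 0 or inner.max() < threshold)
```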
  • The interface 100 also comprises a layer 116, called guard layer, set at a guard potential VG identical to or substantially identical to the potential of each of the measurement electrodes 104. In FIG. 1, the guard layer 116 is positioned under the measurement electrodes 104, so that the measurement electrodes 104 are located between the guard layer 116 and the control surface 102.
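  • As a reminder of the general guard principle (stated here only as elementary electrostatics, not as a limitation of the invention), the charge coupled between a measurement electrode and the guard layer vanishes when both are at the same potential, so that only the capacitance towards the control object contributes to the measurement:
```latex
Q_{\mathrm{guard}} \;=\; C_{\mathrm{parasitic}}\,\bigl(V_{\mathrm{electrode}} - V_G\bigr) \;\approx\; 0
\qquad \text{when } V_G \approx V_{\mathrm{electrode}} .
```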
  • FIG. 2 is a diagrammatic representation of a first example of a device according to the invention comprising the human-machine interface of FIG. 1.
  • The device 200 represented in FIG. 2 comprises a display screen 202 on which are arranged the control surface 102 and the measurement electrodes 104. Alternatively, the control surface 102 can be formed at least in part by the display screen 202. The face 204 of the device 200, comprising the display screen 202, is called the user interaction face in the following.
  • The device 200 comprises moreover, on the interaction face 204, four neutral surfaces 206-212 which do not comprise measurement electrodes, other means for detecting one or more control objects, or electro-mechanical selection means.
  • Thus, each neutral surface 206-212 is in the same plane, or is in a plane which is parallel to the plane formed by the control surface 102.
  • The neutral surfaces 206-212 are each disposed at the periphery of the control surface 102, at each of the edges of the interaction face 204. Each neutral surface 206-212 is independent of the other neutral surfaces 206-212, and is distinct from the control surface 102.
  • When one or more control objects, such as a finger or a hand, are disposed against or opposite a neutral surface 206-212, this or these control objects are detected by the measurement electrodes 104 disposed under/in the control surface 102 at the periphery of the control surface, by utilizing the edge effect created by these electrodes.
  • Alternatively, the neutral surfaces 206-212 can touch to form a single neutral surface completely surrounding the control surface.
  • Alternatively, the device can comprise no neutral surface on one or more of its edges.
  • FIG. 3 is a diagrammatic representation of a second example of a device according to the invention comprising the human-machine interface of FIG. 1.
  • The device 300 represented in FIG. 3 comprises a display screen 202 on which the control surface 102 and the measurement electrodes 104 are arranged. Alternatively, the control surface 102 can be formed at least in part by the display screen 202. The face 204 of the device 300, comprising the display screen 202, is called the user interaction face in the following.
  • The device 300 comprises moreover, on each of its edges or rims, a neutral surface 302-308 which does not comprise measurement electrodes, other means for detecting one or more control objects, or electro-mechanical selection means.
  • Thus, each neutral surface 302-308 is in a plane forming a non-zero angle, in particular an angle of 90°, with the plane formed by the control surface 102.
  • The neutral surfaces 302-308 are each disposed at the periphery of the control surface 102, at each of the edges/rims of the device 300 connecting the interaction face with a face on the back of the device 300.
  • Each neutral surface 302-308 is independent of the other neutral surfaces 302-308, and is distinct from the control surface 102.
  • When one or more control objects, such as a finger or a hand, are disposed against or opposite a neutral surface 302-308, for example when the user grasps the device on its edges/rims, this or these control objects, for example the fingers or the hand of the user, are detected by the measurement electrodes 104 disposed under/in the control surface 102 at the periphery of the control surface 102, by utilizing the edge effect created by these electrodes 104.
  • Alternatively, the neutral surfaces 302-308 can touch to form a single neutral surface completely surrounding the control surface 102.
  • Alternatively, the device 300 can comprise no neutral surface on one or more of its edges/rims.
  • The embodiment in FIG. 3 has the advantage of being able to use all of the surface of the interaction face as a display screen. As a result, this embodiment makes it possible either to have a larger display screen 202 for a given size of interaction face 204, or to have a smaller device for a given size of display screen 202.
  • Of course, it is possible to combine the embodiments represented in FIGS. 2 and 3. Thus, a user device according to the invention can be equipped with at least one neutral surface on the device interaction face, i.e. in the same plane as the control surface and/or the display screen, and at least one neutral surface, independently or not, on an edge/rim of the device.
  • FIG. 4 is a diagrammatic representation of an example of the method according to the invention being able to be utilized in the interface of FIG. 1 and/or in the device of FIG. 2 or 3.
  • The method 400 represented in FIG. 4 comprises a step 402 during which at least one control object is detected by utilizing the edge effect created by the peripheral electrodes of the control surface.
  • During step 404, a detected profile is determined as a function of the signals provided by the peripheral electrodes of the control surface. This profile takes into account all of the control objects detected, as well as any movement(s) of these control objects.
  • During step 406, the nature of the control object(s) detected is determined by comparison of the profile detected with predetermined profiles or with predetermined data.
  • During step 408, as a function of the profile detected, i.e. of the nature of the control object(s) and, if necessary, of the movement(s) of the control object(s), one or more functions are triggered by commands sent to the device, and more particularly to the device's processor and/or to the elements of the device concerned by the triggered functions.
  • After step 408, at least one of the following steps is carried out, either at the same time or one after the other: step 410 of adjusting the volume of the user device, step 412 of adjusting the brightness or the contrast of the display screen of the user device, step 414 of unlocking the display screen and/or the control surface, step 416 of changing the position or the size of an object or a control button being displayed, and step 418 of changing the orientation of the display produced on the display screen.
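  • The chain of steps 402-418 could be orchestrated as in the following sketch, given only to make the data flow concrete; the step functions, their names and the trivial handlers are hypothetical placeholders, not the patent's implementation.
```python
# Hypothetical sketch of the detection-to-trigger chain of FIG. 4.
# The step functions below are minimal placeholders.

def build_detected_profile(signals):            # step 404
    return {"count": len(signals)}

def determine_nature(profile):                  # step 406
    return "hand" if profile["count"] >= 4 else "finger"

def select_functions(nature):                   # step 408
    # e.g. a grasping hand unlocks the screen; a single finger adjusts the volume
    return ["unlock", "layout"] if nature == "hand" else ["volume"]

def run_interaction_cycle(signals, device):
    profile = build_detected_profile(signals)
    nature = determine_nature(profile)
    for command in select_functions(nature):    # dispatch to steps 410-418
        device[command]()

# Usage example with trivial handlers standing in for steps 410-418:
device = {"unlock": lambda: print("unlock screen"),
          "layout": lambda: print("reposition controls"),
          "volume": lambda: print("adjust volume")}
run_interaction_cycle([0.2, 0.3, 0.1, 0.4], device)
```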
  • FIGS. 5-8 are diagrammatic representations of several configurations of interaction with a user device according to the invention.
  • In the configurations which are to be described, the directions right, left, up and down are defined with reference to a user looking at the control surface 102 and/or the display screen 202, in landscape mode.
  • In the configuration of FIG. 5, a finger is detected on/against the right-hand edge or side of the device 200, i.e. on/against the neutral surface 210 and/or the surface 306. At the same time, several fingers are detected on the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302. This detected profile is associated with the detection of a control object corresponding to a right hand and triggers a predetermined positioning of the icons and of the control buttons displayed on the display screen 202, associated with right-handed use.
  • In the configuration of FIG. 6, a finger is detected on/against the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302. At the same time, several fingers are detected on the right-hand edge or side of the device 200, i.e. on/against the neutral surface 210 and/or the surface 306. This detected profile is associated with the detection of a control object corresponding to a left hand and triggers a predetermined positioning of the icons and of the control buttons displayed on the display screen 202, associated with left-handed use.
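  • The configurations of FIGS. 5 and 6 amount to comparing the number of fingers detected on the two lateral neutral surfaces; a minimal sketch is given below, with the finger-count thresholds assumed for illustration only.
```python
# Hypothetical sketch: infer the holding hand from finger counts per lateral edge.
def detect_hand(fingers_left_edge: int, fingers_right_edge: int) -> str:
    """One finger on one edge and several on the other suggests the holding hand."""
    if fingers_right_edge == 1 and fingers_left_edge >= 2:
        return "right"   # thumb on the right edge, fingers on the left edge (FIG. 5)
    if fingers_left_edge == 1 and fingers_right_edge >= 2:
        return "left"    # thumb on the left edge, fingers on the right edge (FIG. 6)
    return "unknown"
```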
  • In the configuration of FIG. 7, in addition to the detection carried out in the configuration of FIG. 6, the finger placed on the left-hand edge or rim of the device 200, i.e. on/against the neutral surface 206 and/or the surface 302, is detected sliding downwards. In this case, a parameter of a function is adjusted. The adjusted function can be a function chosen by default, for example the volume: the volume is reduced if the movement is downwards, as represented in FIG. 7, and increased if the movement is upwards, contrary to what is represented in FIG. 7. The adjusted function can also be pre-selected by the user, such as for example the brightness or the contrast of the display screen, or can depend on the display or the menu displayed on the screen.
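  • The sliding adjustment of FIG. 7 could, as a purely illustrative sketch, map the detected displacement along the edge onto a parameter change; the sign convention, scaling factor and volume range below are assumptions.
```python
# Hypothetical sketch: convert a finger slide along a neutral surface into a
# volume change (downward slide = negative displacement = lower volume).
def adjust_volume(current_volume: int, displacement_mm: float,
                  step_per_mm: float = 0.5) -> int:
    new_volume = current_volume + round(displacement_mm * step_per_mm)
    return max(0, min(100, new_volume))   # clamp to the assumed 0-100 range
```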
  • In the configuration of FIG. 8, a finger is detected on/against each corner of the device 200, i.e. at each end of the neutral surface 206 and/or the surface 302 on the one hand, and of the neutral surface 210 and/or the surface 306 on the other hand. This detected profile is associated with the device 200 being held in landscape mode and triggers a landscape-mode display of the icons and control buttons on the display screen 202. Alternatively, this detected profile can be associated with the taking of a picture or video. In this case, an application to take a picture or video is launched automatically. In addition, a virtual control button 802, or a set of virtual control buttons, can be displayed and associated with a neutral surface 804 located at the periphery of the display screen 202 or on the rim or the edge of the device 200. Thus, by positioning a finger on this neutral surface 804, it is possible to start or stop the taking of a picture or video.
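  • As an illustration of the FIG. 8 behaviour, the sketch below launches a camera application when a grip is detected at the four corners and binds a hypothetical shutter action to the neutral surface 804; all names and the device representation are assumptions.
```python
# Hypothetical sketch: a landscape corner grip launches the camera application
# and binds a virtual shutter control to a peripheral neutral surface.
def on_profile(corners_held: int, device):
    if corners_held == 4:                       # grip detected at each corner (FIG. 8)
        device["launch_app"]("camera")
        device["bind_neutral_surface"]("surface_804", "shutter")

device = {"launch_app": lambda name: print(f"launching {name}"),
          "bind_neutral_surface": lambda surf, action: print(f"{surf} -> {action}")}
on_profile(4, device)
```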
  • Other non-limitative and non-exhaustive examples of triggered functions and the detected profile associated with these functions will now be described:
      • Detected profile: fingers placed on three edges/rims of the device. Triggered function: unlocking the display screen and/or the control surface.
      • Detected profile: detection of the palm of the hand on an edge/rim and of at least one finger on the opposite edge/side. Triggered function: displacement of the objects (icons, control or selection buttons) towards the side opposite the palm of the hand.
      • Detected profile: detection of the presence, and optionally of a movement, of a control object along one or more of the edges/rims of the device. Triggered function: selection of a virtual control button, for example of a trigger button in a game executed by the device, or displacement of an object displayed by the device, following a predetermined direction or following the direction of movement detected.
      • Detected profile: detection of the presence, and optionally of a movement, of a control object along one or more of the edges/sides of the device. Triggered function: selection of a photo, or displacement of a photo or of a list of photos displayed by the device, following a predetermined direction, following the direction of movement detected, or also according to the detected position of the control object.
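  • These example associations could be stored, for instance, as a simple mapping from detected-profile descriptors to triggered functions, as in the hedged sketch below; the keys and values are illustrative paraphrases of the examples above, not an authoritative encoding.
```python
# Hypothetical sketch: encode the example associations above as a lookup table.
PROFILE_TO_FUNCTION = {
    "fingers_on_three_edges":             "unlock_screen_and_control_surface",
    "palm_on_edge_finger_on_opposite":    "move_objects_away_from_palm",
    "object_moving_along_edge_in_game":   "select_virtual_trigger_button",
    "object_moving_along_edge_in_photos": "scroll_or_select_photo",
}

def triggered_function(profile_descriptor: str):
    return PROFILE_TO_FUNCTION.get(profile_descriptor)   # None if no association
```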
  • Of course, the invention is not limited to the examples which have just been described and numerous adjustments can be made to these examples without exceeding the scope of the invention.

Claims (21)

1. A method for interacting with an electronic and/or computerized device, known as user device, said user device comprising:
a control surface equipped with a plurality of capacitive electrodes, arranged to detect at least one control object situated at a distance that is not zero from said control surface; and
at least one surface, known as neutral, arranged at the periphery of said control surface so that none of said capacitive electrodes of said control surface is disposed in said neutral surface;
said method including:
detection of at least one control object disposed in contact with or opposite said neutral surface of said device, by utilizing the edge effect of at least one of said capacitive electrodes of said control surface which is located close to said neutral surface; and
triggering, as a function of said detection, of at least one function in said user device.
2. The method according to claim 1, characterized in that it comprises, after the detection step and before the triggering step, a step of determining the nature of the control object disposed against or opposite the neutral surface, the triggering step being carried out as a function of said nature of said control object.
3. The method according to claim 1, characterized in that the detection step comprises, for at least one, in particular for each control object disposed in contact with or opposite the neutral surface, detection of the position of said control object on said neutral surface.
4. The method according to claim 1, characterized in that the detection step comprises the detection of the number of control objects disposed in contact with or opposite the neutral surface.
5. The method according to claim 1, characterized in that the detection step comprises, for at least one, in particular for each control object disposed in contact with or opposite the neutral surface, detection of the shape of said control object.
6. The method according to claim 1, characterized in that the detection step comprises the detection of a distance between at least two control objects disposed in contact with or opposite the neutral surface.
7. The method according to claim 1, characterized in that the detection step comprises, for at least one, in particular for each control object disposed in contact with or opposite the neutral surface, detection/measurement of a movement of said control object.
8. The method according to claim 1, characterized in that the triggered function comprises a function of adjusting the value of a parameter associated with a function carried out by said user device.
9. The method according to claim 1, characterized in that the triggered function comprises a function associated with the display produced on a display screen of said user device or of another device controlled by said user device.
10. The method according to claim 9, characterized in that the triggered function comprises illuminating the display screen and/or the control surface.
11. The method according to claim 9, characterized in that the triggered function comprises unlocking the display screen and/or the control surface.
12. The method according to claim 9, characterized in that the triggered function comprises repositioning at least one object displayed on the display screen.
13. The method according to claim 9, characterized in that the triggered function comprises adjusting the orientation of a display produced on the display screen.
14. The method according to claim 1, characterized in that the triggered function comprises launching an application to take a picture, an application to take a video or deactivating/activating the sound.
15. The method according to claim 1, characterized in that the triggered function comprises deactivating a control button.
16. The method according to claim 1, characterized in that the triggered function is chosen as a function of one or more of the neutral surfaces, against which or opposite to which the control object(s) is/are disposed.
17. A human-machine interface, comprising:
a control surface equipped with a plurality of capacitive electrodes, arranged to detect at least one control object situated at a distance that is not zero from said control surface; and
means for carrying out the steps of the method according to claim 1.
18. The interface according to claim 17, characterized in that it comprises moreover a guard layer of the capacitive electrodes, set at a potential (VG), called the guard potential, which is substantially the same as or exactly the same as the potential of said capacitive electrodes.
19. The interface according to claim 17, characterized in that the capacitive electrodes are disposed in a matrix structure, each capacitive electrode measuring the capacitance between said capacitive electrode and the control object.
20. An electronic and/or computerized device comprising a human-machine interface according to claim 17.
21. The device according to claim 20, characterized in that it comprises moreover a display screen, the control surface being transparent and disposed on said display screen.
US14/396,599 2012-04-25 2013-04-24 Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method Abandoned US20150091854A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
FR1253820A FR2990020B1 (en) 2012-04-25 2012-04-25 CAPACITIVE DETECTION DEVICE WITH ARRANGEMENT OF CONNECTION TRACKS, AND METHOD USING SUCH A DEVICE.
FR1253820 2012-04-25
FR1353417 2013-04-16
FR1353417A FR2990033B1 (en) 2012-04-25 2013-04-16 METHOD FOR INTERACTING WITH AN APPARATUS USING A CAPACITIVE CONTROL SURFACE, INTERFACE AND APPARATUS USING THE SAME
PCT/EP2013/058428 WO2013160323A1 (en) 2012-04-25 2013-04-24 Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method

Publications (1)

Publication Number Publication Date
US20150091854A1 true US20150091854A1 (en) 2015-04-02

Family

ID=48326258

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/115,008 Active US9104283B2 (en) 2012-04-25 2013-04-16 Capacitive detection device with arrangement of linking tracks, and method implementing such a device
US14/396,599 Abandoned US20150091854A1 (en) 2012-04-25 2013-04-24 Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/115,008 Active US9104283B2 (en) 2012-04-25 2013-04-16 Capacitive detection device with arrangement of linking tracks, and method implementing such a device

Country Status (7)

Country Link
US (2) US9104283B2 (en)
EP (3) EP2842018A1 (en)
JP (3) JP6463669B2 (en)
KR (2) KR102028783B1 (en)
CN (3) CN106933417A (en)
FR (2) FR2990020B1 (en)
WO (2) WO2013160151A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170024124A1 (en) * 2014-04-14 2017-01-26 Sharp Kabushiki Kaisha Input device, and method for controlling input device
US10768752B2 (en) 2015-02-27 2020-09-08 Quickstep Technologies Llc Method for interacting with an electronic and/or computer device implementing a capacitive control surface and a peripheral surface, interface and device implementing this method
US20230152912A1 (en) * 2021-11-18 2023-05-18 International Business Machines Corporation Splitting a mobile device display and mapping content with single hand

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7920129B2 (en) 2007-01-03 2011-04-05 Apple Inc. Double-sided touch-sensitive panel with shield and drive combined layer
FR2949007B1 (en) 2009-08-07 2012-06-08 Nanotec Solution DEVICE AND METHOD FOR CONTROL INTERFACE SENSITIVE TO A MOVEMENT OF A BODY OR OBJECT AND CONTROL EQUIPMENT INCORPORATING THIS DEVICE.
FR2971066B1 (en) 2011-01-31 2013-08-23 Nanotec Solution THREE-DIMENSIONAL MAN-MACHINE INTERFACE.
FR2976688B1 (en) 2011-06-16 2021-04-23 Nanotec Solution DEVICE AND METHOD FOR GENERATING AN ELECTRICAL POWER SUPPLY IN AN ELECTRONIC SYSTEM WITH A VARIABLE REFERENCE POTENTIAL.
US9259904B2 (en) 2011-10-20 2016-02-16 Apple Inc. Opaque thin film passivation
FR2985048B1 (en) * 2011-12-21 2014-08-15 Nanotec Solution PRESSURE-SENSITIVE CAPACITIVE MEASUREMENT DEVICE AND METHOD FOR TOUCH-FREE CONTACT INTERFACES
FR2985049B1 (en) 2011-12-22 2014-01-31 Nanotec Solution CAPACITIVE MEASURING DEVICE WITH SWITCHED ELECTRODES FOR TOUCHLESS CONTACTLESS INTERFACES
FR2988176B1 (en) 2012-03-13 2014-11-21 Nanotec Solution CAPACITIVE MEASUREMENT METHOD BETWEEN AN OBJECT AND AN ELECTRODE PLAN BY PARTIAL SYNCHRONOUS DEMODULATION
FR2988175B1 (en) 2012-03-13 2014-04-11 Nanotec Solution METHOD FOR CAPACITIVE MEASUREMENT BY NON-REGULAR ELECTRODES, AND APPARATUS IMPLEMENTING SAID METHOD
US9201547B2 (en) 2012-04-30 2015-12-01 Apple Inc. Wide dynamic range capacitive sensing
FR3002052B1 (en) 2013-02-14 2016-12-09 Fogale Nanotech METHOD AND DEVICE FOR NAVIGATING A DISPLAY SCREEN AND APPARATUS COMPRISING SUCH A NAVIGATION
FR3003964B1 (en) 2013-04-02 2016-08-26 Fogale Nanotech DEVICE FOR INTERACTING, WITHOUT CONTACT, AN ELECTRONIC AND / OR COMPUTER APPARATUS, AND APPARATUS PROVIDED WITH SUCH A DEVICE
FR3004551A1 (en) 2013-04-15 2014-10-17 Fogale Nanotech MULTIZONE CAPACITIVE DETECTION METHOD, DEVICE AND APPARATUS USING THE METHOD
FR3005763B1 (en) * 2013-05-17 2016-10-14 Fogale Nanotech DEVICE AND METHOD FOR A CAPACITIVE CONTROL INTERFACE ADAPTED TO THE IMPLEMENTATION OF ELECTRODES OF HIGHLY RESISTIVE MEASUREMENTS
FR3013472B1 (en) 2013-11-19 2016-07-08 Fogale Nanotech COVERING ACCESSORY DEVICE FOR AN ELECTRONIC AND / OR COMPUTER PORTABLE APPARATUS, AND APPARATUS EQUIPPED WITH SUCH AN ACCESSORY DEVICE
GB2521835A (en) 2014-01-02 2015-07-08 Nokia Technologies Oy Electromagnetic shielding
FR3017723B1 (en) 2014-02-19 2017-07-21 Fogale Nanotech METHOD OF MAN-MACHINE INTERACTION BY COMBINING TOUCH-FREE AND CONTACTLESS CONTROLS
FR3019320B1 (en) 2014-03-28 2017-12-15 Fogale Nanotech BRACKET WATCH-TYPE ELECTRONIC DEVICE WITH CONTACTLESS CONTROL INTERFACE AND METHOD FOR CONTROLLING SUCH A DEVICE
WO2016028081A2 (en) * 2014-08-19 2016-02-25 크루셜텍(주) Apparatus for detecting touch
KR101621277B1 (en) 2014-08-19 2016-05-16 크루셜텍 (주) Touch detecting apparatus
FR3025623B1 (en) 2014-09-05 2017-12-15 Fogale Nanotech CONTROL INTERFACE DEVICE AND FINGERPRINT SENSOR
FR3028061B1 (en) * 2014-10-29 2016-12-30 Fogale Nanotech CAPACITIVE SENSOR DEVICE COMPRISING ADJUSTED ELECTRODES
FR3032287B1 (en) 2015-02-04 2018-03-09 Quickstep Technologies Llc MULTILAYER CAPACITIVE DETECTION DEVICE, AND APPARATUS COMPRISING THE DEVICE
CN105573545A (en) * 2015-11-27 2016-05-11 努比亚技术有限公司 Gesture correction method, apparatus and gesture input processing method
FR3051266B1 (en) * 2016-05-12 2019-07-05 Fogale Nanotech CAPACITIVE INTERFACE DEVICE WITH MIXED ELECTRODE STRUCTURE, AND APPARATUS COMPRISING THE DEVICE
CN114527893B (en) 2016-09-23 2023-11-10 苹果公司 Touch sensor panel with top shield and/or bottom shield
US10372282B2 (en) 2016-12-01 2019-08-06 Apple Inc. Capacitive coupling reduction in touch sensor panels
FR3070022B1 (en) * 2017-08-10 2020-11-06 Fogale Nanotech CAPACITIVE DRESSING ELEMENT FOR ROBOT, ROBOT EQUIPPED WITH SUCH A DRESSING ELEMENT
US10521049B2 (en) * 2017-09-29 2019-12-31 Apple Inc. Multi-via structures for touchscreens
FR3072176B1 (en) * 2017-10-10 2022-03-04 Fogale Nanotech IMPEDANCE MEASUREMENT DEVICE
CN108062181B (en) 2018-01-02 2021-08-17 京东方科技集团股份有限公司 Substrate, manufacturing method thereof and electronic equipment
JP7071234B2 (en) * 2018-06-29 2022-05-18 キヤノン株式会社 Electronics
US11789561B2 (en) 2021-09-24 2023-10-17 Apple Inc. Architecture for differential drive and sense touch technology

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020158838A1 (en) * 2001-04-30 2002-10-31 International Business Machines Corporation Edge touchpad input device
US20070150842A1 (en) * 2005-12-23 2007-06-28 Imran Chaudhri Unlocking a device by performing gestures on an unlock image
US7796124B2 (en) * 2005-10-27 2010-09-14 Alps Electric Co., Ltd. Input device and electronic apparatus
US20110020517A1 (en) * 2008-04-04 2011-01-27 Mark Rubin Taste modifying product
US20110205172A1 (en) * 2010-02-23 2011-08-25 Panasonic Corporation Touch screen device
US20120075238A1 (en) * 2010-09-28 2012-03-29 Sony Corporation Display device with touch detection function and electronic unit
US20120098766A1 (en) * 2010-09-24 2012-04-26 Research In Motion Limited Portable Electronic Device and Method of Controlling Same
US20120274603A1 (en) * 2011-04-27 2012-11-01 Cheol-Se Kim In-cell type touch panel
US20120280917A1 (en) * 2011-05-03 2012-11-08 Toksvig Michael John Mckenzie Adjusting Mobile Device State Based on User Intentions and/or Identity
US20120299868A1 (en) * 2011-05-25 2012-11-29 Broadcom Corporation High Noise Immunity and High Spatial Resolution Mutual Capacitive Touch Panel
US20130154955A1 (en) * 2011-12-19 2013-06-20 David Brent GUARD Multi-Surface Touch Sensor Device With Mode of Operation Selection
US20130206567A1 (en) * 2012-02-14 2013-08-15 Samsung Display Co., Ltd. Touch panel
US20130300668A1 (en) * 2012-01-17 2013-11-14 Microsoft Corporation Grip-Based Device Adaptations
US20140078086A1 (en) * 2012-09-20 2014-03-20 Marvell World Trade Ltd. Augmented touch control for hand-held devices
US8799827B2 (en) * 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US9007325B2 (en) * 2002-05-16 2015-04-14 Sony Corporation Input method and input apparatus
US9274682B2 (en) * 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9727235B2 (en) * 2013-12-12 2017-08-08 Lenovo (Singapore) Pte. Ltd. Switching an interface mode using an input gesture

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7911456B2 (en) * 1992-06-08 2011-03-22 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
FR2756048B1 (en) * 1996-11-15 1999-02-12 Nanotec Ingenierie FLOATING CAPACITIVE MEASUREMENT BRIDGE AND ASSOCIATED MULTI-CAPACITIVE MEASUREMENT SYSTEM
US7663607B2 (en) * 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
JP2001127866A (en) * 1999-10-27 2001-05-11 Toshiba Corp Communication terminal
FR2844349B1 (en) 2002-09-06 2005-06-24 Nanotec Solution CAPACITIVE SENSOR PROXIMITY DETECTOR
JP2004302734A (en) * 2003-03-31 2004-10-28 Mitsubishi Electric Corp Information terminal and program for making computer execute its operation changeover
JP2006140700A (en) * 2004-11-11 2006-06-01 Canon Inc Digital camera
FR2893711B1 (en) 2005-11-24 2008-01-25 Nanotec Solution Soc Civ Ile DEVICE AND METHOD FOR CAPACITIVE MEASUREMENT BY FLOATING BRIDGE
JP2009055238A (en) * 2007-08-24 2009-03-12 Ntt Docomo Inc Portable terminal and starting method
US8633915B2 (en) * 2007-10-04 2014-01-21 Apple Inc. Single-layer touch-sensitive display
EP2212762A4 (en) * 2007-11-19 2011-06-29 Cirque Corp Touchpad combined with a display and having proximity and touch sensing capabilities
CN101470555A (en) * 2007-12-25 2009-07-01 义隆电子股份有限公司 Touch panel with function of preventing fault detection
TWI401596B (en) * 2007-12-26 2013-07-11 Elan Microelectronics Corp Method for calibrating coordinates of touch screen
TWM348999U (en) * 2008-02-18 2009-01-11 Tpk Touch Solutions Inc Capacitive touch panel
TWI376623B (en) * 2008-11-19 2012-11-11 Au Optronics Corp Touch panel and touch display panel
TW201025108A (en) * 2008-12-31 2010-07-01 Acrosense Technology Co Ltd Capacitive touch panel
CN101477430B (en) * 2009-01-16 2012-11-07 汕头超声显示器(二厂)有限公司 Condenser type touch screen
US20100201647A1 (en) * 2009-02-11 2010-08-12 Tpo Displays Corp. Capacitive touch sensor
JP5337061B2 (en) * 2009-02-20 2013-11-06 セイコーインスツル株式会社 Touch panel and display device including the same
JP2010262557A (en) * 2009-05-11 2010-11-18 Sony Corp Information processing apparatus and method
CN102334088B (en) * 2009-07-29 2015-04-15 阿尔卑斯电气株式会社 Manipulation device
JP5633565B2 (en) * 2009-08-12 2014-12-03 ソルーション デポ (シェンチェン) リミテッド Active touch system
JP2011044933A (en) * 2009-08-21 2011-03-03 Nec Saitama Ltd Cellular phone and restriction release method
US8441460B2 (en) * 2009-11-24 2013-05-14 Mediatek Inc. Apparatus and method for providing side touch panel as part of man-machine interface (MMI)
JP5476952B2 (en) * 2009-12-03 2014-04-23 日本電気株式会社 Mobile device
CN101719038B (en) * 2009-12-30 2011-06-15 友达光电股份有限公司 Touch-control display panel and touch-control base plate
KR101073215B1 (en) * 2010-03-05 2011-10-12 삼성모바일디스플레이주식회사 flat panel display integrated touch screen panel
CN102193226A (en) * 2010-03-17 2011-09-21 谊达光电科技股份有限公司 Panel with proximity sensing function
JP2011198009A (en) * 2010-03-19 2011-10-06 Sony Corp Electro-optical device with input function
US9990062B2 (en) * 2010-03-26 2018-06-05 Nokia Technologies Oy Apparatus and method for proximity based input
CN102375633A (en) * 2010-08-25 2012-03-14 毅齐科技股份有限公司 Multi-touch structure of surface capacitive touch panel and method thereof
JP2012065107A (en) * 2010-09-15 2012-03-29 Kyocera Corp Portable terminal apparatus
TW201214237A (en) * 2010-09-16 2012-04-01 Asustek Comp Inc Touch display device and control method thereof
JPWO2012049942A1 (en) * 2010-10-13 2014-02-24 Necカシオモバイルコミュニケーションズ株式会社 Mobile terminal device and display method of touch panel in mobile terminal device
US9652089B2 (en) * 2010-11-09 2017-05-16 Tpk Touch Solutions Inc. Touch panel stackup
JP2012194810A (en) * 2011-03-16 2012-10-11 Kyocera Corp Portable electronic apparatus, method for controlling contact operation, and contact operation control program
US8829926B2 (en) * 2012-11-19 2014-09-09 Zrro Technologies (2009) Ltd. Transparent proximity sensor

Patent Citations (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7088343B2 (en) * 2001-04-30 2006-08-08 Lenovo (Singapore) Pte., Ltd. Edge touchpad input device
US20020158838A1 (en) * 2001-04-30 2002-10-31 International Business Machines Corporation Edge touchpad input device
US9007325B2 (en) * 2002-05-16 2015-04-14 Sony Corporation Input method and input apparatus
US7796124B2 (en) * 2005-10-27 2010-09-14 Alps Electric Co., Ltd. Input device and electronic apparatus
US20130185677A1 (en) * 2005-12-23 2013-07-18 Apple Inc. Unlocking a Device by Performing Gestures on an Unlock Image
US20120293438A1 (en) * 2005-12-23 2012-11-22 Imran Chaudhri Unlocking a Device by Performing Gestures on an Unlock Image
US8640057B2 (en) * 2005-12-23 2014-01-28 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8627237B2 (en) * 2005-12-23 2014-01-07 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8046721B2 (en) * 2005-12-23 2011-10-25 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20110296356A1 (en) * 2005-12-23 2011-12-01 Imran Chaudhri Unlocking a Device by Performing Gestures on an Unlock Image
US20120023458A1 (en) * 2005-12-23 2012-01-26 Imran Chaudhri Unlocking a Device by Performing Gestures on an Unlock Image
US20130190056A1 (en) * 2005-12-23 2013-07-25 Apple Inc. Unlocking a Device by Performing Gestures on an Unlock Image
US20130185680A1 (en) * 2005-12-23 2013-07-18 Apple Inc. Unlocking a Device by Performing Gestures on an Unlock Image
US8209637B2 (en) * 2005-12-23 2012-06-26 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8286103B2 (en) * 2005-12-23 2012-10-09 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8745544B2 (en) * 2005-12-23 2014-06-03 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8527903B2 (en) * 2005-12-23 2013-09-03 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8694923B2 (en) * 2005-12-23 2014-04-08 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20070150842A1 (en) * 2005-12-23 2007-06-28 Imran Chaudhri Unlocking a device by performing gestures on an unlock image
US20150082252A1 (en) * 2005-12-23 2015-03-19 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20090241072A1 (en) * 2005-12-23 2009-09-24 Imran Chaudhri Unlocking a Device by Performing Gestures on an Unlock Image
US20130185678A1 (en) * 2005-12-23 2013-07-18 Apple Inc. Unlocking a Device by Performing Gestures on an Unlock Image
US7657849B2 (en) * 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20110020517A1 (en) * 2008-04-04 2011-01-27 Mark Rubin Taste modifying product
US8799827B2 (en) * 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US9274682B2 (en) * 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US20110205172A1 (en) * 2010-02-23 2011-08-25 Panasonic Corporation Touch screen device
US20120098766A1 (en) * 2010-09-24 2012-04-26 Research In Motion Limited Portable Electronic Device and Method of Controlling Same
US9383918B2 (en) * 2010-09-24 2016-07-05 Blackberry Limited Portable electronic device and method of controlling same
US9507479B2 (en) * 2010-09-28 2016-11-29 Japan Display Inc. Display device with touch detection function and electronic unit
US20150199057A1 (en) * 2010-09-28 2015-07-16 Japan Display Inc. Display device with touch detection function and electronic unit
US20120075238A1 (en) * 2010-09-28 2012-03-29 Sony Corporation Display device with touch detection function and electronic unit
US8780078B2 (en) * 2011-04-27 2014-07-15 Lg Display Co., Ltd. In-cell type touch panel
US20120274603A1 (en) * 2011-04-27 2012-11-01 Cheol-Se Kim In-cell type touch panel
US20120280917A1 (en) * 2011-05-03 2012-11-08 Toksvig Michael John Mckenzie Adjusting Mobile Device State Based on User Intentions and/or Identity
US20120299868A1 (en) * 2011-05-25 2012-11-29 Broadcom Corporation High Noise Immunity and High Spatial Resolution Mutual Capacitive Touch Panel
US20130154955A1 (en) * 2011-12-19 2013-06-20 David Brent GUARD Multi-Surface Touch Sensor Device With Mode of Operation Selection
US20130300668A1 (en) * 2012-01-17 2013-11-14 Microsoft Corporation Grip-Based Device Adaptations
US20130206567A1 (en) * 2012-02-14 2013-08-15 Samsung Display Co., Ltd. Touch panel
US8809717B2 (en) * 2012-02-14 2014-08-19 Samsung Display Co., Ltd. Touch panel
US20140078086A1 (en) * 2012-09-20 2014-03-20 Marvell World Trade Ltd. Augmented touch control for hand-held devices
US9727235B2 (en) * 2013-12-12 2017-08-08 Lenovo (Singapore) Pte. Ltd. Switching an interface mode using an input gesture

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170024124A1 (en) * 2014-04-14 2017-01-26 Sharp Kabushiki Kaisha Input device, and method for controlling input device
US10768752B2 (en) 2015-02-27 2020-09-08 Quickstep Technologies Llc Method for interacting with an electronic and/or computer device implementing a capacitive control surface and a peripheral surface, interface and device implementing this method
US20230152912A1 (en) * 2021-11-18 2023-05-18 International Business Machines Corporation Splitting a mobile device display and mapping content with single hand
US11861084B2 (en) * 2021-11-18 2024-01-02 International Business Machines Corporation Splitting a mobile device display and mapping content with single hand

Also Published As

Publication number Publication date
JP2015518215A (en) 2015-06-25
FR2990033A1 (en) 2013-11-01
JP2015519644A (en) 2015-07-09
WO2013160151A1 (en) 2013-10-31
KR20150010755A (en) 2015-01-28
KR20150010718A (en) 2015-01-28
KR101875995B1 (en) 2018-07-06
CN104335150B (en) 2019-02-22
FR2990033B1 (en) 2014-05-16
EP2842019A1 (en) 2015-03-04
CN104321726B (en) 2017-04-12
EP3079047A1 (en) 2016-10-12
WO2013160323A1 (en) 2013-10-31
FR2990020A1 (en) 2013-11-01
FR2990020B1 (en) 2014-05-16
EP2842019B1 (en) 2016-05-25
CN106933417A (en) 2017-07-07
EP3079047B1 (en) 2017-09-27
KR102028783B1 (en) 2019-10-04
US20150035792A1 (en) 2015-02-05
CN104321726A (en) 2015-01-28
US9104283B2 (en) 2015-08-11
CN104335150A (en) 2015-02-04
JP6463669B2 (en) 2019-02-06
EP2842018A1 (en) 2015-03-04
JP2018063732A (en) 2018-04-19

Similar Documents

Publication Publication Date Title
US20150091854A1 (en) Method for interacting with an apparatus implementing a capacitive control surface, interface and apparatus implementing this method
JP6074408B2 (en) Touch sensitive screen
JP5832784B2 (en) Touch panel system and electronic device using the same
US20130038564A1 (en) Touch Sensitive Device Having Dynamic User Interface
US20120068948A1 (en) Character Input Device and Portable Telephone
KR20130075770A (en) Touch and hover switching
KR20160144968A (en) Projected capacitive touch with force detection
US20130079139A1 (en) Overlays for touch sensitive screens to simulate buttons or other visually or tactually discernible areas
US10466833B2 (en) Touch control device comprising pressure-sensing layer and flat touch sensing layer
US10061445B2 (en) Touch input device
WO2017012294A1 (en) Touch control module, touch screen panel, touch positioning method thereof and display device
US20140282279A1 (en) Input interaction on a touch sensor combining touch and hover actions
US20110134071A1 (en) Display apparatus and touch sensing method
JP6236640B2 (en) Detection method using capacitance type sensor and electronic device
US10452194B2 (en) Display device and driving method for display device using the same
CN111435275A (en) Touch sensor with peripheral electrodes
EP2751657B1 (en) An apparatus, method and computer pprogram using a proximity detector
US20140210739A1 (en) Operation receiver
US20140085227A1 (en) Display and method in an electric device
KR101933049B1 (en) Underwater control method of camera
EP2681646B1 (en) Electronic apparatus, display method, and program
US9134843B2 (en) System and method for distinguishing input objects
KR101752315B1 (en) Underwater control method of camera
KR20170069022A (en) Display Device
US9645674B2 (en) Self-capacitive touch sensing devices, touch point positioning method, and display devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: FOGALE NANOTECH, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROZIERE, DIDIER;BLONDIN, CHRISTOPHE;SIGNING DATES FROM 20130521 TO 20130530;REEL/FRAME:034021/0364

AS Assignment

Owner name: QUICKSTEP TECHNOLOGIES LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FOGALE NANOTECH S.A.;REEL/FRAME:037552/0041

Effective date: 20151221

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION