CN103809865A - Touch action identification method for touch screen - Google Patents
- Publication number
- CN103809865A (application number CN201210449877.XA)
- Authority
- CN
- China
- Prior art keywords
- touchable item
- touch
- touchable
- area
- contact position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses a touch action identification method for a touch screen. The method comprises the following steps: receiving a touch position on the touch screen; calculating the area of the touch position, finding all touchable items that contact the touch position, and calculating the area of each such touchable item; for each touchable item, calculating the ratio of the area of its intersection with the touch position to the item's own area, and finding the touchable item corresponding to the maximum ratio value; and recording that touchable item as the selected touchable item.
Description
Technical field
The present invention relates to the field of touch screen technology, and in particular to a touch action identification method for a touch screen.
Background art
Smart touch phones generally receive information input through a touch screen. For example, a user can operate a soft keyboard through the touch screen to enter information. When browsing web pages, the user likewise operates by touching the corresponding touchable items (such as menu items).
Current methods for determining a touch action generally calculate a point coordinate of the contact position (for example, the coordinate of its center point); when this center-point coordinate falls on a touchable item, that item is triggered.
However, because the shape of the region where a finger contacts the touch screen is irregular, the calculated center-point coordinate is imprecise. Moreover, if the user's finger shifts during a touch so that the center point of the contact position does not fall on any touchable item, the intended touchable item cannot be triggered, making the device inconvenient to use.
Summary of the invention
In view of the above problems, it is necessary to provide a touch action identification method for a touch screen that improves the accuracy of information input.
A touch action identification method for a touch screen comprises the following steps:
receiving a touch position on the touch screen;
calculating the area of the contact position, finding all touchable items that contact the contact position, and calculating the area of each such touchable item;
for each touchable item, calculating the ratio of the intersection area between the item and the contact position to the item's own area, and finding the touchable item corresponding to the maximum ratio value;
recording that touchable item as the selected touchable item.
The touch action identification method described above selects the corresponding touchable item by comparing the contact area between the contact position and each touchable item. Thus, even when the center point of the contact position does not fall on any touchable item, the method can still select the touchable item the user intends, improving the accuracy of touch action identification.
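The area-ratio selection rule summarized above can be sketched in code. This is a minimal illustration, not taken from the patent: it assumes axis-aligned rectangles for both the contact region and the touchable items, and the `Rect` type, item names, and coordinates are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def area(self) -> float:
        return self.w * self.h

    def intersect_area(self, other: "Rect") -> float:
        # Overlap of two axis-aligned rectangles; zero if they do not touch.
        ix = max(0.0, min(self.x + self.w, other.x + other.w) - max(self.x, other.x))
        iy = max(0.0, min(self.y + self.h, other.y + other.h) - max(self.y, other.y))
        return ix * iy

def select_item(touch, items):
    """Return the name of the item whose own area is covered most by the
    touch region (the maximum Px = Ix / Ax), or None if nothing overlaps."""
    best, best_px = None, 0.0
    for name, rect in items.items():
        ix = touch.intersect_area(rect)
        if ix == 0.0:
            continue  # this item does not contact the touch position
        px = ix / rect.area()
        if px > best_px:
            best, best_px = name, px
    return best

# Touch overlaps "A" by 1x4 = 4 (Px = 0.04) and "B" by 2x4 = 8 (Px = 0.5).
items = {"A": Rect(0, 0, 10, 10), "B": Rect(12, 0, 4, 4)}
print(select_item(Rect(9, 0, 5, 4), items))  # -> B
```

Note the effect the method claims: the small item "B" wins even though most of the touch area lies on "A", because dividing by each item's own area favors small targets that are well covered.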
Brief description of the drawings
Figs. 1 to 3 are flowcharts of the touch action identification method for a touch screen according to a preferred embodiment of the present invention.
Detailed description of the embodiments
Referring to Figs. 1 to 3, the touch action identification method for a touch screen according to a preferred embodiment of the present invention comprises the following steps:
Step S1: receive a contact position on the touch screen.
Step S2: find the contact points between the touch screen and the contact position, calculate the center-point coordinate of the contact position, and find the touchable item contacted by this center point. Here, the touch screen is divided by a number of evenly spaced horizontal axes and a number of evenly spaced vertical axes; among all intersections of these axes, those covered by the touch position are the contact points. A touchable item is an element that activates a corresponding function or event when correctly touched. For example, when a key of a virtual keyboard is triggered, the corresponding key value is entered, so that key is a touchable item. As another example, when a menu option on one of the phone's menu pages is touched and selected, the corresponding setting is made or the next menu-option list is entered, so that menu option is a touchable item.
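The center-point computation of step S2 can be sketched as follows. The patent does not fix the exact formula, so this sketch assumes the center is the arithmetic mean of the grid intersections covered by the touch region; the coordinates are invented for the example.

```python
def center_point(contact_points):
    """Arithmetic mean of the grid intersections (contact points)
    covered by the touch region."""
    n = len(contact_points)
    cx = sum(x for x, _ in contact_points) / n
    cy = sum(y for _, y in contact_points) / n
    return cx, cy

# Four covered grid intersections forming a small square.
print(center_point([(2, 3), (3, 3), (2, 4), (3, 4)]))  # -> (2.5, 3.5)
```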
Step S3: judge whether a touchable item contacting this center point has been found. If so, perform step S4; otherwise, perform step S5.
Step S4: send a touch event to the touchable item contacting this center point, to activate the function corresponding to that touchable item.
Step S5: judge whether the number of contact points is greater than a preset contact-point maximum. If so, the touch screen may have been touched by a palm, a face, or the like, and the method returns to step S4; otherwise, perform step S6.
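The large-object test of step S5 reduces to a count comparison. A minimal sketch, assuming a contact point is simply an (x, y) pair; the threshold value is invented, since the patent only calls it a preset maximum:

```python
MAX_CONTACT_POINTS = 40  # preset contact-point maximum; value is an assumption

def is_large_object_touch(contact_points):
    """True when so many grid intersections are covered that the touch is
    more likely a palm or a face than a fingertip."""
    return len(contact_points) > MAX_CONTACT_POINTS

fingertip = [(x, y) for x in range(3) for y in range(3)]   # 9 points
palm = [(x, y) for x in range(10) for y in range(10)]      # 100 points
print(is_large_object_touch(fingertip), is_large_object_touch(palm))  # -> False True
```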
Step S6: judge whether the contact position lies within the current picture frame layer. If so, perform step S7; otherwise, perform step S18. Here, the whole screen of the touch screen may be covered by more than one picture frame layer; for example, it may comprise a first picture frame layer and a second picture frame layer that lies below the first and is partly covered by it. The first picture frame layer is the current picture frame layer. The first picture frame layer may be highlighted, and the display brightness of the part of the second picture frame layer not covered by the first may be lower than that of the first, to distinguish the two.
Step S7: calculate the area Tx of the contact position, find all touchable items contacting the contact position, and calculate the area Ax of each such touchable item.
Step S8: for each touchable item, calculate the ratio Px of the intersection area Ix between the item's area Ax and the contact-position area Tx to the item's own area Ax, i.e., Px = Ix/Ax, and find the touchable item corresponding to the maximum Px value.
Step S9: judge whether this maximum ratio Px is greater than a first preset minimum ratio value. If so, perform step S10; otherwise, perform step S15.
Step S10: record this touchable item as the selected touchable item. If two or more touchable items have the same maximum Px value, the one with the largest intersection area Ix is selected. If those touchable items with the maximum Px value also have the same intersection area Ix, any one of them is selected as the selected touchable item.
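The ranking of steps S8 to S10 can be sketched as a two-level tie-break: first by the coverage ratio Px = Ix/Ax, then by the intersection area Ix. The candidate tuples (name, Ix, Ax) are invented inputs for illustration.

```python
def pick_selected(candidates):
    """candidates: (name, Ix, Ax) tuples for every item touching the contact
    position. max() with a (Px, Ix) key applies the two-level tie-break
    of step S10."""
    return max(candidates, key=lambda c: (c[1] / c[2], c[1]))[0]

# All three keys have Px = 0.3; "key_Q" wins on the larger intersection Ix.
cands = [("key_Q", 30.0, 100.0), ("key_W", 24.0, 80.0), ("key_E", 12.0, 40.0)]
print(pick_selected(cands))  # -> key_Q
```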
Step S11: judge whether the contact position has been released, i.e., whether the user's finger has stopped contacting the touch screen. If so, perform step S12; otherwise, perform step S13.
Step S12: send a touch event to the selected touchable item to activate its corresponding function; the flow ends.
Step S13: judge whether the contact position moves. If so, the user's finger is sliding on the touch screen; perform step S14. Otherwise, return to step S11.
Step S14: judge whether the contact position, at the end point of its motion track, still has an intersection area with the selected touchable item. If so, perform step S12; otherwise, perform step S18.
Step S15: find the touchable item with the largest intersection area Ix.
Step S16: judge whether the ratio Ix/Tx of this largest intersection area Ix to the area Tx of the contact position is greater than a second preset minimum ratio value. If so, perform step S17; otherwise, there is no suitable touchable item, and step S18 is performed.
Step S17: select the touchable item corresponding to this largest intersection area as the selected touchable item, then return to step S11. If two or more touchable items have the same largest intersection area with the contact position, any one of them is selected as the selected touchable item.
Step S18: send a touch event to the display area into which the coordinate of the center point falls; the flow ends. Here, a display area is one of the regions that make up a picture frame layer, and each picture frame layer may be divided into multiple display areas. Each display area may in turn contain one or more touchable items, as well as non-touchable elements, such as pictures or text, that trigger no function when touched. In this embodiment, sending a touch event to the display area containing the center-point coordinate means that, when no touchable item in that display area is selected, a touch event is sent to the display area itself to trigger it, for example to highlight the whole display area.
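The threshold logic of steps S9 and S15 to S17 can be combined into one sketch: accept the best-Px candidate only above the first minimum ratio; otherwise fall back to the largest intersection Ix, accepted only if Ix/Tx exceeds the second minimum ratio. The threshold values below are invented, since the patent leaves both as preset parameters.

```python
MIN_PX = 0.5        # first preset minimum ratio value (step S9); assumed
MIN_IX_RATIO = 0.3  # second preset minimum ratio value (step S16); assumed

def resolve(candidates, touch_area):
    """candidates: (name, Ix, Ax) tuples; touch_area: Tx.
    Returns the selected item's name, or None for the step-S18 fallback."""
    if not candidates:
        return None
    by_px = max(candidates, key=lambda c: c[1] / c[2])
    if by_px[1] / by_px[2] > MIN_PX:
        return by_px[0]                       # step S10: accept the best Px
    by_ix = max(candidates, key=lambda c: c[1])
    if by_ix[1] / touch_area > MIN_IX_RATIO:
        return by_ix[0]                       # step S17: accept the largest Ix
    return None                               # step S18: fall back to display area

# menu_B has Px = 6/9, above MIN_PX, so it is selected directly.
print(resolve([("menu_A", 8.0, 100.0), ("menu_B", 6.0, 9.0)], 20.0))  # -> menu_B
```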
The touch action identification method described above selects the corresponding touchable item by comparing the contact area between the contact position and each touchable item. Thus, even when the center point of the contact position does not fall on any touchable item, the method can still select the touchable item the user intends, improving the accuracy of touch action identification.
Claims (6)
1. A touch action identification method for a touch screen, the method comprising the steps of:
(a) receiving a touch position on the touch screen;
(b) calculating the area of the contact position, finding all touchable items contacting the contact position, and calculating the area of each such touchable item;
(c) for each touchable item, calculating the ratio of the intersection area between the item and the contact position to the item's own area, and finding the touchable item corresponding to the maximum ratio value;
(d) recording that touchable item as the selected touchable item.
2. The touch action identification method of claim 1, wherein, after step (c) and before step (d), the method further comprises:
comparing the maximum ratio value with a first preset minimum ratio value;
if the maximum ratio value is less than the first preset minimum ratio value, finding the touchable item with the largest intersection area; and
selecting the touchable item corresponding to this largest intersection area as the selected touchable item.
3. The touch action identification method of claim 1, wherein, in step (d), if two or more touchable items have the same maximum ratio value, the touchable item with the largest intersection area with the contact position is selected as the selected touchable item.
4. The touch action identification method of claim 1, further comprising, after step (d):
judging whether the contact position moves; and
if the contact position moves and, at the end point of its motion track, still has an intersection area with the selected touchable item, activating the function corresponding to that touchable item.
5. The touch action identification method of claim 1, wherein, after step (a) and before step (b), the method further comprises the steps of calculating the center-point coordinate of the contact position and judging whether there is a touchable item contacting this center point.
6. The touch action identification method of claim 5, wherein, if there is no touchable item contacting the center point, the contact points where the contact position contacts the touch screen are found, and it is judged whether the number of contact points is greater than a preset contact-point maximum.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210449877.XA CN103809865A (en) | 2012-11-12 | 2012-11-12 | Touch action identification method for touch screen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210449877.XA CN103809865A (en) | 2012-11-12 | 2012-11-12 | Touch action identification method for touch screen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103809865A true CN103809865A (en) | 2014-05-21 |
Family
ID=50706726
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210449877.XA Pending CN103809865A (en) | 2012-11-12 | 2012-11-12 | Touch action identification method for touch screen |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103809865A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040183833A1 (en) * | 2003-03-19 | 2004-09-23 | Chua Yong Tong | Keyboard error reduction method and apparatus |
US20100066694A1 (en) * | 2008-09-10 | 2010-03-18 | Opera Software Asa | Method and apparatus for providing finger touch layers in a user agent |
CN102117143A (en) * | 2009-12-31 | 2011-07-06 | 深圳迈瑞生物医疗电子股份有限公司 | Method and device for responding a touch screen |
CN102346648A (en) * | 2011-09-23 | 2012-02-08 | 惠州Tcl移动通信有限公司 | Method and system for realizing priorities of input characters of squared up based on touch screen |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106575170A (en) * | 2014-07-07 | 2017-04-19 | 三星电子株式会社 | Method of performing a touch action in a touch sensitive device |
CN108089776A (en) * | 2018-01-09 | 2018-05-29 | 厦门盈趣科技股份有限公司 | A kind of accurate positioning method based on capacitance touch point |
CN108089776B (en) * | 2018-01-09 | 2021-01-22 | 厦门盈趣科技股份有限公司 | Accurate positioning method based on capacitive touch points |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102240088B1 (en) | Application switching method, device and graphical user interface | |
EP2825944B1 (en) | Touch screen hover input handling | |
CN104272240B (en) | System and method for changing dummy keyboard on a user interface | |
US9024892B2 (en) | Mobile device and gesture determination method | |
US20100127995A1 (en) | System and method for differentiating between intended and unintended user input on a touchpad | |
CN103914196B (en) | Electronic equipment and the method for determining the validity that the touch key-press of electronic equipment inputs | |
US20060250372A1 (en) | Touchpad with smart automatic scroll function and control method therefor | |
CN103092505A (en) | Information processing device, information processing method, and computer program | |
US20070002027A1 (en) | Smart control method for cursor movement using a touchpad | |
CN103019588A (en) | Touch positioning method, device and terminal | |
CN104238794B (en) | The response method and terminal, mobile terminal of a kind of pair of contact action | |
CN105867821A (en) | Icon arranging method and device and terminal | |
CN102117165A (en) | Touch input processing method and mobile terminal | |
CN105824531A (en) | Method and device for adjusting numbers | |
US20110248946A1 (en) | Multi-mode prosthetic device to facilitate multi-state touch screen detection | |
CN104850328A (en) | Method and device for selecting object of smartwatch | |
CN106020698A (en) | Mobile terminal and realization method of single-hand mode | |
US20120179963A1 (en) | Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display | |
CN106371745B (en) | A kind of interface switching method and mobile terminal | |
CN104704454A (en) | Terminal and method for processing multi-point input | |
CN107390931A (en) | Response control mehtod, device, storage medium and the mobile terminal of touch operation | |
CN101482799A (en) | Method for controlling electronic equipment through touching type screen and electronic equipment thereof | |
JPWO2009031213A1 (en) | Portable terminal device and display control method | |
CN104898880A (en) | Control method and electronic equipment | |
CN103246464A (en) | Electronic device and display processing method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | | Application publication date: 20140521 |