CN102306069A - Camera shooting type touch control method and device - Google Patents


Info

Publication number
CN102306069A
CN102306069A, CN201110257221A
Authority
CN
China
Prior art keywords
touch
touch control
coordinate
camera device
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201110257221A
Other languages
Chinese (zh)
Other versions
CN102306069B (en)
Inventor
曾昭兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changshu Intellectual Property Operation Center Co ltd
Guangdong Gaohang Intellectual Property Operation Co ltd
Original Assignee
Vtron Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Technologies Ltd
Priority to CN201110257221.3A
Publication of CN102306069A
Application granted
Publication of CN102306069B
Legal status: Active

Abstract

The invention provides a camera-based touch control method and device. The method comprises the following steps: first computing the first preliminary coordinates of a touch object; then computing, from the first preliminary coordinates and the coordinates of a third camera device, a theoretical value of the touch positioning information that the third camera device would capture for those coordinates; and then comparing the theoretical value of the touch positioning information with the actual value obtained by shooting, thereby screening the first preliminary coordinates, so that the coordinates of the touch object are determined directly from the coordinates that pass the screening. The method and device have the following beneficial effects: the preliminary coordinates do not need to be computed twice; the amount of computation for the coordinates of the touch object and the positioning delay are reduced; and the speed of camera-based touch positioning is increased.

Description

Camera-based touch control method and device
Technical field
The present invention relates to the field of touch control technology, and in particular to a camera-based touch control method and a camera-based touch control device.
Background technology
Touch control technology, as a novel computer input technology, makes human-machine interaction more intuitive. Because of the great convenience it brings to users, it has spread, beyond portable personal information products, to fields such as information appliances, public information systems, electronic games and office automation equipment.
Common touch control technologies include capacitive touch, resistive touch, infrared touch and camera-based touch. Among them, camera-based touch technology has the advantages of simple equipment and easy installation, and is becoming an increasingly important part of touch control technology.
The principle of a camera-based touch technology in the prior art is shown in Fig. 1: at least three camera devices are arranged at different positions on the edge of a display device; a coordinate system is established, and the coordinates of each camera device and the touch positioning information that each camera device captures for the touch objects are obtained; when there are multiple touch objects, first preliminary coordinates of the touch objects are calculated from the touch positioning information captured by the first camera device and the second camera device, and second preliminary coordinates are calculated from the touch positioning information captured by the first camera device and the third camera device; the first preliminary coordinates and the second preliminary coordinates are each screened against the effective range of the touch control area; and the coordinate points that appear in both the screened first preliminary coordinates and the screened second preliminary coordinates are taken as the actual coordinates of the touch objects, so that the touch objects are located.
However, although this camera-based touch technology can locate each touch object accurately, it calculates the preliminary coordinates of the touch points twice, so the amount of computation is large; this may introduce a delay in locating the touch objects and make the user experience uncomfortable.
Summary of the invention
The object of the present invention is to provide a camera-based touch control method that reduces the amount of computation, increases positioning speed, and reduces the positioning delay caused by excessive computation.
A camera-based touch control method comprises the following steps:
obtaining the images of the touch control area respectively captured by a first camera device, a second camera device and a third camera device arranged at the edge of the touch control area of a display device;
calculating first preliminary coordinates of a touch object according to the touch positioning information captured by the first camera device and the second camera device; wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
calculating, according to the first preliminary coordinates and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
comparing the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device; if they are the same, determining the corresponding first preliminary coordinates as the coordinates of the touch object.
Compared with the prior art, the camera-based touch control method of the present invention first calculates the first preliminary coordinates of the touch object, then calculates, from the first preliminary coordinates and the coordinates of the third camera device, the theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates, and then compares this theoretical value with the actual value obtained by shooting, thereby screening the first preliminary coordinates; the coordinates of the touch object can be determined directly from the coordinates that pass the screening. The preliminary coordinates do not need to be calculated twice, so the amount of computation for the coordinates of the touch object and the positioning delay are reduced, and the speed of camera-based touch positioning is increased.
Another object of the present invention is to provide a camera-based touch control device that reduces the amount of computation, increases positioning speed, and reduces the positioning delay caused by excessive computation.
A camera-based touch control device comprises:
an image acquisition module, configured to obtain the images of the touch control area respectively captured by a first camera device, a second camera device and a third camera device arranged at the edge of the touch control area of a display device;
a preliminary positioning module, configured to calculate first preliminary coordinates of a touch object according to the touch positioning information captured by the first camera device and the second camera device; wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
a computing module, configured to calculate, according to the first preliminary coordinates and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
a comparator, configured to compare the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device;
a positioning module, configured to determine the corresponding first preliminary coordinates as the coordinates of the touch object when the comparison result of the comparator is that the values are the same.
Compared with the prior art, in the camera-based touch control device of the present invention the preliminary positioning module first calculates the first preliminary coordinates of the touch object; the computing module then calculates, from the first preliminary coordinates and the coordinates of the third camera device, the theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates; the comparator compares this theoretical value with the actual value obtained by shooting, thereby screening the first preliminary coordinates; and the positioning module can determine the coordinates of the touch object directly from the coordinates that pass the screening. The preliminary coordinates do not need to be calculated twice, so the amount of computation for the coordinates of the touch object and the positioning delay are reduced, and the speed of camera-based touch positioning is increased.
Description of drawings
Fig. 1 is a schematic diagram of the principle of a camera-based touch technology in the prior art;
Fig. 2 is a flow chart of the camera-based touch control method of the present invention;
Fig. 3 is a schematic diagram of three camera devices arranged at the side of the display device in the present invention;
Fig. 4 is a schematic flow chart of a preferred embodiment of the camera-based touch control method of the present invention;
Fig. 5 is a schematic diagram of boundary calibration of the touch control area of the display device of Fig. 3 in the present invention;
Fig. 6 is a partial schematic flow chart of another preferred embodiment of the camera-based touch control method of the present invention;
Fig. 7 is a schematic diagram of the division of the subregion images of each camera device in the present invention;
Fig. 8 is a schematic diagram of the division of the shooting subregions of each camera device in the present invention;
Fig. 9 is a schematic diagram of the subregion images captured by each camera device when a touch object appears in the touch control area of the display device in the present invention;
Fig. 10 is a schematic structural diagram of the camera-based touch control device of the present invention;
Fig. 11 is a schematic structural diagram of a preferred embodiment of the camera-based touch control device of the present invention;
Fig. 12 is a schematic structural diagram of another preferred embodiment of the camera-based touch control device of the present invention.
Embodiment
Referring to Fig. 2, Fig. 2 is a flow chart of the camera-based touch control method of the present invention.
The camera-based touch control method comprises the following steps:
S201: obtaining the images of the touch control area respectively captured by the first camera device, the second camera device and the third camera device arranged at the edge of the touch control area of the display device;
The touch control area is the area in which the user's touch control operations are captured and located, and is preferably the area directly in front of the display device. Touch operations within the touch control area are detected and processed as touch control input.
To ensure that each camera device can capture an image of the complete touch control area of the display device, the shooting angle of view of each camera device must cover the entire touch control area. The shooting axis of each camera device is preferably parallel to the plane of the display device, so that the touch control area is photographed from the side; when a touch object appears in the touch control area, the image of the touch object appearing in the touch control area is likewise captured from the side.
The three camera devices can be arranged at any positions along the edge of the touch control area as required by those skilled in the art. Those skilled in the art only need to obtain the position coordinates of the three camera devices and their positions relative to the display device in order to calculate the positional relationship between the image captured by each camera device and the touch control area of the display device, and thus to locate, by computation, a touch object appearing in the touch control area.
As a preferred embodiment, the three camera devices are respectively arranged at the two ends of one side of the display device and at the midpoint of that side; the shooting angles of the two camera devices arranged at the two ends of the side are 90 degrees, and the shooting angle of the camera device arranged at the midpoint of the side is 180 degrees. In this way the entire touch control area can be photographed with only three camera devices, and the situation in which a touch object lies on the line between two camera devices is avoided.
For ease of description of the detection and positioning calculation of touch objects, two-dimensional touch positioning is taken as an example below:
Referring to Fig. 3, a coordinate system is established in the plane of the display device 11, taking an arbitrary point in that plane as the origin. The touch control area of the display device 11 is an area of the same size as the front viewing area of the display device 11. Taking the first camera device A as the origin, the coordinates of the first camera device A are (0, 0); if the length of one side 111 of the display device 11 is L, the coordinates of the second camera device B are (L, 0) and the coordinates of the third camera device C are (L/2, 0); the coordinates of a touch object O are the unknown values (x, y). The images of the touch control area of the display device 11 captured by the first camera device A, the second camera device B and the third camera device C are obtained respectively.
S202: calculating the first preliminary coordinates of the touch object according to the touch positioning information captured by the first camera device and the second camera device; wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
The position of the touch object's image within the image of the touch control area can be represented by the distance between the touch object's image and the edge of the image of the touch control area; alternatively, the shooting angle at which each camera device captures the touch object can be calculated from this relative position information, and the position of the touch object's image within the image of the touch control area captured by each camera device can be represented by the shooting angle information.
From the touch positioning information of two camera devices, the intersection of the two straight lines from the two camera devices to the touch object can be determined; this intersection gives the preliminary coordinates of the touch object.
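The following is a minimal sketch of step S202 under the coordinate system of Fig. 3, assuming that the touch positioning information of cameras A (0, 0) and B (L, 0) has already been converted into shooting angles measured from the display edge towards the touch area; the names SIDE_LENGTH, angle_a and angle_b and the numeric values are illustrative only, not part of the patent.

```python
import math

SIDE_LENGTH = 1.0  # length L of side 111 of the display device, hypothetical value

def first_preliminary_coordinates(angle_a: float, angle_b: float, side: float = SIDE_LENGTH):
    """Intersect the ray from camera A with the ray from camera B.

    Ray from A (0, 0):    y = x * tan(angle_a)
    Ray from B (side, 0): y = (side - x) * tan(angle_b)
    """
    tan_a, tan_b = math.tan(angle_a), math.tan(angle_b)
    if abs(tan_a + tan_b) < 1e-12:   # rays are parallel; no valid intersection
        return None
    x = side * tan_b / (tan_a + tan_b)
    y = x * tan_a
    return (x, y)

# Example: both cameras see the touch object at 45 degrees, so it lies at (L/2, L/2).
print(first_preliminary_coordinates(math.radians(45), math.radians(45)))
```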
S203: calculating, according to the first preliminary coordinates and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
From the first preliminary coordinates and the coordinates of the third camera device, the straight line through the first preliminary coordinates and the third camera device can be determined; from this straight line the shooting angle information of the third camera device for the first preliminary coordinates can be calculated, or, further, the theoretical position of the image of the first preliminary coordinates within the image of the touch control area captured by the third camera device can be calculated.
S204: comparing the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device; if they are the same, the corresponding first preliminary coordinates are determined to be the coordinates of the touch object.
If the comparison result differs, the first preliminary coordinates are inconsistent with the touch positioning information actually captured by the third camera device, and the first preliminary coordinates are invalid coordinates.
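A sketch of steps S203 and S204 under the same coordinate system, where the third camera device C sits at (L/2, 0): the theoretical shooting angle towards a first preliminary coordinate is compared with the angle actually measured by camera C. ANGLE_TOLERANCE is a hypothetical allowance for the measurement and computation errors discussed at the end of the description; it is not a value given in the patent.

```python
import math

SIDE_LENGTH = 1.0
ANGLE_TOLERANCE = math.radians(0.5)  # hypothetical error allowance

def theoretical_angle_from_c(point, side: float = SIDE_LENGTH) -> float:
    """Angle of the line from camera C (side/2, 0) to `point`, measured from the display edge."""
    x, y = point
    return math.atan2(y, x - side / 2)

def confirm_touch_coordinate(candidate, measured_angle_c: float) -> bool:
    """Step S204: keep the candidate only if theory and measurement agree within tolerance."""
    return abs(theoretical_angle_from_c(candidate) - measured_angle_c) <= ANGLE_TOLERANCE

candidate = (0.25, 0.40)  # a first preliminary coordinate from step S202
print(confirm_touch_coordinate(candidate, theoretical_angle_from_c(candidate)))  # True
```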
Compared with the prior art, the camera-based touch control method of the present invention first calculates the first preliminary coordinates of the touch object, then calculates, from the first preliminary coordinates and the coordinates of the third camera device, the theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates, and then compares this theoretical value with the actual value obtained by shooting, thereby screening the first preliminary coordinates; the coordinates of the touch object can be determined directly from the coordinates that pass the screening. The preliminary coordinates do not need to be calculated twice, so the amount of computation for the coordinates of the touch object and the positioning delay are reduced, and the speed of camera-based touch positioning is increased.
Referring also to Fig. 4, Fig. 4 is a schematic flow chart of a preferred embodiment of the camera-based touch control method of the present invention.
In this embodiment, the camera-based touch control method comprises the following steps:
S401: obtaining the images of the touch control area respectively captured by the first camera device, the second camera device and the third camera device arranged at the edge of the touch control area of the display device;
S402: starting the boundary determination mode;
The boundary determination mode is a mode for locating the effective range of the touch control area; in the boundary determination mode, a touch object appearing in the touch control area is treated by the processor as a boundary calibration object by default.
As a preferred embodiment, when the boundary determination mode is started, a prompt indicating that the boundary determination mode has started is issued, reminding the user to touch the boundary of the touch control area so that boundary calibration can be performed.
S403: in the boundary determination mode, obtaining the boundary calibration information of the touch control area captured by each camera device; wherein the boundary calibration information comprises the position of the image of the boundary calibration object within the image of the touch control area captured by each camera device;
S404: calculating the coordinates of the boundary calibration object according to the boundary calibration information captured by each camera device;
S405: determining the effective range of the touch control area according to the coordinates of the boundary calibration object;
From step S403 to step S405, when the boundary calibration information is obtained in the boundary determination mode, different boundary calibration modes can be adopted according to the user's settings.
One boundary calibration mode is as follows:
in the boundary determination mode, the positions, within the image of the touch control area, of the images of a plurality of boundary calibration objects that appear in succession are obtained as the boundary calibration information;
when determining the effective range of the touch control area, the coordinates of the plurality of boundary calibration objects are calculated respectively from the boundary calibration information, and the polygonal region whose vertices are the coordinate points of the plurality of boundary calibration objects is taken as the effective range of the touch control area.
In this mode the effective range of the touch control area is determined entirely by the user's touch clicks in the boundary determination mode, so the user can set it flexibly according to his or her own needs, which is very convenient.
Another boundary calibration mode is as follows:
in the boundary determination mode, the positions, within the image of the touch control area, of the images of several boundary calibration objects that appear in succession are obtained as the boundary calibration information;
when determining the effective range of the touch control area, the coordinates of the several boundary calibration objects are calculated respectively from the boundary calibration information, and the polygonal region whose vertices are the coordinate points of the several boundary calibration objects together with predefined coordinate points is taken as the effective range of the touch control area.
In this mode the effective range of the touch control area is determined from the user's touch clicks in the boundary determination mode together with preset coordinate points, so the user only needs to click one or a few points to determine the effective range of the touch control area easily; this is very convenient and also speeds up boundary calibration.
For example, as shown in Fig. 5, to calibrate the boundary of the touch control area of the display device 11 of Fig. 3, the user only needs to click the point H on the edge of the display device 11 that lies on the diagonal from the first camera device A. The position of the image of the point H within the image of the touch control area is obtained as the boundary calibration information; from the boundary calibration information captured by each camera device, the coordinates of the point H are calculated to be (L, h). Then, from the coordinates (L, h) of the point H and the coordinates (0, 0) of the first camera device A, the effective range of the touch control area is determined to be (0 ≤ x ≤ L, 0 ≤ y ≤ h).
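A small sketch of the boundary calibration example of Fig. 5 together with the screening step S409 described below: the single calibration point H = (L, h) clicked by the user, together with camera A at the origin, spans the effective range 0 ≤ x ≤ L, 0 ≤ y ≤ h. The variable names and the value h = 0.6 are illustrative assumptions.

```python
SIDE_LENGTH = 1.0                          # L
CALIBRATION_POINT_H = (SIDE_LENGTH, 0.6)   # hypothetical point H = (L, h)

def effective_range(h_point, origin=(0.0, 0.0)):
    """Rectangle spanned by camera A (origin) and the calibration point H."""
    (x0, y0), (x1, y1) = origin, h_point
    return (min(x0, x1), max(x0, x1), min(y0, y1), max(y0, y1))

def screen_preliminary_coordinates(candidates, rng):
    """Step S409: keep only candidates inside the effective range of the touch control area."""
    x_min, x_max, y_min, y_max = rng
    return [(x, y) for (x, y) in candidates if x_min <= x <= x_max and y_min <= y <= y_max]

rng = effective_range(CALIBRATION_POINT_H)
print(screen_preliminary_coordinates([(0.3, 0.2), (0.3, 0.9)], rng))  # only (0.3, 0.2) survives
```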
S406: switching to the touch control mode;
After the effective range of the touch control area has been determined in the boundary determination mode, the mode is switched to the touch control mode automatically or according to an input instruction from the user.
The touch control mode is a mode in which touch control operations within the effective range of the touch control area are detected and located. In the touch control mode, a touch object appearing in the touch control area is treated by the processor as an ordinary touch object by default.
As a preferred embodiment, when switching to the touch control mode, a prompt indicating the switch to the touch control mode is issued, reminding the user that boundary determination has finished and that touch control operations can now be performed.
S407: in the touch control mode, obtaining the touch positioning information of the touch control area captured by each camera device; wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
S408: calculating the first preliminary coordinates of the touch object according to the touch positioning information captured by the first camera device and the second camera device;
S409: screening the first preliminary coordinates of the touch object according to the effective range of the touch control area;
S410: calculating, according to the first preliminary coordinates remaining after the screening and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
S411: comparing the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device; if they are the same, the corresponding first preliminary coordinates are determined to be the coordinates of the touch object.
In the camera-based touch control method of this embodiment, the image of a boundary calibration object in the touch control area is first obtained in the boundary determination mode, the boundary calibration information of the touch control area captured by each camera device is obtained, and the effective range of the touch control area is then calculated from the boundary calibration information. Next, in the touch control mode, the image of a touch object is obtained and the first preliminary coordinates of the touch object are calculated; the first preliminary coordinates of the touch object are screened against the effective range of the touch control area, so that first preliminary coordinates lying outside the effective range of the touch control area are screened out, and the coordinates of the touch object are calculated and determined from the first preliminary coordinates that pass the screening. The amount of computation for determining the coordinates of the touch object and the positioning delay are thereby reduced, and the speed of camera-based touch positioning is increased.
Referring to Fig. 6, Fig. 6 is a partial schematic flow chart of another preferred embodiment of the camera-based touch control method of the present invention.
In this embodiment, after the touch positioning information captured by each camera device is obtained in step S407, and before the first preliminary coordinates of the touch object are calculated in step S408, the following steps are further performed:
S412: dividing the image of the touch control area captured by each camera device into at least two subregion images along the direction parallel to the display device, and, according to the way the subregion images are divided and the effective coordinate range of the touch control area, dividing the touch control area into at least two shooting subregions with respect to each camera device;
S413: when a touch-object image appears in at least three of the subregion images, judging whether the combination of these subregion images is valid according to whether the shooting subregions corresponding to the subregion images have a common overlapping region;
If the combination of the subregion images is judged valid in step S413, the touch object has been captured by every camera device and its coordinates lie within the touch control area; it is an effective touch object, and step S414 is performed. If the combination of the subregion images is judged invalid, the touch object has not been captured by every camera device, or its coordinates lie outside the touch control area; it is an invalid touch object, and step S415 is performed.
S414: classifying the touch object as an effective touch object;
S415: classifying the touch object as an invalid touch object.
Then, when the coordinates of touch objects are calculated in step S408, invalid touch objects can simply be ignored and coordinate positioning calculations are performed only for effective touch objects, which saves the computing time that would be spent on the coordinates of invalid touch objects and speeds up positioning.
When the subregion images are divided in step S412, the number of subregion images and the size of each subregion image can be set according to the user's own needs. The general principle is: the more subregion images there are, the stronger the ability to pre-screen and exclude touch objects, but the amount of computation for the pre-screening also grows with the number of subregion images; therefore the number of subregion images should be chosen moderately, so as to guarantee sufficient pre-screening ability without adding too much extra computation. The subregion images are preferably of equal size, which reduces the amount of computation.
As a preferred embodiment, in this step the images of the touch control area captured by the two camera devices arranged at the two ends of one side of the display device are each divided into three subregion images of equal length along the direction parallel to the display device, and the image of the touch control area captured by the camera device arranged at the midpoint of that side is divided into six subregion images of equal length along the direction parallel to the display device. Experiments show that this way of dividing the subregion images achieves efficient pre-screening of touch objects while maintaining a high computing speed, and is a preferable division.
As another preferred embodiment of the present invention, in this step the image of the touch control area is divided into three subregion images of equal length along the direction parallel to the display device; correspondingly, with respect to each camera device, the touch control area is divided into three shooting subregions of equal apex angle with the camera device as the vertex.
As shown in Fig. 7, the image of the touch control area captured by the first camera device A is divided into three subregion images A1, A2, A3 of equal length; the touch control area of the display device is correspondingly divided, with respect to the first camera device A, into three shooting subregions a1, a2, a3 of equal apex angle;
the image of the touch control area captured by the second camera device B is divided into three subregion images B1, B2, B3 of equal length; the touch control area of the display device is correspondingly divided, with respect to the second camera device B, into three shooting subregions b1, b2, b3 of equal apex angle;
the image of the touch control area captured by the third camera device C is divided into three subregion images C1, C2, C3 of equal length; the touch control area of the display device is correspondingly divided, with respect to the third camera device C, into three shooting subregions c1, c2, c3 of equal apex angle.
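A sketch of the division of step S412 for the three-way split of Fig. 7: the image captured by a camera device is split into three subregion images of equal length along the direction parallel to the display device, and a detected touch-object image is assigned to one of them by its horizontal pixel position. IMAGE_WIDTH and the sample pixel positions are illustrative assumptions.

```python
IMAGE_WIDTH = 1280  # hypothetical width, in pixels, of the captured image

def subregion_index(x_pixel: int, width: int = IMAGE_WIDTH, parts: int = 3) -> int:
    """Return 0, 1 or 2 for the equal-length subregion images A1/A2/A3 (likewise B1..B3, C1..C3)."""
    return min(parts - 1, x_pixel * parts // width)

print(subregion_index(200), subregion_index(700), subregion_index(1200))  # 0 1 2
```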
When a touch object appears in the touch control area, it is inevitably captured by all three camera devices at the same time, so each touch-object image inevitably appears in one subregion image of each of the three camera devices; that is, each touch-object image inevitably appears in some combination (Ax, Bx, Cx) of subregion images. However, not every touch object that appears in a combination (Ax, Bx, Cx) of subregion images is an effective touch object, because the subregion images are captured from the side, parallel to the plane of the display device, so the captured range may exceed the touch control area of the display device and invalid touch objects may be photographed.
The shooting subregions, on the other hand, which correspond to the division of the subregion images, lie within the effective range of the touch control area; that is to say, any point within the effective range of the touch control area appears in some combination (ax, bx, cx) of shooting subregions. For example, in Fig. 8, region 1 is formed by the overlap of the three shooting subregions a1, b1, c1; region 2 by a2, b1, c1; region 3 by a3, b1, c1; region 4 by a3, b2, c1; region 5 by a3, b2, c2; region 6 by a2, b1, c2; region 7 by a1, b1, c2; region 8 by a1, b2, c2; region 9 by a2, b2, c2; region 10 by a2, b3, c2; region 11 by a2, b3, c3; region 12 by a1, b3, c3; region 13 by a1, b2, c3; and region 14 by a1, b1, c3.
It can be seen that each region is formed by the overlap of corresponding shooting subregions of different camera devices.
In step S413, when a touch object is present in the touch control area, each camera device captures a touch-object image, and the touch-object image appears in one of the subregion images captured by each camera device. Fig. 9 shows the subregion images captured by each camera device when a touch object appears in region 13 (a1, b2, c3) of the touch control area. At this point, the combination of the subregion images is judged according to the correspondence between the subregion images and the shooting subregions.
The judgment rule is: if the shooting subregions corresponding to the subregion images in which the touch-object image appears all have a common overlapping region, the combination of those subregion images is valid; otherwise it is invalid.
Two methods of judging whether the combination of the subregion images is valid are provided here:
One judgment method is to calculate, when a touch object appears, the coordinate ranges of the corresponding shooting subregions on the spot, from the way the subregion images in which the touch-object image appears are divided and from the effective coordinate range of the touch control area, and then to judge from those coordinate ranges whether the shooting subregions have a common overlapping region. If there is a common overlapping region, the combination of the corresponding subregion images is judged valid; if not, it is judged invalid. In this method, whether the combination of the subregion images is valid is calculated only when a touch object appears, and since the shooting subregions corresponding to the touch object are already determined at that point, only a small amount of computation is needed.
The other judgment method is to calculate the coordinate ranges of all the shooting subregions in advance, from the way the subregion images are divided and from the effective coordinate range of the touch control area, and, for each combination formed by taking one shooting subregion from each of the at least three camera devices, to judge whether the shooting subregions have a common overlapping region. If there is a common overlapping region, the combination of the corresponding subregion images is classified as a valid combination; if not, it is classified as an invalid combination.
Then, when a touch-object image appears in the subregion images captured by the camera devices, it is judged directly whether those subregion images constitute a valid combination, so that whether the combination of the subregion images is valid can be judged quickly.
In this method the valid combinations of subregion images are pre-computed before any touch object appears; when a touch object is present, it is judged directly whether the subregion images in which the touch-object image appears correspond to a valid combination, without performing coordinate calculations again, which greatly reduces the computing time in the positioning process.
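A sketch of the second judgment method (pre-computation and look-up). The set of valid combinations below simply transcribes the fourteen overlapping regions listed for Fig. 8; in practice it would be derived beforehand from the coordinate ranges of the shooting subregions, as the description explains.

```python
# Valid (ax, bx, cx) combinations corresponding to regions 1-14 of Fig. 8.
VALID_COMBINATIONS = {
    ("a1", "b1", "c1"), ("a2", "b1", "c1"), ("a3", "b1", "c1"), ("a3", "b2", "c1"),
    ("a3", "b2", "c2"), ("a2", "b1", "c2"), ("a1", "b1", "c2"), ("a1", "b2", "c2"),
    ("a2", "b2", "c2"), ("a2", "b3", "c2"), ("a2", "b3", "c3"), ("a1", "b3", "c3"),
    ("a1", "b2", "c3"), ("a1", "b1", "c3"),
}

def is_effective_touch(sub_a: str, sub_b: str, sub_c: str) -> bool:
    """Step S413: the touch object is classified as effective only when the
    subregion images in which it appears correspond to shooting subregions
    that share a common overlapping region."""
    return (sub_a, sub_b, sub_c) in VALID_COMBINATIONS

print(is_effective_touch("a1", "b2", "c3"))  # True  (region 13 of Fig. 8 and Fig. 9)
print(is_effective_touch("a3", "b3", "c1"))  # False (no common overlapping region)
```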
Therefore, when the coordinates of touch objects are calculated in step S408, invalid touch objects can simply be ignored and coordinate positioning calculations are performed only for effective touch objects, which saves the computing time that would be spent on the coordinates of invalid touch objects and speeds up positioning.
In the camera-based touch control method of this preferred embodiment, the image of the touch control area captured by each camera device is divided into a plurality of subregion images, and the touch control area is correspondingly divided into a plurality of shooting subregions with respect to each camera device. When a touch-object image is captured, it is determined in which subregion image the touch-object image appears, and thus in which shooting subregion of each camera device the touch object appears. Whether the combination of the subregion images is valid is judged according to whether the shooting subregions in which the touch object appears have a common overlapping region. If the combination is valid, the touch object may well have been captured by all the camera devices with its coordinates within the touch control area, and it is classified as an effective touch object. If a touch object is captured in the corresponding subregion images but the combination of the subregion images is invalid, the touch object has not been captured by every camera device, or its coordinates lie outside the touch control area, and it is an invalid touch object.
Therefore, with this camera-based touch control method, a preliminary judgment of whether a touch object is effective can be made from the subregion images in which the touch object appears, without calculating the coordinates of the touch object, and some invalid touch objects can be excluded; when the coordinates of each touch object are then calculated, the coordinates of the touch objects classified as invalid need not be calculated, which significantly reduces the amount of computation for touch positioning and the positioning delay, and increases the speed of camera-based touch positioning.
Referring to Fig. 10, Fig. 10 is a schematic structural diagram of the camera-based touch control device of the present invention.
The camera-based touch control device comprises:
an image acquisition module 11, configured to obtain the images of the touch control area respectively captured by the first camera device, the second camera device and the third camera device arranged at the edge of the touch control area of the display device;
a preliminary positioning module 12, configured to calculate the first preliminary coordinates of a touch object according to the touch positioning information captured by the first camera device and the second camera device; wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
a computing module 13, configured to calculate, according to the first preliminary coordinates and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
a comparator 14, configured to compare the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device;
a positioning module 15, configured to determine the corresponding first preliminary coordinates as the coordinates of the touch object when the comparison result of the comparator is that the values are the same.
Here, the position of the touch object's image within the image of the touch control area can be represented by the distance between the touch object's image and the edge of the image of the touch control area; alternatively, the shooting angle at which each camera device captures the touch object can be calculated from this relative position information, and the position of the touch object's image within the image of the touch control area captured by each camera device can be represented by the shooting angle information.
From the touch positioning information of two camera devices, the preliminary positioning module 12 can determine the intersection of the two straight lines from the two camera devices to the touch object; this intersection gives the preliminary coordinates of the touch object.
From the first preliminary coordinates and the coordinates of the third camera device, the computing module 13 can determine the straight line through the first preliminary coordinates and the third camera device, and from this straight line calculate the shooting angle information of the third camera device, or, further, the theoretical position of the image of the first preliminary coordinates within the image of the touch control area captured by the third camera device.
The comparator 14 compares the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device; if they are the same, the positioning module 15 determines the corresponding first preliminary coordinates to be the coordinates of the touch object.
If the comparison result of the comparator 14 differs, the first preliminary coordinates are inconsistent with the touch positioning information actually captured by the third camera device, and the first preliminary coordinates are invalid coordinates.
Compared with the prior art, in the camera-based touch control device of the present invention the preliminary positioning module first calculates the first preliminary coordinates of the touch object; the computing module then calculates, from the first preliminary coordinates and the coordinates of the third camera device, the theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates; the comparator compares this theoretical value with the actual value obtained by shooting, thereby screening the first preliminary coordinates; and the positioning module can determine the coordinates of the touch object directly from the coordinates that pass the screening. The preliminary coordinates do not need to be calculated twice, so the amount of computation for the coordinates of the touch object and the positioning delay are reduced, and the speed of camera-based touch positioning is increased.
Referring to Fig. 11, Fig. 11 is a schematic structural diagram of a preferred embodiment of the camera-based touch control device of the present invention.
In this embodiment, the camera-based touch control device comprises:
an image acquisition module 21, configured to obtain the images of the touch control area respectively captured by the first camera device, the second camera device and the third camera device arranged at the edge of the touch control area of the display device;
a mode management module 22, configured to start the boundary determination mode or the touch control mode, or to switch from the boundary determination mode to the touch control mode;
a boundary calibration module 23, configured to obtain, in the boundary determination mode, the boundary calibration information of the touch control area captured by each camera device, to calculate the coordinates of the boundary calibration object according to the boundary calibration information captured by each camera device, and to determine the effective range of the touch control area according to the coordinates of the boundary calibration object;
wherein the boundary calibration information comprises the position of the image of the boundary calibration object within the image of the touch control area captured by each camera device;
a preliminary positioning module 24, configured to calculate the first preliminary coordinates of a touch object according to the touch positioning information captured by the first camera device and the second camera device;
wherein the touch positioning information comprises the position of the touch object's image within the image of the touch control area captured by each camera device;
a screening module 25, configured to screen the first preliminary coordinates of the touch object according to the effective range of the touch control area determined by the boundary calibration module;
a computing module 26, configured to calculate, according to the first preliminary coordinates remaining after the screening by the screening module 25 and the coordinates of the third camera device, a theoretical value of the touch positioning information that the third camera device would capture for the first preliminary coordinates;
a comparator 27, configured to compare the theoretical value of the touch positioning information with the actual value of the touch positioning information captured by the third camera device;
a positioning module 28, configured to determine the corresponding first preliminary coordinates as the coordinates of the touch object when the comparison result of the comparator is that the values are the same.
When the camera-based touch control device operates normally, the mode management module 22 first starts the boundary determination mode.
The boundary determination mode is a mode for locating the effective range of the touch control area; in the boundary determination mode, a touch object appearing in the touch control area is treated by the processor as a boundary calibration object by default.
As a preferred embodiment, when starting the boundary determination mode, the mode management module 22 issues a prompt indicating that the boundary determination mode has started, reminding the user to touch the boundary of the touch control area so that boundary calibration can be performed.
As a preferred embodiment, in the boundary determination mode the boundary calibration module 23 obtains, as the boundary calibration information, the positions, within the image of the touch control area, of the images of a plurality of boundary calibration objects that appear in succession; it calculates the coordinates of the plurality of boundary calibration objects respectively from the boundary calibration information, and takes the polygonal region whose vertices are the coordinate points of the plurality of boundary calibration objects as the effective range of the touch control area.
In this mode the effective range of the touch control area is determined entirely by the user's touch clicks in the boundary determination mode, so the user can set it flexibly according to his or her own needs, which is very convenient.
As another preferred embodiment, in the boundary determination mode the boundary calibration module 23 obtains, as the boundary calibration information, the positions, within the image of the touch control area, of the images of several boundary calibration objects that appear in succession; it calculates the coordinates of the several boundary calibration objects respectively from the boundary calibration information, and takes the polygonal region whose vertices are the coordinate points of the several boundary calibration objects together with predefined coordinate points as the effective range of the touch control area.
In this mode the effective range of the touch control area is determined from the user's touch clicks in the boundary determination mode together with preset coordinate points, so the user only needs to click one or a few points to determine the effective range of the touch control area easily; this is very convenient and also speeds up boundary calibration.
After the effective range of the touch control area has been determined in the boundary determination mode, the mode management module 22 switches to the touch control mode automatically or according to an input instruction from the user.
The touch control mode is a mode in which touch control operations within the effective range of the touch control area are detected and located. In the touch control mode, a touch object appearing in the touch control area is treated by the processor as an ordinary touch object by default.
As a preferred embodiment, when switching to the touch control mode, the mode management module 22 issues a prompt indicating the switch to the touch control mode, reminding the user that boundary determination has finished and that touch control operations can now be performed.
In the camera-based touch control device of this embodiment, the mode management module first starts the boundary determination mode; in the boundary determination mode the boundary calibration module obtains the image of a boundary calibration object in the touch control area, obtains the boundary calibration information of the touch control area captured by each camera device, and then calculates the effective range of the touch control area from the boundary calibration information. The mode management module then switches to the touch control mode; in the touch control mode the preliminary positioning module obtains the image of a touch object and calculates the first preliminary coordinates of the touch object; the screening module screens the first preliminary coordinates of the touch object against the effective range of the touch control area, so that first preliminary coordinates lying outside the effective range of the touch control area are screened out; and the positioning module determines the coordinates of the touch object from the first preliminary coordinates remaining after the screening. The amount of computation for determining the coordinates of the touch object and the positioning delay are thereby reduced, and the speed of camera-based touch positioning is increased.
Referring to Fig. 12, Fig. 12 is a schematic structural diagram of another preferred embodiment of the camera-based touch control device of the present invention.
As a preferred embodiment of the camera-based touch control device of the present invention, the camera-based touch control device further comprises:
a region dividing module 31, configured to divide the image of the touch control area captured by each camera device into at least two subregion images along the direction parallel to the display device, and, according to the way the subregion images are divided and the effective coordinate range of the touch control area, to divide the touch control area into at least two shooting subregions with respect to each camera device;
a judging module 32, configured to judge, when a touch-object image appears in at least three of the subregion images, whether the combination of these subregion images is valid according to whether the shooting subregions corresponding to the subregion images have a common overlapping region;
a classification module 33, configured to classify the touch object as an effective touch object when the judgment result of the judging module 32 is yes, and to classify the touch object as an invalid touch object when the judgment result of the judging module 32 is no.
If the judging module 32 judges that the combination of the subregion images is valid, the touch object has been captured by every camera device and its coordinates lie within the touch control area; it is an effective touch object. If the judging module 32 judges that the combination of the subregion images is invalid, the touch object has not been captured by every camera device, or its coordinates lie outside the touch control area; it is an invalid touch object. Therefore, when the preliminary positioning module 24 calculates the coordinates of touch objects, it can simply ignore invalid touch objects and perform coordinate positioning calculations only for effective touch objects, which saves the computing time that would be spent on the coordinates of invalid touch objects and speeds up positioning.
When the region dividing module 31 divides the subregion images, the number of subregion images and the size of each subregion image can be set according to the user's own needs. The general principle is: the more subregion images there are, the stronger the ability to pre-screen and exclude touch objects, but the amount of computation for the pre-screening also grows with the number of subregion images; therefore the number of subregion images should be chosen moderately, so as to guarantee sufficient pre-screening ability without adding too much extra computation. The subregion images are preferably of equal size, which reduces the amount of computation.
As a preferred embodiment, the region dividing module 31 can divide the images of the touch control area captured by the two camera devices arranged at the two ends of one side of the display device each into three subregion images of equal length along the direction parallel to the display device, and divide the image of the touch control area captured by the camera device arranged at the midpoint of that side into six subregion images of equal length along the direction parallel to the display device. Experiments show that this way of dividing the subregion images achieves efficient pre-screening of touch objects while maintaining a high computing speed, and is a preferable division.
As another preferred embodiment of the present invention, the region dividing module 31 can also divide the image of the touch control area into three subregion images of equal length along the direction parallel to the display device; correspondingly, with respect to each camera device, the touch control area is divided into three shooting subregions of equal apex angle with the camera device as the vertex.
The shooting subregions into which the region dividing module 31 divides the touch control area, corresponding to the division of the subregion images, all lie within the effective range of the touch control area; that is to say, any point within the effective range of the touch control area appears in some combination (ax, bx, cx) of shooting subregions, and each subregion of the touch control area is formed by the overlap of corresponding shooting subregions of different camera devices.
When a touch object is present in the touch control area, each camera device captures a touch-object image, and the touch-object image appears in one of the subregion images captured by each camera device. At this point, the judging module 32 judges the combination of the subregion images according to the correspondence between the subregion images and the shooting subregions.
The judgment rule of the judging module 32 is: if the shooting subregions corresponding to the subregion images in which the touch-object image appears all have a common overlapping region, the combination of those subregion images is valid; otherwise it is invalid.
In a preferred embodiment, the judging module 32 can calculate, when a touch object appears, the coordinate ranges of the corresponding shooting subregions on the spot, from the way the subregion images in which the touch-object image appears are divided and from the effective coordinate range of the touch control area, and then judge from those coordinate ranges whether the shooting subregions have a common overlapping region. If there is a common overlapping region, the combination of the corresponding subregion images is judged valid; if not, it is judged invalid. In this method, whether the combination of the subregion images is valid is calculated only when a touch object appears, and since the shooting subregions corresponding to the touch object are already determined at that point, only a small amount of computation is needed.
In another preferred implementation, the judge module 32 comprises:
a presetting module, used to calculate in advance the coordinate ranges of all the shooting subregions according to the dividing mode of each subregion image and the effective coordinate range of the touch control area, and, taking one shooting subregion from each of the at least three camera heads to form a combination, to judge whether those shooting subregions share a common overlapping region; if they do, the combination of the corresponding subregion images is classified as a valid combination; if they do not, it is classified as an invalid combination; and
a real-time judge module, used to judge directly, when a touch object image appears in a subregion image taken by each camera head, whether those subregion images constitute one of the valid combinations, so that the validity of the combination of subregion images can be judged quickly.
In this method, the presetting module has already precomputed the valid combinations of shooting subregions, or equivalently of subregion images, before any touch object appears. When a touch object is present, the real-time judge module directly judges whether the subregion images in which the touch object image appears form a valid combination; if they do, the corresponding shooting subregions are known to share a common overlapping region without any further coordinate calculation, which significantly reduces the computation time of the positioning process.
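A minimal sketch of this presetting / real-time split (again illustrative Python, reusing the assumed sector_for_subregion and sectors_overlap helpers from the previous sketch): every combination of one shooting subregion per camera head is classified once in advance, so the real-time judgment reduces to a set lookup.

```python
from itertools import product


def precompute_valid_combinations(touch_area, cameras, fovs, subregion_counts):
    """Presetting module: enumerate every combination of one shooting subregion
    per camera head and keep those whose sectors share a common overlapping
    region within the effective range of the touch control area."""
    valid = set()
    for combo in product(*(range(n) for n in subregion_counts)):
        sectors = [
            sector_for_subregion(fov[0], fov[1], idx, n)
            for fov, idx, n in zip(fovs, combo, subregion_counts)
        ]
        if sectors_overlap(touch_area, cameras, sectors):
            valid.add(combo)
    return valid


def is_valid_combination(observed_subregion_indices, valid_combinations):
    """Real-time judge module: a single lookup, no coordinate calculation."""
    return tuple(observed_subregion_indices) in valid_combinations
```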
If the judge module 32 judges that the combination of subregion images is valid, the touch object has been photographed by every camera head and its coordinate lies within the touch control area; it is a valid touch object, so the sort module 33 classifies it as valid. If the judge module 32 judges that the combination of subregion images is invalid, the touch object was not photographed by every camera head, or its coordinate lies outside the touch control area; it is an invalid touch object, so the sort module 33 classifies it as invalid.
The primary location module 24 can therefore ignore invalid touch objects when calculating the first preliminary coordinates and perform the calculation only for valid touch objects, saving the time that would be spent computing the coordinates of invalid touch objects and speeding up positioning.
The camera type touch control device of this preferred implementation can thus make a preliminary judgment of whether a touch object is valid, and exclude some invalid touch objects, based only on which subregion images the touch object appears in and without calculating its coordinate. When the coordinates of the touch objects are calculated, the objects classified as invalid need not be computed at all, which significantly reduces the amount of computation and the delay of touch location and increases the speed of camera type touch location.
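For the touch objects that survive this screening, the core positioning step (a first preliminary coordinate from two camera heads, confirmed against the third) can be pictured roughly as follows. This is an illustrative Python fragment only: it assumes each camera head reports a bearing angle toward the touch object, and the tolerance value is an arbitrary example standing in for the error range discussed below.

```python
import math


def intersect_bearings(cam_a, angle_a, cam_b, angle_b):
    """First preliminary coordinate: intersection of the sight lines from
    camera heads A and B toward the touch object."""
    dx, dy = cam_b[0] - cam_a[0], cam_b[1] - cam_a[1]
    denom = math.cos(angle_a) * math.sin(angle_b) - math.sin(angle_a) * math.cos(angle_b)
    if abs(denom) < 1e-9:
        return None  # sight lines are (nearly) parallel
    t = (dx * math.sin(angle_b) - dy * math.cos(angle_b)) / denom
    return (cam_a[0] + t * math.cos(angle_a), cam_a[1] + t * math.sin(angle_a))


def confirmed_by_third(candidate, cam_c, measured_angle_c, tolerance=0.01):
    """Compare the theoretical value of the third camera head's touch locating
    information for the candidate with the actual measured value."""
    theoretical = math.atan2(candidate[1] - cam_c[1], candidate[0] - cam_c[0])
    return abs(theoretical - measured_angle_c) <= tolerance
```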
In the present invention, statements that coordinates are identical or overlapping are made with allowance for the errors of the camera heads' shooting process and of the computation process, rather than in an absolute sense. Descriptions such as identical or overlapping coordinates in the present invention should therefore be understood as holding within a certain error range.
The present invention is not limited to the above embodiments. For example, the camera type touch control method and device of the present invention can perform multipoint positioning of three or more points. When performing multipoint positioning of three or more points (for example, four-point positioning), the photographed image information of the camera heads is combined in pairs and solved, and the results are compared to obtain the actual coordinates of the touch objects.
In addition, having understood the technical scheme of the present invention, those skilled in the art will recognize that solving with the photographed image information of all the camera heads combined in pairs, or increasing the amount of captured image information by increasing the number of camera heads, makes the positioning of the camera type touch control method and device of the present invention more accurate and also makes multipoint positioning of three or more points easier to achieve.
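To illustrate this pairwise idea (a sketch under the same assumptions, reusing intersect_bearings and confirmed_by_third from the previous fragment), candidate coordinates can be formed from every pair of camera heads and kept only when every remaining camera head also observed a matching bearing:

```python
from itertools import combinations


def locate_touch_objects(cameras, bearings_per_camera, tolerance=0.01):
    """Multipoint positioning: intersect the sight lines of every pair of camera
    heads, then keep only candidates confirmed by all the other camera heads.
    bearings_per_camera[i] lists the bearing angles measured by camera head i."""
    confirmed = []
    indices = range(len(cameras))
    for i, j in combinations(indices, 2):
        for angle_i in bearings_per_camera[i]:
            for angle_j in bearings_per_camera[j]:
                candidate = intersect_bearings(cameras[i], angle_i, cameras[j], angle_j)
                if candidate is None:
                    continue
                others = [k for k in indices if k not in (i, j)]
                if all(
                    any(confirmed_by_third(candidate, cameras[k], a, tolerance)
                        for a in bearings_per_camera[k])
                    for k in others
                ):
                    confirmed.append(candidate)
    # In practice, near-duplicate candidates found from different pairs
    # would be merged into a single touch object coordinate.
    return confirmed
```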
The embodiments of the present invention described above do not limit the scope of protection of the present invention. Any modification, equivalent replacement, improvement or the like made within the spirit and principles of the present invention shall be included within the scope of protection of the claims of the present invention.

Claims (10)

1. A camera type touch control method, characterized by comprising the following steps:
obtaining the images of a touch control area taken respectively by a first camera head, a second camera head and a third camera head arranged at the edge of the touch control area of a display device;
calculating first preliminary coordinates of a touch object according to the touch locating information captured by the first camera head and the second camera head, wherein the touch locating information comprises the positional information of the touch object image in the image of the touch control area taken by each camera head;
calculating, according to the first preliminary coordinates and the coordinate of the third camera head, the theoretical value of the touch locating information that the third camera head captures for the first preliminary coordinates;
and comparing the theoretical value of the touch locating information with the actual value of the touch locating information captured by the third camera head, and, if the comparison result is identical, confirming the corresponding first preliminary coordinate as the coordinate of the touch object.
2. The camera type touch control method according to claim 1, characterized in that, before the first preliminary coordinates of the touch object are calculated, a border confirmation mode is first started;
in the border confirmation mode, the boundary positioning information of the touch control area captured by each camera head is obtained, wherein the boundary positioning information comprises the positional information of the image of a boundary positioning object in the image of the touch control area taken by each camera head;
the coordinates of the boundary positioning object are calculated according to the boundary positioning information captured by each camera head, and the effective range of the touch control area is determined according to the coordinates of the boundary positioning object;
the mode is then switched to a touch control mode;
in the touch control mode, the first preliminary coordinates of the touch object are calculated;
the first preliminary coordinates are screened according to the effective range of the touch control area;
and, according to the screened first preliminary coordinates and the coordinate of the third camera head, the theoretical value of the touch locating information that the third camera head captures for the first preliminary coordinates is calculated.
3. The camera type touch control method according to claim 2, characterized in that the step of obtaining the boundary positioning information in the border confirmation mode comprises:
obtaining, as the boundary positioning information, the positional information of the images, in the image of the touch control area, of a plurality of boundary positioning objects that appear in succession;
and, when determining the effective range of the touch control area, calculating the coordinates of the plurality of boundary positioning objects respectively according to the boundary positioning information, and taking the polygonal region whose vertices are the coordinate points of the plurality of boundary positioning objects as the effective range of the touch control area.
4. The camera type touch control method according to claim 2, characterized in that the step of obtaining the boundary positioning information in the border confirmation mode comprises:
obtaining, as the boundary positioning information, the positional information of the images, in the image of the touch control area, of several boundary positioning objects that appear in succession;
and, when determining the effective range of the touch control area, calculating the coordinates of the boundary positioning objects respectively according to the boundary positioning information, and taking the polygonal region whose vertices are the coordinate points of the boundary positioning objects together with predefined coordinate points as the effective range of the touch control area.
5. The camera type touch control method according to any one of claims 2 to 4, characterized in that, when the border confirmation mode is started, a prompt signal indicating that the border confirmation mode has been started is issued; and when the mode is switched to the touch control mode, a prompt signal indicating the switch to the touch control mode is issued.
6. A camera type touch control device, characterized by comprising:
an image collection module, used to obtain the images of a touch control area taken respectively by a first camera head, a second camera head and a third camera head arranged at the edge of the touch control area of a display device;
a primary location module, used to calculate first preliminary coordinates of a touch object according to the touch locating information captured by the first camera head and the second camera head, wherein the touch locating information comprises the positional information of the touch object image in the image of the touch control area taken by each camera head;
a computing module, used to calculate, according to the first preliminary coordinates and the coordinate of the third camera head, the theoretical value of the touch locating information that the third camera head captures for the first preliminary coordinates;
a comparator, used to compare the theoretical value of the touch locating information with the actual value of the touch locating information captured by the third camera head;
and a locating module, used to confirm the corresponding first preliminary coordinate as the coordinate of the touch object when the comparison result of the comparator is identical.
7. The camera type touch control device according to claim 6, characterized by further comprising:
a mode management module, used to start a border confirmation mode or a touch control mode, or to switch from the border confirmation mode to the touch control mode;
a boundary positioning module, used to obtain, in the border confirmation mode, the boundary positioning information of the touch control area captured by each camera head, to calculate the coordinates of a boundary positioning object according to the boundary positioning information captured by each camera head, and to determine the effective range of the touch control area according to the coordinates of the boundary positioning object;
wherein the boundary positioning information comprises the positional information of the image of the boundary positioning object in the image of the touch control area taken by each camera head;
and a screening module, used to screen the first preliminary coordinates of the touch object calculated by the primary location module according to the effective range of the touch control area determined by the boundary positioning module, and to provide the first preliminary coordinates remaining after screening to the computing module for calculating the theoretical value of the touch locating information that the third camera head captures for the first preliminary coordinates.
8. The camera type touch control device according to claim 7, characterized in that the boundary positioning module obtains, as the boundary positioning information, the positional information of the images, in the image of the touch control area, of a plurality of boundary positioning objects that appear in succession, calculates the coordinates of the plurality of boundary positioning objects respectively according to the boundary positioning information, and takes the polygonal region whose vertices are the coordinate points of the plurality of boundary positioning objects as the effective range of the touch control area.
9. The camera type touch control device according to claim 7, characterized in that the boundary positioning module obtains, as the boundary positioning information, the positional information of the images, in the image of the touch control area, of several boundary positioning objects that appear in succession, calculates the coordinates of the boundary positioning objects respectively according to the boundary positioning information, and takes the polygonal region whose vertices are the coordinate points of the boundary positioning objects together with predefined coordinate points as the effective range of the touch control area.
10. The camera type touch control device according to any one of claims 7 to 9, characterized in that the mode management module issues a prompt signal indicating that the border confirmation mode has been started when starting the border confirmation mode, and issues a prompt signal indicating the switch to the touch control mode when switching to the touch control mode.
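Outside the claim language itself, the boundary confirmation and screening recited in claims 2 to 4 can be pictured with the following rough Python sketch; the ray-casting polygon test and all names are illustrative assumptions, as the claims do not prescribe any particular inside-polygon computation.

```python
def point_in_polygon(point, vertices):
    """Ray-casting test: True if the point lies inside the polygon whose
    vertices are the calibrated boundary positioning object coordinates."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def screen_preliminary_coordinates(preliminary_coords, effective_range_vertices):
    """Screening step of claim 2: keep only the first preliminary coordinates
    that lie within the effective range of the touch control area."""
    return [p for p in preliminary_coords
            if point_in_polygon(p, effective_range_vertices)]
```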
CN201110257221.3A 2011-09-01 2011-09-01 Camera shooting type touch control method and device Active CN102306069B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110257221.3A CN102306069B (en) 2011-09-01 2011-09-01 Camera shooting type touch control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110257221.3A CN102306069B (en) 2011-09-01 2011-09-01 Camera shooting type touch control method and device

Publications (2)

Publication Number Publication Date
CN102306069A true CN102306069A (en) 2012-01-04
CN102306069B CN102306069B (en) 2014-12-24

Family

ID=45379936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110257221.3A Active CN102306069B (en) 2011-09-01 2011-09-01 Camera shooting type touch control method and device

Country Status (1)

Country Link
CN (1) CN102306069B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080259053A1 (en) * 2007-04-11 2008-10-23 John Newton Touch Screen System with Hover and Click Input Methods
CN101576784A (en) * 2008-05-07 2009-11-11 联想(北京)有限公司 Input control device, notebook computer, input control method
CN101930324A (en) * 2010-07-04 2010-12-29 苏州佳世达电通有限公司 Object detecting system
US20110050640A1 (en) * 2009-09-03 2011-03-03 Niklas Lundback Calibration for a Large Scale Multi-User, Multi-Touch System

Also Published As

Publication number Publication date
CN102306069B (en) 2014-12-24


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 233, Kezhu Road, Guangzhou High-tech Industrial Development Zone, Guangzhou, Guangdong 510670

Patentee after: VTRON GROUP Co.,Ltd.

Address before: No. 6, Cai Road, Guangzhou High-tech Industrial Development Zone, Guangdong 510663

Patentee before: VTRON TECHNOLOGIES Ltd.

TR01 Transfer of patent right

Effective date of registration: 20201123

Address after: 215500 No.13, Caotang Road, Changshu, Suzhou, Jiangsu Province

Patentee after: Changshu intellectual property operation center Co.,Ltd.

Address before: Unit 2414-2416, main building, no.371, Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee before: GUANGDONG GAOHANG INTELLECTUAL PROPERTY OPERATION Co.,Ltd.

Effective date of registration: 20201123

Address after: Unit 2414-2416, main building, no.371, Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee after: GUANGDONG GAOHANG INTELLECTUAL PROPERTY OPERATION Co.,Ltd.

Address before: No. 233, Kezhu Road, Guangzhou High-tech Industrial Development Zone, Guangzhou, Guangdong 510670

Patentee before: VTRON GROUP Co.,Ltd.

CP02 Change in the address of a patent holder

Address after: 215500 5th floor, building 4, 68 Lianfeng Road, Changfu street, Changshu City, Suzhou City, Jiangsu Province

Patentee after: Changshu intellectual property operation center Co.,Ltd.

Address before: No. 13, Caotang Road, Changshu City, Suzhou City, Jiangsu Province

Patentee before: Changshu intellectual property operation center Co.,Ltd.