CN104391573A - Method and device for recognizing throwing action based on single attitude sensor - Google Patents

Method and device for recognizing throwing action based on single attitude sensor

Info

Publication number
CN104391573A
Authority
CN
China
Prior art keywords
vector
throwing
upper arm
sensor
throwing action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410637623.XA
Other languages
Chinese (zh)
Other versions
CN104391573B (en)
Inventor
陈敏杰
张柯
孙昊
胡明昱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING HUARU TECHNOLOGY CO LTD
Original Assignee
BEIJING HUARU TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING HUARU TECHNOLOGY CO LTD filed Critical BEIJING HUARU TECHNOLOGY CO LTD
Priority to CN201410637623.XA priority Critical patent/CN104391573B/en
Publication of CN104391573A publication Critical patent/CN104391573A/en
Application granted granted Critical
Publication of CN104391573B publication Critical patent/CN104391573B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention discloses a method and a device for recognizing a throwing action based on a single attitude sensor. The method comprises the following steps: establishing a reference coordinate frame aligned with the thrower's downward, forward and rightward directions, in which the right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, the reference vectors represent the downward, forward and rightward directions of the thrower, θ is the angle between the right-upper-arm vector and the rightward reference vector, and φ is the angle between the projection of the right-upper-arm vector onto the plane spanned by the downward and forward reference vectors and the downward reference vector; dividing the right half-space of the reference frame into a plurality of attitude regions and mapping the measured throwing trajectory into these reference regions; and comparing the measured region sequence and its timing with a predetermined throwing order, judging a throwing action if they are identical and a non-throwing action otherwise. By fitting a single sensor, collecting attitude information and analysing it in real time, the method and device recognize the throwing attitude of an individual soldier in a virtual training scene with a judgment accuracy above 98%, and are easily popularized and applied in all kinds of immersive virtual training systems.

Description

Method and device for recognizing a throwing action based on a single attitude sensor
Technical field
The present application relates to the field of motion capture, and in particular to a method and apparatus for recognizing a throwing action using a single attitude sensor.
Background art
A common method of motion capture at present is optical motion capture. Optical motion capture performs its task by monitoring and tracking specific markers (luminous points) on the human body: several cameras continuously film the markers, the position of each marker in space at any moment is computed from the captured images and preset camera parameters, and the poses are then fused and solved from these positions to obtain the corresponding action. Its drawbacks are high cost and complex configuration. Capture requires a "clean" background environment; the subject must wear monochromatic clothing with markers attached at key positions, and several high-frame-rate cameras must be installed. A virtual training scene cannot meet these requirements.
In an immersive individual-soldier virtual training scene, throwing is an important tactical action, yet the discrimination of a throwing action is at present still mainly performed by means of "function buttons". Button-based human-computer interaction, however, cannot effectively capture the process, force and direction of the throw, and it degrades the interactive experience.
Therefore, how to propose a new motion-acquisition approach that recognizes the throwing action, improving the discrimination accuracy of throwing-action recognition while preserving the interactive experience, has become a technical problem that the prior art urgently needs to solve.
Summary of the invention
The object of the present invention is to overcome the above deficiencies of current throwing-action recognition by proposing a throwing-action recognition method and device based on a single sensor, so as to improve the accuracy of recognition and the quality of the interactive experience.
To achieve this object, the present invention adopts the following technical solution:
A method for recognizing a throwing action based on a single sensor, comprising the following steps:
S110. Reference-frame establishment step:
A reference coordinate frame OXYZ is established, aligned with the thrower's downward, forward and rightward directions: the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D; the OY axis points forward and its unit direction vector is the front vector u_F; the OZ axis points rightward and its unit direction vector is the right vector u_R. The angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm. Let the right-upper-arm vector be the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame. Let θ be the angle between u_2 and u_R, 0° ≤ θ ≤ 90°, and let φ be the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°.
S120. Reference-angle calculation step:
The right-upper-arm vector and the rightward, forward and downward reference vectors are expressed in the earth coordinate frame as u_2, u_R, u_F and u_D. θ and φ are calculated from these vectors:
θ = arccos(u_2 · u_R),
and φ is the signed angle, in the plane spanned by u_D and u_F, between the projection of u_2 onto that plane and u_D.
S130. Reference-region establishment and mapping step:
According to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the right half-space of OXYZ is divided into a number of attitude regions; the attitude space is mapped into a unit polar-coordinate circle, the attitude regions corresponding to reference regions in that circle. The regions traversed by the measured throw are mapped into said reference regions.
S140. Throwing-action determination step:
The sequence of mapped reference regions measured within a predetermined time T_MAX is compared with a predetermined throwing order. If they are identical, a throwing action is judged; if the transition order is inconsistent, or the transitions are not completed when T_MAX is reached, a non-throwing action is judged. T_MAX is the maximum duration of one throwing action, obtained from batch experiments or experience.
Preferably, in step S130, mapping the regions traversed by the measured throw into said reference regions comprises:
The sensor is mounted on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible, so that the coordinate of the right-upper-arm vector in the sensor coordinate frame is u = (0, 0, −1)^T. The right-upper-arm vector in the earth coordinate frame is then u_2 = C·u, where C is the coordinate transformation matrix from the sensor frame to the earth frame. The down vector u_D is known; the front vector u_F and the right vector u_R are replaced by the front and right vectors computed before the throwing action starts and are regarded as constant until the throwing action ends. θ, φ and the attitude mapping state (r, φ) are then calculated from u_2, u_D, u_F and u_R.
Preferably, the front and right vectors before the start of the throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
Preferably, in step S130, the division of said reference regions may be increased or decreased according to the actual experimental situation, and the boundary values may also be selected by experiment.
Preferably, in step S140, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through (withdrawing the arm) after release.
The present invention also discloses a device for recognizing a throwing action based on a single sensor, comprising the following units:
Reference-frame establishment unit:
A reference coordinate frame OXYZ is established, aligned with the thrower's downward, forward and rightward directions: the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D; the OY axis points forward and its unit direction vector is the front vector u_F; the OZ axis points rightward and its unit direction vector is the right vector u_R. The angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm. The right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame. θ is the angle between u_2 and u_R, 0° ≤ θ ≤ 90°; φ is the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°.
Reference-angle calculation unit:
The right-upper-arm vector and the rightward, forward and downward reference vectors are expressed in the earth coordinate frame as u_2, u_R, u_F and u_D. θ and φ are calculated from these vectors: θ = arccos(u_2 · u_R), and φ is the signed angle, in the plane spanned by u_D and u_F, between the projection of u_2 onto that plane and u_D.
Reference-region establishment and mapping unit:
According to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the right half-space of OXYZ is divided into a number of attitude regions; the attitude space is mapped into a unit polar-coordinate circle, the attitude regions corresponding to reference regions in that circle. The regions traversed by the measured throw are mapped into said reference regions.
Throwing-action judging unit:
The sequence of mapped reference regions measured within a predetermined time T_MAX is compared with a predetermined throwing order. If they are identical, a throwing action is judged; if the transition order is inconsistent, or the transitions are not completed when T_MAX is reached, a non-throwing action is judged. T_MAX is the maximum duration of one throwing action, obtained from batch experiments or experience.
Preferably, in said reference-region establishment and mapping unit, mapping the regions traversed by the measured throw into said reference regions comprises:
The sensor is mounted on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible, so that the coordinate of the right-upper-arm vector in the sensor coordinate frame is u = (0, 0, −1)^T. The right-upper-arm vector in the earth coordinate frame is then u_2 = C·u, where C is the coordinate transformation matrix from the sensor frame to the earth frame. The down vector u_D is known; the front vector u_F and the right vector u_R are replaced by the front and right vectors computed before the throwing action starts and are regarded as constant until the throwing action ends. θ, φ and the attitude mapping state (r, φ) are then calculated from u_2, u_D, u_F and u_R.
Preferably, the front and right vectors before the start of the throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
Preferably, in the reference-region establishment and mapping unit, the division of said reference regions may be increased or decreased according to the actual experimental situation, and the boundary values may also be selected by experiment.
Preferably, in the throwing-action judging unit, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through after release.
Therefore, with the single-sensor throwing-action recognition method and device of the present invention, by fitting a single sensor and collecting and analysing attitude information in real time, the throwing attitude of an individual soldier in a virtual training scene can be discriminated. The present invention provides a training environment with a high-quality interactive experience and a throwing-action discrimination accuracy above 98%, and the method and device are easy to apply in all kinds of immersive individual-soldier virtual training systems.
Brief description of the drawings
Fig. 1 is a flow chart of the single-sensor throwing-action recognition method according to a specific embodiment of the present invention;
Fig. 2 is a schematic diagram of the reference coordinate frame according to a specific embodiment of the present invention;
Fig. 3 is a schematic diagram of the action mapping regions according to a specific embodiment of the present invention;
Fig. 4 is a block diagram of the single-sensor throwing-action recognition device according to another specific embodiment of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only explain the present invention and do not limit it. It should also be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the entire structure.
A throwing action can be regarded as a combination of a series of actions within a short period of time. All possible attitudes of the right upper arm in a given coordinate frame, i.e., the right half-space of the reference frame, are divided into several regions, each region representing a basic state. During the whole throwing process, the state of the right upper arm switches continuously, in turn, from one of several possible initial states to a final state.
Therefore, the present invention first establishes a reference coordinate frame and obtains a mathematical description of each basic state from the corresponding reference angles in that frame; it then divides the right half-space of the reference frame into several regions and maps each stage of the throwing process (the possible initial states, the possible final states, one or several admissible orders of state switching, and the associated time parameters, such as the minimum and maximum duration of the whole action or of a particular state) onto these regions; finally, the measured region changes are compared with the region-change rule of a throwing action to judge whether a throw has occurred.
Broadly speaking, a throwing action can be divided into the following four phases:
Wind-up: the torso rotates to the right and may lean back and to the right; the right upper arm is drawn back behind and below the body.
Torso drive: the torso rotates rapidly to the left and may lean forward and to the left; the right upper arm follows.
Arm drive: the torso continues rotating to the left; the right upper arm sweeps forward over the top.
Follow-through: after release, the torso rotates back to the right; the right upper arm is withdrawn through the low position.
Referring to Fig. 1, the method for recognizing a throwing action based on a single sensor may comprise the following steps:
S110. Reference-frame establishment step:
To describe the throwing action, a corresponding reference coordinate frame must first be established.
Referring to Fig. 2, a reference coordinate frame OXYZ is established, aligned with the thrower's downward, forward and rightward directions: the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D; the OY axis points forward and its unit direction vector is the front vector u_F; the OZ axis points rightward and its unit direction vector is the right vector u_R. The angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm. Let the right-upper-arm vector be the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame.
Let θ be the angle between u_2 and u_R, 0° ≤ θ ≤ 90°, and let φ be the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°. φ is thus a rotation angle about the OZ (rightward) axis; the positive sense of rotation may follow either the left-hand or the right-hand rule, which fixes the sign of φ (the right-hand rule is adopted below).
S120. Reference-angle calculation step:
The right-upper-arm vector and the rightward, forward and downward reference vectors are expressed in the earth coordinate frame as u_2, u_R, u_F and u_D. θ and φ are calculated from these vectors:
θ = arccos(u_2 · u_R),
and φ is the signed angle, in the plane spanned by u_D and u_F, between the projection of u_2 onto that plane and u_D.
S130. Reference-region establishment and mapping step:
According to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the right half-space of OXYZ is divided into a number of attitude regions; the attitude space is mapped into a unit polar-coordinate circle (with the radial coordinate derived from θ and the polar angle φ), the attitude regions corresponding to reference regions in that circle. The regions traversed by the measured throw are mapped into said reference regions.
Referring to Fig. 3, in a preferred embodiment there are 17 such regions, the 17 attitude regions corresponding to 17 regions in the unit polar-coordinate circle: a central region at the centre, and 16 further regions arranged in two rings outside the central region, 8 regions per ring; the inner and outer rings share the same angular sectors and differ only in radius.
The schematic diagram of Fig. 3 shows the thrower as seen from the thrower's right side, with the thrower's hand passing in turn through several of these reference regions.
According to this region division, when the right upper arm hangs down it lies in region 10; when raised straight up it lies in region 14; when raised flat to the right it lies in region 1; when raised forward it lies in region 12; and so on. As long as the sampling frequency is high enough, whenever the state switches from region A to region B, region B is necessarily adjacent to region A.
For example, for a throw in which the arm starts hanging naturally, the grenade is thrown out obliquely upward at 45 degrees, and the arm returns to hanging, the order of state-region switching may be: 10 -> 17 -> 16 -> 15 -> 14 -> 13 -> 5 -> 4 -> 3 -> 11 -> 10.
Of course, the 16 outer regions are only an example; there may be more or fewer regions. That is, the division of the reference regions may be increased or decreased according to the actual experimental situation, and the boundary values preferably also need to be selected by experiment.
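The exact region boundaries are left to experiment in the text. The sketch below therefore uses assumed values (a radial coordinate r = θ/90° split into a central disc and two rings at assumed radii, and eight 45° sectors per ring centred on φ = 0°, ±45°, ...) purely to illustrate how an attitude (θ, φ) maps to a region index; the numbering (1 = centre, 2-9 = inner ring, 10-17 = outer ring, with region 10 at the bottom) is inferred from the examples given above.

```python
import numpy as np

def region_index(theta_deg, phi_deg, r1=0.25, r2=0.62):
    """Map the attitude (theta, phi) to one of the 17 reference regions.

    Numbering follows the examples in the text: arm hanging down -> 10,
    front raise -> 12, raised straight up -> 14, right flat raise -> 1.
    The ring boundaries r1 and r2, and sectors centred on phi = 0, 45,
    90, ... degrees, are assumptions; the patent leaves the boundary
    values to be chosen by experiment.
    """
    r = theta_deg / 90.0
    if r < r1:
        return 1                                            # central region
    sector = int(round((phi_deg % 360.0) / 45.0)) % 8       # 0 = down, 2 = front, 4 = up
    return (2 if r < r2 else 10) + sector
```

With these assumed boundaries, the 45-degree-throw example above is reproduced: the hanging arm starts in region 10, the wind-up passes through 17 and 16, the overhead sweep through 15, 14 and 13, the release drops into the inner ring (5, 4, 3), and the arm returns through 11 to 10.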
S140. Throwing-action determination step:
The sequence of mapped reference regions measured within a predetermined time T_MAX is compared with a predetermined throwing order. If they are identical, a throwing action is judged; if the transition order is inconsistent, or the transitions are not completed when T_MAX is reached, a non-throwing action is judged.
For example, a state-sequence variable can be defined in the program. As soon as region 10, a possible initial state of a throw, appears, region 10 is stored at the start of the sequence. If the state-transition sequence subsequently measured within the time T_MAX is consistent with the predetermined sequence, a throwing action is judged; if the transition order is inconsistent, or the transitions are not completed when T_MAX is reached, a non-throwing action is judged and the sequence is cleared. T_MAX is the maximum duration of one throwing action obtained from batch experiments or experience; preferably, T_MAX is 2.5 seconds.
In step S140, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through after release.
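A minimal sketch of this determination step, under the assumption that the detector is fed the region index of each new sample: it opens a candidate sequence when region 10 (a possible initial state) appears, records each region switch, and reports a throw only if the accumulated sequence matches a predetermined throwing order within T_MAX. The particular order used here is the 45-degree-throw example given above; a real system would hold one or several admissible orders.

```python
import time

THROW_ORDER = [10, 17, 16, 15, 14, 13, 5, 4, 3, 11, 10]  # example sequence from the text
T_MAX = 2.5                                               # seconds, maximum throw duration

class ThrowDetector:
    def __init__(self, order=THROW_ORDER, t_max=T_MAX):
        self.order = order
        self.t_max = t_max
        self.seq = []            # accumulated state-region sequence
        self.t_start = None

    def update(self, region, now=None):
        """Feed one region sample; return True when a throwing action is recognized."""
        now = time.monotonic() if now is None else now
        if not self.seq:
            if region == self.order[0]:            # possible initial state appears
                self.seq, self.t_start = [region], now
            return False
        if now - self.t_start > self.t_max:        # switching not completed within T_MAX
            self.seq, self.t_start = [], None
            return False
        if region != self.seq[-1]:                 # a region switch occurred
            self.seq.append(region)
            if self.seq == self.order:             # matches the predetermined order
                self.seq, self.t_start = [], None
                return True
            if self.seq != self.order[:len(self.seq)]:   # order inconsistent: clear
                self.seq, self.t_start = [], None
        return False
```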
In step S130, mapping the regions traversed by the measured throw into said reference regions proceeds as follows:
The sensor is mounted on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible. Assuming that after mounting, with the right arm hanging naturally, the OZ axis of the sensor points vertically upward while OX points horizontally to the right and OY horizontally forward (a right-handed system), the coordinate of the right-upper-arm vector in the sensor coordinate frame is always u = (0, 0, −1)^T.
During the motion of the right upper arm, the sensor provides in real time the coordinate transformation matrix C from the sensor coordinate frame to the earth coordinate frame, so the right-upper-arm vector in the earth coordinate frame is u_2 = C·u.
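As a minimal sketch (function and argument names are illustrative, not from the patent), the earth-frame upper-arm vector is simply the sensor-reported rotation matrix applied to the fixed sensor-frame vector (0, 0, −1)^T:

```python
import numpy as np

U_SENSOR = np.array([0.0, 0.0, -1.0])   # right-upper-arm vector in the sensor frame

def upper_arm_in_earth_frame(C_sensor_to_earth):
    """u_2 = C * u, with C the sensor-to-earth rotation matrix read from the sensor."""
    u2 = C_sensor_to_earth @ U_SENSOR
    return u2 / np.linalg.norm(u2)       # renormalize against small numerical drift
```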
In the earth coordinate frame the down vector u_D is known (it points vertically downward), but the front vector u_F and the right vector u_R change during use and must be estimated with the help of experimental experience. Experiments show that the front and right vectors change continuously and slowly. Therefore, within the short time interval in which any single throwing action takes place, throwing-action recognition can use the front and right vectors computed just before the throwing action starts and regard them as constant until the throwing action ends. θ, φ and the attitude mapping state (r, φ) are then calculated from u_2, u_D, u_F and u_R.
During use, the calculation of the front and right vectors distinguishes two cases: arm swing and stationary. Arm swing refers to the back-and-forth swinging of the arm while walking. The projection of the right-upper-arm vector onto the vertical is denoted by the dot product u_AD = u_2 · u_D. When u_AD attains a maximum within a time window T_W, the right upper arm is just passing the side of the body; taking the right-upper-arm vector u_2 at the moment of this maximal projection, the front vector equals the cross product of the right-upper-arm vector and the down vector, u_F = u_2 × u_D, and the right vector equals the cross product of the down vector and the front vector, u_R = u_D × u_F. In the stationary case, experiments show that under normal circumstances a stationary right upper arm always leans slightly toward the right side of the body, so the front and right vectors are obtained by the same formulas. The arm-swing state can be identified from the period and amplitude of the variation of u_AD; the stationary state can be identified when the sensor readings vary by no more than a threshold within a period T_S. During use, the system continuously determines whether the arm is swinging or stationary and keeps computing the front and right vectors.
Therefore, the front and right vectors immediately before the start of a throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
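The sketch below illustrates this cross-product calibration under the stated assumption that, at the chosen instant (the maximum of u_AD during a swing window, or any instant while stationary), the upper arm leans slightly toward the body's right, so that u_2 × u_D points forward; the results are normalized, which the text implies by treating the reference vectors as unit vectors.

```python
import numpy as np

def front_right_from_arm(u2, u_D=np.array([0.0, 0.0, 1.0])):
    """Compute the front and right unit vectors from the upper-arm vector.

    u2 is the earth-frame upper-arm vector sampled while the arm hangs,
    leaning slightly toward the body's right; u_D is the known down vector.
    The default u_D assumes an earth frame whose third axis points down
    (e.g., a NED-style frame); adjust it to the frame actually used.
    """
    u_F = np.cross(u2, u_D)            # front vector = u_2 x u_D
    u_F = u_F / np.linalg.norm(u_F)
    u_R = np.cross(u_D, u_F)           # right vector = u_D x u_F
    return u_F, u_R / np.linalg.norm(u_R)
```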
Therefore, within the short time interval in which any single throwing action takes place, the front and right vectors have already been computed for that period and are regarded as constant until the throwing action ends. θ, φ and the attitude mapping state (r, φ) are calculated accordingly. When the state enters region 10, the throwing-action detection flow starts; if the order of the state-region changes and the time they take match a throwing action, a throwing action is recognized; if no matching throwing action has been detected when the maximum detection time is exceeded, it is considered that no throwing action has been detected.
The present invention also discloses a single-sensor throwing-action recognition device, comprising the following units:
Reference-frame establishment unit 210:
A reference coordinate frame OXYZ is established, aligned with the thrower's downward, forward and rightward directions: the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D; the OY axis points forward and its unit direction vector is the front vector u_F; the OZ axis points rightward and its unit direction vector is the right vector u_R. The angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm. The right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame. θ is the angle between u_2 and u_R, 0° ≤ θ ≤ 90°; φ is the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°.
Reference-angle calculation unit 220:
The right-upper-arm vector and the rightward, forward and downward reference vectors are expressed in the earth coordinate frame as u_2, u_R, u_F and u_D. θ and φ are calculated from these vectors: θ = arccos(u_2 · u_R), and φ is the signed angle, in the plane spanned by u_D and u_F, between the projection of u_2 onto that plane and u_D.
Reference-region establishment and mapping unit 230:
According to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the right half-space of OXYZ is divided into a number of attitude regions; the attitude space is mapped into a unit polar-coordinate circle, the attitude regions corresponding to reference regions in that circle. The regions traversed by the measured throw are mapped into said reference regions.
Throwing-action judging unit 240:
The sequence of mapped reference regions measured within a predetermined time T_MAX is compared with a predetermined throwing order. If they are identical, a throwing action is judged; if the transition order is inconsistent, or the transitions are not completed when T_MAX is reached, a non-throwing action is judged. T_MAX is the maximum duration of one throwing action, obtained from batch experiments or experience.
Preferably, in said reference-region establishment and mapping unit, mapping the regions traversed by the measured throw into said reference regions comprises: the sensor is mounted on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible, so that the coordinate of the right-upper-arm vector in the sensor coordinate frame is u = (0, 0, −1)^T; the right-upper-arm vector in the earth coordinate frame is then u_2 = C·u, where C is the coordinate transformation matrix from the sensor frame to the earth frame; the down vector u_D is known; the front vector u_F and the right vector u_R are replaced by the front and right vectors computed before the throwing action starts and are regarded as constant until the throwing action ends; θ, φ and the attitude mapping state (r, φ) are then calculated from u_2, u_D, u_F and u_R.
Preferably, the front and right vectors before the start of the throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
Preferably, in the reference-region establishment and mapping unit 230, the division of said reference regions may be increased or decreased according to the actual experimental situation, and the boundary values may also be selected by experiment.
Further preferably, in the throwing-action judging unit 240, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through after release.
Therefore, with the single-sensor throwing-action recognition method and device of the present invention, by fitting a single sensor and collecting and analysing attitude information in real time, the throwing attitude of an individual soldier in a virtual training scene can be discriminated. The present invention provides a training environment with a high-quality interactive experience and a throwing-action discrimination accuracy above 98%, and the method and device are easy to apply in all kinds of immersive individual-soldier virtual training systems.
Obviously, those skilled in the art should understand that each of the above units or steps of the present invention can be implemented by a general-purpose computing device; they can be concentrated on a single computing device, or implemented with program code executable by a computing device so that they can be stored in a storage device and executed by the computing device; alternatively, they can each be made into individual integrated-circuit modules, or several of the modules or steps can be made into a single integrated-circuit module. Thus, the present invention is not restricted to any specific combination of hardware and software.
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and the specific embodiments of the present invention cannot be considered to be limited thereto. For a person of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may also be made without departing from the inventive concept, and all of these should be regarded as falling within the scope of protection determined by the appended claims.

Claims (10)

1. A method for recognizing a throwing action based on a single sensor, comprising the following steps:
S110. A reference-frame establishment step:
establishing a reference coordinate frame OXYZ aligned with the thrower's downward, forward and rightward directions, wherein the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D, the OY axis points forward and its unit direction vector is the front vector u_F, and the OZ axis points rightward and its unit direction vector is the right vector u_R; the angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm; the right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame; θ is the angle between u_2 and u_R, 0° ≤ θ ≤ 90°; φ is the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°;
S120. A reference-angle calculation step:
expressing the right-upper-arm vector and the rightward, forward and downward reference vectors in the earth coordinate frame as u_2, u_R, u_F and u_D, and calculating θ and φ from these vectors, with θ = arccos(u_2 · u_R);
S130. A reference-region establishment and mapping step:
dividing the right half-space of OXYZ into a number of attitude regions according to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the attitude space being mapped into a unit polar-coordinate circle and the attitude regions corresponding to reference regions in that circle; and mapping the regions traversed by the measured throw into said reference regions;
S140. A throwing-action determination step:
comparing the sequence of mapped reference regions measured within a predetermined time T_MAX with a predetermined throwing order; if they are identical, judging a throwing action; if the transition order is inconsistent or the transitions are not completed when T_MAX is reached, judging a non-throwing action; wherein T_MAX is the maximum duration of one throwing action obtained from batch experiments or experience.
2. The method for recognizing a throwing action based on a single sensor according to claim 1, characterized in that:
in step S130, mapping the regions traversed by the measured throw into said reference regions comprises:
mounting the sensor on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible, so that the coordinate of the right-upper-arm vector in the sensor coordinate frame is u = (0, 0, −1)^T; expressing the right-upper-arm vector in the earth coordinate frame as u_2 = C·u, where C is the coordinate transformation matrix from the sensor frame to the earth frame; taking the down vector u_D as known; replacing the front vector u_F and the right vector u_R with the front and right vectors computed before the throwing action starts, and regarding them as constant until the throwing action ends; and then calculating θ, φ and the attitude mapping state (r, φ) from u_2, u_D, u_F and u_R.
3. The method for recognizing a throwing action based on a single sensor according to claim 2, characterized in that:
the front and right vectors before the start of the throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
4. The method for recognizing a throwing action based on a single sensor according to any one of claims 1 to 3, characterized in that:
in step S130, the division of said reference regions may be increased or decreased according to the actual experimental situation, and the boundary values may also be selected by experiment.
5. The method for recognizing a throwing action based on a single sensor according to claim 4, characterized in that:
in step S140, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through after release.
6. A device for recognizing a throwing action based on a single sensor, comprising the following units:
a reference-frame establishment unit, which establishes a reference coordinate frame OXYZ aligned with the thrower's downward, forward and rightward directions, wherein the OX axis points downward and its unit direction vector in the earth coordinate frame is the down vector u_D, the OY axis points forward and its unit direction vector is the front vector u_F, and the OZ axis points rightward and its unit direction vector is the right vector u_R; the angular position of the right upper arm in the reference frame OXYZ describes the attitude of the right upper arm; the right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, denoted u in the sensor coordinate frame and u_2 in the earth coordinate frame; θ is the angle between u_2 and u_R, 0° ≤ θ ≤ 90°; φ is the angle between the projection of u_2 onto the plane spanned by u_D and u_F and the vector u_D, −180° ≤ φ ≤ 180°;
a reference-angle calculation unit, which expresses the right-upper-arm vector and the rightward, forward and downward reference vectors in the earth coordinate frame as u_2, u_R, u_F and u_D, and calculates θ and φ from these vectors, with θ = arccos(u_2 · u_R);
a reference-region establishment and mapping unit, which divides the right half-space of OXYZ into a number of attitude regions according to the regions that the upper-arm vector may pass through in the reference frame OXYZ, the attitude space being mapped into a unit polar-coordinate circle and the attitude regions corresponding to reference regions in that circle, and which maps the regions traversed by the measured throw into said reference regions; and
a throwing-action judging unit, which compares the sequence of mapped reference regions measured within a predetermined time T_MAX with a predetermined throwing order, judges a throwing action if they are identical, and judges a non-throwing action if the transition order is inconsistent or the transitions are not completed when T_MAX is reached, wherein T_MAX is the maximum duration of one throwing action obtained from batch experiments or experience.
7. The device for recognizing a throwing action based on a single sensor according to claim 6, characterized in that:
in said reference-region establishment and mapping unit, mapping the regions traversed by the measured throw into said reference regions comprises:
mounting the sensor on the right upper arm such that, with the right arm hanging naturally, one axis of the sensor is as parallel to the vertical as possible, so that the coordinate of the right-upper-arm vector in the sensor coordinate frame is u = (0, 0, −1)^T; expressing the right-upper-arm vector in the earth coordinate frame as u_2 = C·u, where C is the coordinate transformation matrix from the sensor frame to the earth frame; taking the down vector u_D as known; replacing the front vector u_F and the right vector u_R with the front and right vectors computed before the throwing action starts, and regarding them as constant until the throwing action ends; and then calculating θ, φ and the attitude mapping state (r, φ) from u_2, u_D, u_F and u_R.
8. The device for recognizing a throwing action based on a single sensor according to claim 7, characterized in that:
the front and right vectors before the start of the throwing action are calculated as: front vector u_F = u_2 × u_D, right vector u_R = u_D × u_F.
9. The device for recognizing a throwing action based on a single sensor according to any one of claims 6 to 8, characterized in that:
in the reference-region establishment and mapping unit, the division of said reference regions may be increased or decreased according to the actual experimental situation, and the boundary values may also be selected by experiment.
10. The device for recognizing a throwing action based on a single sensor according to claim 9, characterized in that:
in the throwing-action judging unit, the predetermined throwing order is determined according to the phases of the throw: wind-up, torso drive, arm drive, and follow-through after release.
CN201410637623.XA 2014-11-10 2014-11-10 Method and device for recognizing throwing action based on single attitude sensor Active CN104391573B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410637623.XA CN104391573B (en) 2014-11-10 2014-11-10 Method and device for recognizing throwing action based on single attitude sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410637623.XA CN104391573B (en) 2014-11-10 2014-11-10 Method and device for recognizing throwing action based on single attitude sensor

Publications (2)

Publication Number Publication Date
CN104391573A (en) 2015-03-04
CN104391573B (en) 2017-05-03

Family

ID=52609485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410637623.XA Active CN104391573B (en) 2014-11-10 2014-11-10 Method and device for recognizing throwing action based on single attitude sensor

Country Status (1)

Country Link
CN (1) CN104391573B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256082A (en) * 2017-05-11 2017-10-17 大连理工大学 A kind of ammunition ballistic trajectory calculating system based on network integration and binocular vision technology
CN110991293A (en) * 2019-11-26 2020-04-10 爱菲力斯(深圳)科技有限公司 Gesture recognition method and device, computer equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110289455A1 (en) * 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
CN102654917A (en) * 2011-04-27 2012-09-05 清华大学 Method and system for sensing motion gestures of moving body
CN103127717A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Method and system for control and operation of game
JP2013218423A (en) * 2012-04-05 2013-10-24 Utechzone Co Ltd Directional video control device and method
CN103440037A (en) * 2013-08-21 2013-12-11 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110289455A1 (en) * 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
CN102654917A (en) * 2011-04-27 2012-09-05 清华大学 Method and system for sensing motion gestures of moving body
CN103127717A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Method and system for control and operation of game
JP2013218423A (en) * 2012-04-05 2013-10-24 Utechzone Co Ltd Directional video control device and method
CN103440037A (en) * 2013-08-21 2013-12-11 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256082A (en) * 2017-05-11 2017-10-17 大连理工大学 A kind of ammunition ballistic trajectory calculating system based on network integration and binocular vision technology
CN107256082B (en) * 2017-05-11 2020-04-14 大连理工大学 Throwing object trajectory measuring and calculating system based on network integration and binocular vision technology
CN110991293A (en) * 2019-11-26 2020-04-10 爱菲力斯(深圳)科技有限公司 Gesture recognition method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN104391573B (en) 2017-05-03

Similar Documents

Publication Publication Date Title
US10970859B2 (en) Monitoring method and device for mobile target, monitoring system and mobile robot
CN101577812B (en) Method and system for post monitoring
CN103824070B (en) A kind of rapid pedestrian detection method based on computer vision
CN101916447B (en) Robust motion target detecting and tracking image processing system
Mu et al. Multiple Vehicle Detection and Tracking in Highway Traffic Surveillance Video Based on SIFT Feature Matching.
CN104008371A (en) Regional suspicious target tracking and recognizing method based on multiple cameras
CN104408743A (en) Image segmentation method and device
WO2019129255A1 (en) Target tracking method and device
CN112041848A (en) People counting and tracking system and method
CN106663126A (en) Video processing for motor task analysis
KR101551576B1 (en) Robot cleaner, apparatus and method for recognizing gesture
Moghadam et al. Road direction detection based on vanishing-point tracking
Milford et al. Condition-invariant, top-down visual place recognition
WO2012164562A1 (en) Computer vision based control of a device using machine learning
KR101125233B1 (en) Fusion technology-based security method and security system thereof
CN103105924B (en) Man-machine interaction method and device
CN103324932A (en) Video-based vehicle detecting and tracking method and system
CN103686086A (en) Method for carrying out video monitoring on specific area
CN102779274A (en) Intelligent television face recognition method based on binocular camera
CN103713755A (en) Touch recognizing device and recognizing method
CN104391573A (en) Method and device for recognizing throwing action based on single attitude sensor
CN103150552A (en) Driving training management method based on people counting
CN114140745A (en) Method, system, device and medium for detecting personnel attributes of construction site
CN103152558A (en) Intrusion detection method based on scene recognition
CN111191535A (en) Pedestrian detection model construction method based on deep learning and pedestrian detection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Method and device for recognizing throwing action based on single attitude sensor

Effective date of registration: 20191104

Granted publication date: 20170503

Pledgee: Industrial Commercial Bank of China Ltd Beijing Cuiwei Road Branch

Pledgor: Beijing Huaru Technology Co.,Ltd.

Registration number: Y2019110000005

PE01 Entry into force of the registration of the contract for pledge of patent right
PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20211025

Granted publication date: 20170503

Pledgee: Industrial Commercial Bank of China Ltd. Beijing Cuiwei Road Branch

Pledgor: BEIJING HUARU TECHNOLOGY Co.,Ltd.

Registration number: Y2019110000005

PC01 Cancellation of the registration of the contract for pledge of patent right