Mobile terminal gesture recognition method and device and readable storage medium
1. A mobile terminal gesture recognition method is characterized in that the mobile terminal comprises a motion sensor, and the mobile terminal gesture recognition method comprises the following steps:
acquiring motion data of the mobile terminal through the motion sensor;
calling a mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to recognize the motion data and determine the gesture of the mobile terminal;
wherein the mobile terminal gesture recognition model is obtained through machine learning training that combines kinematics knowledge with training sample data, and is used for recognizing, from the motion data, a gesture that affects positioning of the mobile terminal.
2. The mobile terminal gesture recognition method according to claim 1, wherein the step of calling a mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the motion data and determines the gesture of the mobile terminal comprises:
calling the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model performs feature extraction on the motion data to obtain motion features of the motion data; and
comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features.
3. The mobile terminal gesture recognition method according to claim 2, wherein the motion features of the motion data include a dynamic/static feature;
the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features comprises:
causing the mobile terminal gesture recognition model to determine whether the dynamic/static feature conforms to a preset static feature; and
when the dynamic/static feature conforms to the preset static feature, determining that the gesture of the mobile terminal is a static gesture; otherwise, determining that it is a motion gesture.
4. The mobile terminal gesture recognition method according to claim 3, wherein the motion features of the motion data further include an amplitude feature;
the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further comprises the following steps, executed after the gesture of the mobile terminal is determined to be a motion gesture:
causing the mobile terminal gesture recognition model to determine whether the amplitude of the amplitude feature is higher than a preset amplitude threshold; and
when the amplitude of the amplitude feature is higher than the preset amplitude threshold, determining that the motion gesture is a swing gesture; otherwise, determining that it is a micro-motion gesture.
5. The mobile terminal gesture recognition method according to claim 4, wherein the motion features of the motion data further include a tilt feature.
6. The mobile terminal gesture recognition method according to claim 5, wherein the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further comprises the following steps, executed after the motion gesture is determined to be a swing gesture:
causing the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determining that the swing gesture is a longitudinal swing gesture; and
when the tilt feature conforms to the transverse tilt feature, determining that the swing gesture is a transverse swing gesture.
7. The mobile terminal gesture recognition method according to claim 5, wherein the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further comprises the following steps, executed after the motion gesture is determined to be a micro-motion gesture:
calling the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determining that the micro-motion gesture is a longitudinal micro-motion gesture; and
when the tilt feature conforms to the transverse tilt feature, determining that the micro-motion gesture is a transverse micro-motion gesture.
8. A mobile terminal gesture recognition apparatus, characterized in that the mobile terminal comprises a motion sensor, and the mobile terminal gesture recognition apparatus comprises:
an input module configured to acquire motion data of the mobile terminal collected by the motion sensor; and
a recognition module configured to call a mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the motion data and determines a gesture of the mobile terminal;
wherein the mobile terminal gesture recognition model is obtained through machine learning training that combines kinematics knowledge with training sample data, and is used for recognizing, from the motion data, a gesture that affects positioning of the mobile terminal.
9. The mobile terminal gesture recognition apparatus according to claim 8, wherein the recognition module comprises a motion feature extraction unit and a motion feature comparison unit:
the motion feature extraction unit is configured to call the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model performs feature extraction on the motion data to obtain motion features of the motion data; and
the motion feature comparison unit is configured to compare the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features.
10. The mobile terminal gesture recognition apparatus according to claim 9, wherein the motion features of the motion data include a dynamic/static feature, and the motion feature comparison unit comprises a static feature comparison unit;
the static feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the dynamic/static feature conforms to a preset static feature; and
when the dynamic/static feature conforms to the preset static feature, determine that the gesture of the mobile terminal is a static gesture; otherwise, determine that it is a motion gesture.
11. The mobile terminal gesture recognition apparatus according to claim 9, wherein the motion features of the motion data further include an amplitude feature, and the motion feature comparison unit further comprises an amplitude feature comparison unit;
the amplitude feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the amplitude of the amplitude feature is higher than a preset amplitude threshold; and
when the amplitude of the amplitude feature is higher than the preset amplitude threshold, determine that the motion gesture is a swing gesture; otherwise, determine that it is a micro-motion gesture.
12. The mobile terminal gesture recognition apparatus according to claim 11, wherein the motion features of the motion data further include a tilt feature, and the motion feature comparison unit further comprises a tilt feature comparison unit.
13. The mobile terminal gesture recognition apparatus according to claim 12, wherein the tilt feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determine that the swing gesture is a longitudinal swing gesture; and
when the tilt feature conforms to the transverse tilt feature, determine that the swing gesture is a transverse swing gesture.
14. The mobile terminal gesture recognition apparatus according to claim 12, wherein the tilt feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determine that the micro-motion gesture is a longitudinal micro-motion gesture; and
when the tilt feature conforms to the transverse tilt feature, determine that the micro-motion gesture is a transverse micro-motion gesture.
15. A computer-readable storage medium, characterized in that at least one instruction or program is stored in the storage medium, and the instruction or program is loaded and executed by a processor to implement the mobile terminal gesture recognition method according to any one of claims 1 to 7.
Background
A mobile terminal and a vehicle can establish a communication connection through a communication module, and the vehicle can locate the mobile terminal. The function of remotely locking the vehicle from the mobile terminal is realized according to the acquired positioning information of the mobile terminal. Generally, the mobile terminal and the vehicle establish the communication connection through Bluetooth, which operates at 2.4 GHz.
However, the Bluetooth signal of the mobile terminal may change with the gesture of the mobile terminal. For example, when the mobile terminal is held in landscape orientation, the Bluetooth antenna may be shielded by the hand, so that the Bluetooth signal is weakened, easily causing packet loss and locking failure.
Therefore, accurately recognizing the gesture of the mobile terminal is of practical significance for controlling a vehicle through the mobile terminal: the more accurate the gesture recognition model, the more effectively the mobile terminal can control the vehicle and prevent relay attacks.
At present, in the related art, a sensor collects raw data and raw features for different gestures of the mobile terminal, the collected data undergo the necessary feature engineering and machine learning model training, a recognition model meeting the project requirements is then selected, and the gesture of the mobile terminal is recognized by that model. Machine learning algorithms commonly used in this process include the SVM (Support Vector Machine), random forest, and the like.
However, because the recognition model obtained in the related art does not incorporate kinematics knowledge, some mobile phone gestures with very similar physical characteristics are difficult to distinguish when the model is used for gesture recognition.
Disclosure of Invention
The present application provides a mobile terminal gesture recognition method and apparatus and a readable storage medium, which can solve the problem in the related art that mobile phone gestures with very similar physical characteristics are difficult to distinguish.
To solve the technical problem described in the background, the present application provides a mobile terminal gesture recognition method, wherein the mobile terminal comprises a motion sensor, and the method comprises:
acquiring motion data of the mobile terminal through the motion sensor;
calling a mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to recognize the motion data and determine the gesture of the mobile terminal;
wherein the mobile terminal gesture recognition model is obtained through machine learning training that combines kinematics knowledge with training sample data, and is used for recognizing, from the motion data, a gesture that affects positioning of the mobile terminal.
Optionally, the step of calling a mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the motion data and determines the gesture of the mobile terminal includes:
calling the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model performs feature extraction on the motion data to obtain motion features of the motion data; and
comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features.
Optionally, the motion features of the motion data include a dynamic/static feature;
the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features includes:
causing the mobile terminal gesture recognition model to determine whether the dynamic/static feature conforms to a preset static feature; and
when the dynamic/static feature conforms to the preset static feature, determining that the gesture of the mobile terminal is a static gesture; otherwise, determining that it is a motion gesture.
Optionally, the motion features of the motion data further include an amplitude feature;
the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further includes the following steps, executed after the gesture of the mobile terminal is determined to be a motion gesture:
causing the mobile terminal gesture recognition model to determine whether the amplitude of the amplitude feature is higher than a preset amplitude threshold; and
when the amplitude of the amplitude feature is higher than the preset amplitude threshold, determining that the motion gesture is a swing gesture; otherwise, determining that it is a micro-motion gesture.
Optionally, the motion features of the motion data further include a tilt feature.
Optionally, the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further includes the following steps, executed after the motion gesture is determined to be a swing gesture:
causing the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determining that the swing gesture is a longitudinal swing gesture; and
when the tilt feature conforms to the transverse tilt feature, determining that the swing gesture is a transverse swing gesture.
Optionally, the step of comparing the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features further includes the following steps, executed after the motion gesture is determined to be a micro-motion gesture:
calling the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determining that the micro-motion gesture is a longitudinal micro-motion gesture; and
when the tilt feature conforms to the transverse tilt feature, determining that the micro-motion gesture is a transverse micro-motion gesture.
To solve the technical problem described in the background, another aspect of the present application provides a mobile terminal gesture recognition apparatus, wherein the mobile terminal comprises a motion sensor, and the apparatus comprises:
an input module configured to acquire motion data of the mobile terminal collected by the motion sensor; and
a recognition module configured to call a mobile terminal gesture recognition model so that the mobile terminal gesture recognition model recognizes the motion data and determines a gesture of the mobile terminal;
wherein the mobile terminal gesture recognition model is obtained through machine learning training that combines kinematics knowledge with training sample data, and is used for recognizing, from the motion data, a gesture that affects positioning of the mobile terminal.
Optionally, the recognition module comprises a motion feature extraction unit and a motion feature comparison unit:
the motion feature extraction unit is configured to call the mobile terminal gesture recognition model so that the mobile terminal gesture recognition model performs feature extraction on the motion data to obtain motion features of the motion data; and
the motion feature comparison unit is configured to compare the motion features with preset features through the mobile terminal gesture recognition model to obtain the mobile terminal gesture corresponding to the motion features.
Optionally, the motion features of the motion data include a dynamic/static feature, and the motion feature comparison unit comprises a static feature comparison unit;
the static feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the dynamic/static feature conforms to a preset static feature; and
when the dynamic/static feature conforms to the preset static feature, determine that the gesture of the mobile terminal is a static gesture; otherwise, determine that it is a motion gesture.
Optionally, the motion features of the motion data further include an amplitude feature, and the motion feature comparison unit further comprises an amplitude feature comparison unit;
the amplitude feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the amplitude of the amplitude feature is higher than a preset amplitude threshold; and
when the amplitude of the amplitude feature is higher than the preset amplitude threshold, determine that the motion gesture is a swing gesture; otherwise, determine that it is a micro-motion gesture.
Optionally, the motion features of the motion data further include a tilt feature, and the motion feature comparison unit further comprises a tilt feature comparison unit.
Optionally, the tilt feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determine that the swing gesture is a longitudinal swing gesture; and
when the tilt feature conforms to the transverse tilt feature, determine that the swing gesture is a transverse swing gesture.
Optionally, the tilt feature comparison unit is configured to:
cause the mobile terminal gesture recognition model to determine whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
when the tilt feature conforms to the longitudinal tilt feature, determine that the micro-motion gesture is a longitudinal micro-motion gesture; and
when the tilt feature conforms to the transverse tilt feature, determine that the micro-motion gesture is a transverse micro-motion gesture.
The present application also provides a computer-readable storage medium, in which at least one instruction or program is stored, and the instruction or program is loaded and executed by a processor to implement the mobile terminal gesture recognition method described above.
The technical solution has at least the following advantages: the motion data of the mobile terminal is acquired through the motion sensor, and because the mobile terminal gesture recognition model is obtained through training based on kinematics knowledge combined with the training sample data set, the model can recognize the collected motion data and determine the gesture of the mobile terminal. Gesture recognition that incorporates kinematics knowledge is thereby realized, improving the accuracy of mobile terminal gesture recognition.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description are briefly introduced below. Obviously, the drawings described below show some embodiments of the present application, and other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart illustrating a method for recognizing a gesture of a mobile terminal according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating an embodiment of step S2;
fig. 3 is a functional configuration diagram of a mobile terminal gesture recognition apparatus provided in an embodiment of the present application;
fig. 4 is a functional configuration diagram of a recognition module of a mobile terminal gesture recognition apparatus provided in an embodiment of the present application;
fig. 5 is a diagram showing a functional configuration of a motion feature comparison unit of the recognition module in one embodiment.
Detailed Description
The technical solutions in the present application will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present application. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; the connection can be mechanical connection or electrical connection; the two elements may be directly connected or indirectly connected through an intermediate medium, or may be communicated with each other inside the two elements, or may be wirelessly connected or wired connected. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
In addition, the technical features mentioned in the different embodiments of the present application described below may be combined with each other as long as they do not conflict with each other.
In the related art, the mobile terminal is located through wireless communication between the mobile terminal and the vehicle, and the vehicle is locked from the mobile terminal according to the positioning information of the mobile terminal. Generally, the mobile terminal and the vehicle establish the communication connection through Bluetooth, which operates at 2.4 GHz.
However, the Bluetooth signal of the mobile terminal may change with the gesture of the mobile terminal. For example, when the mobile terminal is held in landscape orientation, the Bluetooth antenna may be shielded by the hand, so that the Bluetooth signal is weakened, easily causing packet loss and locking failure.
Therefore, in the process of positioning the mobile terminal, the gesture of the mobile terminal can be recognized to assist accurate positioning, so that a gesture affecting the positioning is recognized in time and matched with a corresponding signal-compensation strategy.
Fig. 1 shows a flowchart of the mobile terminal gesture recognition method provided in an embodiment of the present application. Referring to fig. 1, the method includes the following steps:
Step S1: acquire the motion data of the mobile terminal through the motion sensor.
The mobile terminal is provided with a motion sensor, that is, a device that detects motion of the mobile terminal; the motion may include any one or more of moving, tilting, shaking, rotating, and swinging.
Optionally, the motion sensor may include any one or a combination of a tilt sensor, an acceleration sensor, a rotation sensor, and a vibration frequency sensor.
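As an illustrative sketch (not code from the application), step S1 amounts to collecting a window of sensor samples for later recognition. Here `read_motion_sample` is a hypothetical stand-in for a real sensor driver, simulated with noise around the gravity vector of a still device:

```python
import random

def read_motion_sample():
    # Hypothetical stand-in for a real motion-sensor driver: one
    # (ax, ay, az) accelerometer reading in m/s^2, simulated here
    # as noise around the gravity vector of a flat, still phone.
    return (random.gauss(0.0, 0.05),
            random.gauss(0.0, 0.05),
            random.gauss(9.81, 0.05))

def collect_window(n_samples=50):
    # Collect a fixed-length window of motion data; gesture
    # recognition operates on windows rather than single readings.
    return [read_motion_sample() for _ in range(n_samples)]

window = collect_window()
```

In a real device, `read_motion_sample` would be replaced by the platform's sensor API; the window length is an arbitrary placeholder.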
Step S2: and calling a mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to recognize the motion data and determine the gesture of the mobile terminal.
The mobile terminal gesture recognition model is obtained through machine learning training based on kinematics knowledge combined with a training sample data set, and is used for recognizing, from the collected motion data of the mobile terminal, a gesture that affects its positioning.
The training sample data set may include gestures of the mobile terminal that may affect its positioning, together with the mobile terminal motion data corresponding to each gesture. The motion data corresponding to a gesture can be collected by the motion sensor and obtained through the necessary feature extraction.
The kinematics knowledge consists of kinematic laws embodied in the mobile terminal gesture recognition model in algorithmic form, including Newton's first law, Kepler's laws, the law of universal gravitation, the law of conservation of kinetic energy, and other kinematic laws.
The mobile terminal gesture recognition model in the related art is trained only on a training sample data set; because the elements of such a data set are limited, recognizing the gesture of the mobile terminal with that model yields a large error. The mobile terminal gesture recognition model of the present application combines kinematics knowledge with the training sample data set, which can greatly reduce the recognition error caused by the limited elements of the training sample data set.
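One concrete way to read "combining kinematics knowledge with training data" is to derive physically meaningful features from the raw sensor data before training. The sketch below is an illustration, not the application's actual feature set: it computes motion intensity as the variance of the acceleration magnitude, and tilt as the angle between the mean acceleration vector (which for a slow-moving device approximates gravity) and the device's z axis:

```python
import math

def kinematic_features(samples):
    # Derive physically meaningful features from raw accelerometer
    # samples (ax, ay, az): the gravity direction gives the tilt of
    # the device, and the spread of the acceleration magnitude
    # reflects motion intensity.
    mags = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in samples]
    mean_mag = sum(mags) / len(mags)
    # Motion intensity: variance of the magnitude around its mean.
    var_mag = sum((m - mean_mag) ** 2 for m in mags) / len(mags)
    # Tilt: angle between the mean acceleration vector and the z axis.
    mx = sum(s[0] for s in samples) / len(samples)
    my = sum(s[1] for s in samples) / len(samples)
    mz = sum(s[2] for s in samples) / len(samples)
    norm = math.sqrt(mx**2 + my**2 + mz**2) or 1.0
    tilt_deg = math.degrees(math.acos(max(-1.0, min(1.0, mz / norm))))
    return {"motion_var": var_mag, "tilt_deg": tilt_deg}

# Example: a device lying flat and still (gravity on the z axis only)
flat = [(0.0, 0.0, 9.81)] * 20
f = kinematic_features(flat)  # motion_var == 0.0, tilt_deg == 0.0
```

Features of this kind could then be fed into any of the classifiers mentioned in the background (SVM, random forest); the point of the sketch is only that the features themselves encode kinematic structure.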
In summary, in the method provided in this embodiment, the motion data of the mobile terminal is acquired through the motion sensor; because the mobile terminal gesture recognition model is obtained through training based on kinematics knowledge combined with the training sample data set, the model can recognize the collected motion data and determine the gesture of the mobile terminal, thereby realizing gesture recognition that incorporates kinematics knowledge and improving the accuracy of mobile terminal gesture recognition.
When the mobile terminal is in different gestures, the motion features reflected by its motion data differ. For example, the motion data collected by the motion sensor when the mobile terminal is stationary differ greatly from those collected when it is moving: in the stationary state the motion data conform to a static feature, while in the moving state they do not.
Therefore, when determining the gesture of the mobile terminal, feature extraction can first be performed on the motion data to obtain the corresponding motion features, and feature comparison and recognition can then be carried out on the obtained motion features. Referring to fig. 2, which shows a flowchart of an embodiment of step S2, step S2 includes step S21 and step S22 performed in sequence, wherein:
step S21: and calling the mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to perform feature extraction on the motion data to obtain the motion features of the motion data.
Step S22: and enabling the mobile terminal gesture recognition model to recognize the motion characteristics of the motion data to obtain the mobile terminal gesture corresponding to the motion characteristics.
Optionally, the motion characteristics described in steps S21 and S22 include dynamic and static characteristics, which can reflect whether the mobile terminal is in a stationary state or in a moving state.
When the motion feature includes a dynamic feature and a static feature, the step S21 includes: and calling the mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to perform feature extraction on the motion data to obtain the dynamic and static features of the motion data. With continued reference to fig. 2, the above step S22 includes sequentially performing:
step S221: and judging whether the dynamic and static characteristics accord with preset static characteristics or not by the mobile terminal gesture recognition model.
Step S222: and when the dynamic and static characteristics accord with the preset static characteristics, determining the gesture of the mobile terminal to be a static gesture, otherwise, determining the gesture to be a motion gesture.
The preset static features are preset in the mobile terminal gesture recognition model and are configured as the criterion for judging whether the extracted dynamic and static features represent a static gesture or a motion gesture.
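A minimal sketch of the judgment in steps S221 and S222, assuming the dynamic/static feature is a scalar variance and the preset static feature is a threshold (both hypothetical choices, not taken from the disclosure):

```python
def classify_dynamic_static(variance, static_threshold=0.05):
    # Variance below the preset static threshold -> stationary terminal
    # (step S221); otherwise the terminal is taken to be moving (step S222).
    return "static" if variance < static_threshold else "motion"
```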
Steps S221 and S222 distinguish the static gesture from the motion gesture among the gestures of the mobile terminal. However, a motion gesture may have different amplitudes, and different amplitudes affect the positioning of the mobile terminal differently. Therefore, after the gesture of the mobile terminal is determined to be a motion gesture, the motion amplitude of the motion gesture can be determined from its amplitude feature, and the gesture the mobile terminal exhibits at that amplitude can be determined from the motion amplitude.
To further determine the amplitude gesture, the motion features further include an amplitude feature, and step S21 further includes: calling the mobile terminal gesture recognition model so that the model performs feature extraction on the motion data to obtain the amplitude feature of the motion data. With continued reference to fig. 2, after step S222 determines the gesture of the mobile terminal to be a motion gesture, step S22 further includes:
step S223: and judging whether the amplitude of the amplitude characteristic is higher than a preset amplitude threshold value or not by the mobile terminal gesture recognition model.
Step S224: and when the amplitude of the amplitude characteristic is higher than the preset amplitude threshold value, determining that the motion attitude is a swing attitude, otherwise, determining that the motion attitude is a micro-motion attitude.
The preset amplitude threshold is preset in the mobile terminal gesture recognition model and is configured as the criterion for judging whether the extracted amplitude feature represents a swing gesture or a micro-motion gesture.
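Steps S223 and S224 could be sketched as a peak-to-peak comparison against the preset amplitude threshold; the amplitude definition, threshold value, and units here are illustrative assumptions only:

```python
def classify_amplitude(accel_mags, amplitude_threshold=2.0):
    # Peak-to-peak spread of the acceleration magnitude (m/s^2) serves as
    # the amplitude feature; above the threshold -> swing, else micro-motion.
    amplitude = max(accel_mags) - min(accel_mags)
    return "swing" if amplitude > amplitude_threshold else "micro-motion"
```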
Through steps S223 and S224, the swing gesture and the micro-motion gesture, which differ in motion amplitude, can be distinguished among the gestures of the mobile terminal. However, the same swing gesture may correspond to different holding and carrying postures. Therefore, after the gesture of the mobile terminal is determined to be a swing gesture, the degree of tilt of the mobile terminal can be determined from the tilt feature, and whether the mobile terminal is held horizontally or vertically can be determined from that degree of tilt.
To further determine the specific holding and carrying posture, the motion features further include a tilt feature, and step S21 further includes: calling the mobile terminal gesture recognition model so that the model performs feature extraction on the motion data to obtain the tilt feature of the motion data. With continued reference to fig. 2, after step S224 determines the gesture of the mobile terminal to be a swing gesture, step S22 further includes:
step S225: and judging whether the inclined characteristic accords with a longitudinal inclined characteristic or a transverse inclined characteristic by the mobile terminal posture recognition model.
Step S226: and when the inclination characteristic accords with the longitudinal inclination characteristic, determining the swing gesture as a longitudinal swing gesture.
Step S227: and when the inclination characteristic accords with the transverse inclination characteristic, determining the swing gesture as a transverse swing gesture.
The longitudinal tilt feature is preset in the mobile terminal gesture recognition model and is configured as the criterion for judging whether the extracted tilt feature represents a longitudinal gesture. The transverse tilt feature is likewise preset in the model and is configured as the criterion for judging whether the extracted tilt feature represents a transverse gesture.
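One plausible realization of the longitudinal/transverse judgment in steps S225 to S227 derives a pitch angle from the time-averaged gravity components; the axis convention and the 45-degree boundary are hypothetical choices, not taken from the disclosure:

```python
import math

def classify_tilt(mean_ax, mean_ay, mean_az, pitch_threshold_deg=45.0):
    # Pitch of the device's long (y) axis relative to horizontal: when the
    # y axis carries most of gravity the terminal is held upright
    # (longitudinal); otherwise it lies flat or sideways (transverse).
    pitch = math.degrees(math.atan2(mean_ay, math.hypot(mean_ax, mean_az)))
    return "longitudinal" if abs(pitch) > pitch_threshold_deg else "transverse"
```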
Similarly, the same micro-motion gesture may also correspond to different holding and carrying postures. Therefore, after the gesture of the mobile terminal is determined to be a micro-motion gesture, the degree of tilt of the mobile terminal can be determined from the tilt feature, and whether the micro-motion gesture of the mobile terminal is horizontal or vertical can be determined from that degree of tilt.
To further determine the specific holding and carrying posture, the motion features further include the tilt feature. With continued reference to fig. 2, after step S224 determines the gesture of the mobile terminal to be a micro-motion gesture, step S22 further includes:
step S228: and judging whether the inclined characteristic accords with a longitudinal inclined characteristic or a transverse inclined characteristic by the mobile terminal posture recognition model.
Step S229: and when the inclination characteristic accords with the longitudinal inclination characteristic, determining the micro-motion attitude as a longitudinal micro-motion attitude.
Step S2210: and when the tilt feature conforms to the transverse tilt feature, determining the micro-motion attitude to be a transverse micro-motion attitude.
The longitudinal tilt feature and the transverse tilt feature used here are the same criteria described above: each is preset in the mobile terminal gesture recognition model and configured as the criterion for judging whether the extracted tilt feature represents a longitudinal or a transverse direction.
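Putting steps S221 through S2210 together, the whole recognition flow amounts to a small decision tree; the feature values and thresholds below are hypothetical stand-ins for whatever the trained model actually learns:

```python
def recognize_posture(variance, amplitude, pitch_deg,
                      static_thr=0.05, amp_thr=2.0, pitch_thr=45.0):
    # Steps S221/S222: stationary vs moving.
    if variance < static_thr:
        return "static"
    # Steps S223/S224: swing vs micro-motion by amplitude.
    base = "swing" if amplitude > amp_thr else "micro-motion"
    # Steps S225-S2210: longitudinal vs transverse by tilt.
    orientation = "longitudinal" if abs(pitch_deg) > pitch_thr else "transverse"
    return orientation + " " + base
```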
Fig. 3 is a functional configuration diagram illustrating a mobile terminal gesture recognition apparatus according to an embodiment of the present application, which may be implemented as a part of the mobile terminal, and the mobile terminal further includes a motion sensor 33, where the apparatus includes: an input module 31 and an identification module 32.
The input module 31 is configured to acquire the motion data of the mobile terminal acquired by the motion sensor 33.
Optionally, the input module 31 can obtain any one or more combinations of moving and static feature data, motion amplitude feature data or inclination feature data of the mobile terminal.
The recognition module 32 is configured to invoke a mobile terminal pose recognition model, such that the mobile terminal pose recognition model recognizes the motion data, determining a pose of the mobile terminal.
The mobile terminal gesture recognition model is obtained by machine learning training based on kinematics knowledge in combination with training sample data and is used for recognizing, from the motion data, a gesture that affects positioning of the mobile terminal. The kinematics knowledge consists of kinematic laws embodied in the model in algorithmic form, including Newton's first law, Kepler's laws, the law of universal gravitation, and the law of conservation of kinetic energy, among others.
Optionally, the recognition module 32 prestores feature data for recognizing a gesture, and can correspondingly compare any one or a combination of a plurality of kinds of the obtained moving and static feature data, motion amplitude feature data, or inclination feature data of the mobile terminal with preset feature data to recognize and determine a corresponding motion gesture.
The device provided by this embodiment acquires the motion data of the mobile terminal through the motion sensor. Because the mobile terminal gesture recognition model is obtained by training based on kinematics knowledge in combination with training sample data, the model can recognize the collected motion data and determine the gesture of the mobile terminal, thereby realizing gesture recognition that incorporates kinematics knowledge and improving the accuracy of gesture recognition of the mobile terminal.
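The division of labor between the input module 31 and the recognition module 32 might be sketched as follows; the sensor and model interfaces are hypothetical, invented only to show the data flow between the modules:

```python
class InputModule:
    """Acquires motion data from the motion sensor (module 31)."""
    def __init__(self, sensor):
        self.sensor = sensor

    def acquire(self, n_samples=50):
        # Buffer a fixed-length window of raw sensor samples.
        return [self.sensor.read() for _ in range(n_samples)]

class RecognitionModule:
    """Invokes the posture recognition model on acquired data (module 32)."""
    def __init__(self, model):
        self.model = model

    def recognize(self, motion_data):
        features = self.model.extract_features(motion_data)
        return self.model.compare(features)
```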
However, when the mobile terminal is in different postures, the motion features reflected by its motion data differ. For example, the motion data collected by the motion sensor when the mobile terminal is stationary differ significantly from the data collected when it is moving. That is, when the mobile terminal is in a static state, its motion data conform to the static feature, and when it is in a motion state, its motion data do not conform to the static feature.
Therefore, when the posture of the mobile terminal is determined, the motion data can be subjected to feature extraction to obtain corresponding motion features, and then feature comparison and identification are carried out according to the obtained motion features. Referring to fig. 4, there is shown a functional configuration diagram of a recognition module configured as a motion feature extraction unit 321 and a motion feature comparison unit 322, in which:
the motion feature extraction unit 321 is configured to invoke the mobile terminal gesture recognition model, so that the mobile terminal gesture recognition model performs feature extraction on the motion data to obtain the motion features of the motion data.
The motion feature comparison unit 322 is configured to compare and identify the motion feature with a preset feature through the mobile terminal gesture recognition model, so as to obtain a mobile terminal gesture corresponding to the motion feature.
Optionally, referring to fig. 5, which shows a functional configuration diagram of an embodiment of the motion characteristic comparison unit 322, the motion characteristic comparison unit includes: a stationary feature comparing unit 3221; the stationary feature comparing unit 3221 is configured to:
enabling the mobile terminal gesture recognition model to judge whether the motion characteristics accord with preset static characteristics or not;
and when the motion characteristics accord with preset static characteristics, determining the gesture of the mobile terminal to be a static gesture, otherwise, determining the gesture to be a motion gesture.
The static feature comparison unit can distinguish the static gesture from the motion gesture among the gestures of the mobile terminal. However, a motion gesture may have different amplitudes, and different amplitudes affect the positioning of the mobile terminal differently. Therefore, after the gesture of the mobile terminal is determined to be a motion gesture, the motion amplitude of the motion gesture can be determined from its amplitude feature, and the gesture the mobile terminal exhibits at that amplitude can be determined from the motion amplitude.
To further determine the amplitude pose, the motion features further comprise amplitude features, the motion feature extraction unit further comprises an amplitude feature extraction unit configured to: and calling the mobile terminal gesture recognition model to enable the mobile terminal gesture recognition model to perform feature extraction on the motion data to obtain the amplitude features of the motion data.
With continuing reference to fig. 5, the motion feature comparing unit further includes an amplitude feature comparing unit 3222, the amplitude feature comparing unit 3222 is configured to:
and enabling the mobile terminal gesture recognition model to judge whether the amplitude of the amplitude feature is higher than a preset amplitude threshold value.
When the amplitude of the amplitude feature is higher than the preset amplitude threshold, the motion gesture is determined to be a swing gesture; otherwise, it is determined to be a micro-motion gesture.
Through the amplitude feature comparison unit 3222, the swing gesture and the micro-motion gesture, which differ in motion amplitude, can be distinguished among the gestures of the mobile terminal. However, the same amplitude gesture may correspond to different holding and carrying postures. Therefore, after the amplitude gesture of the mobile terminal is determined to be a swing gesture, the degree of tilt of the mobile terminal can be determined from the tilt feature, and whether the mobile terminal is held horizontally or vertically can be determined from that degree of tilt.
To further determine the specific holding and carrying posture, the motion features further include a tilt feature, and the motion feature extraction unit further includes a tilt feature extraction unit configured to: call the mobile terminal gesture recognition model so that the model performs feature extraction on the motion data to obtain the tilt feature of the motion data.
The motion feature comparison unit 322 further includes a tilt feature comparison unit 3223, wherein the tilt feature comparison unit 3223 is configured to:
judge, through the mobile terminal gesture recognition model, whether the tilt feature conforms to a longitudinal tilt feature or a transverse tilt feature;
determine the swing gesture to be a longitudinal swing gesture when the tilt feature conforms to the longitudinal tilt feature; and
determine the swing gesture to be a transverse swing gesture when the tilt feature conforms to the transverse tilt feature.
The tilt feature comparison unit 3223 is further configured to:
judge, through the mobile terminal gesture recognition model, whether the tilt feature conforms to the longitudinal tilt feature or the transverse tilt feature;
determine the micro-motion gesture to be a longitudinal micro-motion gesture when the tilt feature conforms to the longitudinal tilt feature; and
determine the micro-motion gesture to be a transverse micro-motion gesture when the tilt feature conforms to the transverse tilt feature.
Through the tilt feature comparison unit 3223, after the gesture of the mobile terminal is determined to be a swing gesture or a micro-motion gesture, the degree of tilt of the mobile terminal can be determined from the tilt feature, and whether the swing gesture or micro-motion gesture of the mobile terminal is horizontal or vertical can be determined from that degree of tilt, so that different holding and carrying postures with the same motion amplitude can be distinguished.
The present application also provides a computer-readable storage medium in which at least one instruction or program is stored, the instruction or program being loaded and executed by a processor to implement the mobile terminal gesture recognition method described above.
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaustively enumerate all embodiments here. Obvious variations or modifications derived therefrom remain within the scope of protection of this invention.