Target attitude estimation method, system and medium based on optical radar image fusion
1. A target attitude estimation method based on optical radar image fusion is characterized by comprising the following steps:
acquiring a first echo signal and first optical observation data of a target area, and obtaining a first radar image and a first optical image which are time-synchronized according to the first echo signal and the first optical observation data;
extracting two target feature structures from the first radar image and the first optical image, and determining a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image;
constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by using a particle swarm algorithm to obtain an instantaneous attitude parameter and a size parameter of the target feature structure;
constructing a second optimization model according to the second projection length, the third projection length and the fourth projection length, and solving the second optimization model by using a global search algorithm to obtain a Doppler direction vector of the first radar image;
and constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain a radar equivalent rotation parameter, and calculating according to the radar equivalent rotation parameter to obtain a target spin rotation parameter.
2. The method according to claim 1, wherein the step of obtaining the first echo signal and the first optical observation data of the target region and obtaining the time-synchronized first radar image and the time-synchronized first optical image according to the first echo signal and the first optical observation data specifically comprises:
receiving a first echo signal within a preset time period through an ISAR radar system, and receiving first optical observation data within the preset time period through an optical sensor;
performing time synchronization processing on the first echo signal and the first optical observation data to obtain a second echo signal and second optical observation data;
obtaining a first optical image according to the second optical observation data, and performing range-Doppler processing on the second echo signal to obtain a first radar image;
and determining the distance direction vector of the first radar image according to the first target tracking parameter recorded by the ISAR radar system, and determining the horizontal direction vector and the vertical direction vector of the first optical image according to the second target tracking parameter recorded by the optical sensor.
3. The target attitude estimation method based on optical radar image fusion according to claim 1, wherein the first optimization model comprises a first objective function, the first objective function being:
wherein α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure, L_i denotes the size parameter of the i-th target feature structure, the three direction vectors denote, respectively, the distance direction vector of the first radar image, the horizontal direction vector of the first optical image and the vertical direction vector of the first optical image, r_i denotes the first projection length of the i-th target feature structure, v_i denotes the third projection length of the i-th target feature structure, and u_i denotes the fourth projection length of the i-th target feature structure.
4. The method according to claim 1, wherein the step of solving the first optimization model by using a particle swarm algorithm to obtain instantaneous attitude parameters and size parameters of the target feature structure specifically comprises:
setting the shortest moving distance, and randomly generating a first particle swarm in a feasible region of the first optimization model, wherein the feasible solution of the first particle swarm is a first individual position;
searching according to the first objective function to obtain the current optimal particle position and the current optimal particle swarm position;
after adjusting the particle position in the first particle swarm, updating the current optimal particle position and the current optimal particle swarm position until the maximum iteration step number or the minimum change criterion is met;
and determining the instantaneous attitude parameter and the size parameter of the target characteristic structure according to the current optimal first individual position.
5. The target attitude estimation method based on optical radar image fusion according to claim 1, wherein the second optimization model comprises a second objective function, the second objective function being:
wherein θ_optical denotes the included angle between the Doppler direction of the first radar image and the vertical direction of the first optical image, with −π ≤ θ_optical ≤ π, d_i denotes the second projection length of the i-th target feature structure, v_i denotes the third projection length of the i-th target feature structure, u_i denotes the fourth projection length of the i-th target feature structure, and i ∈ {1, 2};
the Doppler direction vector of the first radar image is:
wherein the three direction vectors denote, respectively, the Doppler direction vector of the first radar image, the horizontal direction vector of the first optical image and the vertical direction vector of the first optical image.
6. The method of claim 1, wherein the third optimization model comprises a third objective function, and the third objective function is:
wherein α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure, L_i denotes the size parameter of the i-th target feature structure, d_i denotes the second projection length of the i-th target feature structure, ω_eff denotes the radar equivalent rotation parameter, and the remaining direction vector denotes the Doppler direction vector of the first radar image.
7. The method for estimating the target attitude based on the optical radar image fusion according to any one of claims 1 to 6, characterized in that the target spin rotation parameter is calculated according to the following formula:
wherein the first vector denotes the target spin rotation parameter, ω_eff denotes the radar equivalent rotation parameter, the second vector denotes the Doppler direction vector of the first radar image, and the third vector denotes the radar line-of-sight rotation component caused by the orbital motion of the target.
8. A target attitude estimation system based on optical radar image fusion is characterized by comprising:
the first radar image and first optical image acquisition module, which is used for acquiring a first echo signal and first optical observation data of a target area and obtaining a time-synchronized first radar image and first optical image according to the first echo signal and the first optical observation data;
the projection length determining module is used for extracting two target feature structures from the first radar image and the first optical image, and determining a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image;
the instantaneous attitude parameter and size parameter determining module is used for constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by utilizing a particle swarm algorithm to obtain instantaneous attitude parameters and size parameters of the target feature structure;
a doppler direction vector determination module, configured to construct a second optimization model according to the second projection length, the third projection length, and the fourth projection length, and solve the second optimization model by using a global search algorithm to obtain a doppler direction vector of the first radar image;
and the target spin rotation parameter determining module is used for constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain radar equivalent rotation parameters, and calculating according to the radar equivalent rotation parameters to obtain target spin rotation parameters.
9. A target attitude estimation device based on optical radar image fusion, characterized by comprising:
at least one processor;
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the target attitude estimation method based on optical radar image fusion according to any one of claims 1 to 7.
10. A computer-readable storage medium in which a processor-executable program is stored, wherein the processor-executable program, when executed by a processor, is configured to perform the target attitude estimation method based on optical radar image fusion according to any one of claims 1 to 7.
Background
Estimating the instantaneous state of a spinning space target is of great significance for understanding its on-orbit operating state and assessing the aerospace development situation. The technique of accurately obtaining the absolute attitude of key payload components of a space target and the overall motion law of the target from images acquired by inverse synthetic aperture radar (ISAR) can be applied in civil and military fields, including space target fault rescue and threat assessment, and is currently a practical technique for realizing on-orbit state estimation of spinning space targets.
At present, there are two main approaches to measuring the on-orbit state of a spinning space target: first, measuring, with a laser sensor, the range variation of a corner-cube reflector mounted on the space target to determine the on-orbit operating state of the target; second, generating 2D radar images from assumed 3D model state parameters of the target and determining the on-orbit state of the target according to the degree of agreement between the generated 2D radar images and the observed target images. However, most existing schemes rely on strong prior conditions such as an accumulated database or azimuth calibration, and the influence of target spin on the observed features is rarely parameterized and modeled. These data-driven algorithms are therefore difficult to apply to non-cooperative targets lacking such observation priors, and are limited in practical space target attitude measurement.
Disclosure of Invention
The present invention aims to solve at least to some extent one of the technical problems existing in the prior art.
To this end, an object of an embodiment of the present invention is to provide a target attitude estimation method based on optical radar image fusion. The method first acquires a radar echo signal and optical observation data of a target area and obtains a time-synchronized first radar image and first optical image. It then extracts a first projection length and a second projection length of a target feature structure in the distance direction and the Doppler direction of the first radar image, and a third projection length and a fourth projection length of the target feature structure in the vertical direction and the horizontal direction of the first optical image. From these projection lengths, it sequentially constructs a first optimization model for the instantaneous attitude of the target feature structure, a second optimization model for the Doppler direction vector of the radar image, and a third optimization model for the target spin rotation parameter, and solves them with a particle swarm algorithm and a global search algorithm, thereby realizing estimation of the instantaneous state of the spinning space target.
Another objective of the embodiments of the present invention is to provide a target attitude estimation system based on optical radar image fusion.
In order to achieve the technical purpose, the technical scheme adopted by the embodiment of the invention comprises the following steps:
in a first aspect, an embodiment of the present invention provides a target attitude estimation method based on optical radar image fusion, including the following steps:
acquiring a first echo signal and first optical observation data of a target area, and obtaining a first radar image and a first optical image which are time-synchronized according to the first echo signal and the first optical observation data;
extracting two target feature structures from the first radar image and the first optical image, and determining a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image;
constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by using a particle swarm algorithm to obtain an instantaneous attitude parameter and a size parameter of the target feature structure;
constructing a second optimization model according to the second projection length, the third projection length and the fourth projection length, and solving the second optimization model by using a global search algorithm to obtain a Doppler direction vector of the first radar image;
and constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain a radar equivalent rotation parameter, and calculating according to the radar equivalent rotation parameter to obtain a target spin rotation parameter.
Further, in an embodiment of the present invention, the step of acquiring a first echo signal and first optical observation data of a target area, and obtaining a time-synchronized first radar image and a time-synchronized first optical image according to the first echo signal and the first optical observation data specifically includes:
receiving a first echo signal within a preset time period through an ISAR radar system, and receiving first optical observation data within the preset time period through an optical sensor;
performing time synchronization processing on the first echo signal and the first optical observation data to obtain a second echo signal and second optical observation data;
obtaining a first optical image according to the second optical observation data, and performing range-Doppler processing on the second echo signal to obtain a first radar image;
and determining the distance direction vector of the first radar image according to the first target tracking parameter recorded by the ISAR radar system, and determining the horizontal direction vector and the vertical direction vector of the first optical image according to the second target tracking parameter recorded by the optical sensor.
Further, in one embodiment of the present invention, the first optimization model includes a first objective function, and the first objective function is:
wherein i ∈ {1, 2}, α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure, L_i denotes the size parameter of the i-th target feature structure, the three direction vectors denote, respectively, the distance direction vector of the first radar image, the horizontal direction vector of the first optical image and the vertical direction vector of the first optical image, r_i denotes the first projection length of the i-th target feature structure, v_i denotes the third projection length of the i-th target feature structure, and u_i denotes the fourth projection length of the i-th target feature structure.
Further, in an embodiment of the present invention, the step of solving the first optimization model by using a particle swarm optimization algorithm to obtain an instantaneous attitude parameter and a size parameter of the target feature specifically includes:
setting the shortest moving distance, and randomly generating a first particle swarm in a feasible region of the first optimization model, wherein the feasible solution of the first particle swarm is a first individual position;
searching according to the first objective function to obtain the current optimal particle position and the current optimal particle swarm position;
after adjusting the particle position in the first particle swarm, updating the current optimal particle position and the current optimal particle swarm position until the maximum iteration step number or the minimum change criterion is met;
and determining the instantaneous attitude parameter and the size parameter of the target characteristic structure according to the current optimal first individual position.
Further, in an embodiment of the present invention, the second optimization model includes a second objective function, and the second objective function is:
wherein θ_optical denotes the included angle between the Doppler direction of the first radar image and the vertical direction of the first optical image, with −π ≤ θ_optical ≤ π, d_i denotes the second projection length of the i-th target feature structure, v_i denotes the third projection length of the i-th target feature structure, u_i denotes the fourth projection length of the i-th target feature structure, and i ∈ {1, 2};
the Doppler direction vector of the first radar image is:
wherein the three direction vectors denote, respectively, the Doppler direction vector of the first radar image, the horizontal direction vector of the first optical image and the vertical direction vector of the first optical image.
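For illustration only, the construction of the Doppler direction vector can be sketched in code. This is a minimal sketch under an assumption about the formula omitted from the text: the Doppler direction is taken to be the optical vertical axis rotated by θ_optical within the image plane spanned by the horizontal and vertical direction vectors. The function name and signature are hypothetical, not part of the embodiment.

```python
import numpy as np

def doppler_direction(theta_optical, u_axis, v_axis):
    """Doppler direction vector of the radar image, expressed in the
    optical image plane.

    theta_optical: angle between the Doppler direction and the optical
                   vertical axis, in radians (-pi <= theta <= pi).
    u_axis, v_axis: unit horizontal and vertical direction vectors of
                    the first optical image (assumed orthonormal).
    """
    d = (np.cos(theta_optical) * np.asarray(v_axis, float)
         + np.sin(theta_optical) * np.asarray(u_axis, float))
    return d / np.linalg.norm(d)
```

With orthonormal axes, theta_optical = 0 returns the vertical axis itself, as the angle definition suggests.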
Further, in an embodiment of the present invention, the third optimization model includes a third objective function, and the third objective function is:
wherein i ∈ {1, 2}, α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure, L_i denotes the size parameter of the i-th target feature structure, d_i denotes the second projection length of the i-th target feature structure, ω_eff denotes the radar equivalent rotation parameter, and the remaining direction vector denotes the Doppler direction vector of the first radar image.
Further, in one embodiment of the present invention, the target spin rotation parameter is calculated according to the following formula:
wherein the first vector denotes the target spin rotation parameter, ω_eff denotes the radar equivalent rotation parameter, the second vector denotes the Doppler direction vector of the first radar image, and the third vector denotes the radar line-of-sight rotation component caused by the orbital motion of the target.
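The spin computation above can be sketched for illustration. Since the patent's formula is not reproduced in the text, this sketch relies on an assumed decomposition common in ISAR analysis: the equivalent rotation vector has magnitude ω_eff and points along the cross product of the line-of-sight and Doppler directions (perpendicular to both), and the spin component is obtained by removing the line-of-sight rotation caused by the target's orbital motion. All names and the decomposition itself are assumptions.

```python
import numpy as np

def spin_rotation(omega_eff, los_dir, doppler_dir, omega_orbit):
    """Target spin rotation vector from the radar equivalent rotation.

    omega_eff:   scalar radar equivalent rotation parameter (rad/s).
    los_dir:     unit radar line-of-sight direction vector.
    doppler_dir: unit Doppler direction vector of the radar image.
    omega_orbit: line-of-sight rotation vector caused by orbital motion.
    """
    # Assumed orientation of the equivalent rotation vector: normal to
    # both the line of sight and the Doppler direction.
    axis = np.cross(los_dir, doppler_dir)
    axis = axis / np.linalg.norm(axis)
    omega_eff_vec = omega_eff * axis
    # Remove the orbit-induced component to leave the spin contribution.
    return omega_eff_vec - np.asarray(omega_orbit, float)
```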
In a second aspect, an embodiment of the present invention provides a target attitude estimation system based on optical radar image fusion, including:
the first radar image and first optical image acquisition module, which is used for acquiring a first echo signal and first optical observation data of a target area and obtaining a time-synchronized first radar image and first optical image according to the first echo signal and the first optical observation data;
the projection length determining module is used for extracting two target feature structures from the first radar image and the first optical image, and determining a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image;
the instantaneous attitude parameter and size parameter determining module is used for constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by utilizing a particle swarm algorithm to obtain instantaneous attitude parameters and size parameters of the target feature structure;
a doppler direction vector determination module, configured to construct a second optimization model according to the second projection length, the third projection length, and the fourth projection length, and solve the second optimization model by using a global search algorithm to obtain a doppler direction vector of the first radar image;
and the target spin rotation parameter determining module is used for constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain radar equivalent rotation parameters, and calculating according to the radar equivalent rotation parameters to obtain target spin rotation parameters.
In a third aspect, an embodiment of the present invention provides a target attitude estimation apparatus based on optical radar image fusion, including:
at least one processor;
at least one memory for storing at least one program;
when executed by the at least one processor, the at least one program causes the at least one processor to implement a method for target pose estimation based on optical radar image fusion as described above.
In a fourth aspect, the present invention further provides a computer-readable storage medium, in which a processor-executable program is stored, and the processor-executable program is configured to, when executed by a processor, perform the above-mentioned target attitude estimation method based on optical radar image fusion.
Advantages and benefits of the present invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention:
according to the method, firstly, radar echo signals and optical observation data of a target area are obtained, a first radar image and a first optical image which are time-synchronized are obtained, then a first projection length and a second projection length of a target feature structure in the distance direction and the Doppler direction are extracted from the first radar image, a third projection length and a fourth projection length of the target feature structure in the vertical direction and the horizontal direction are extracted from the first optical image, then a first optimization model related to the instantaneous attitude of the target feature structure, a second optimization model related to the Doppler direction vector of the radar image and a third optimization model related to the target spin rotation parameter are sequentially constructed according to the projection lengths, and a particle swarm algorithm and a global search algorithm are utilized for solving, so that the estimation of the target instantaneous state in a spin space is achieved. The method and the device can realize accurate inversion of the on-orbit instantaneous state of the spin space target, reduce the influence on the overall estimation result caused by the structure extraction error in the practical application process by adopting the joint optimization estimation of the projection characteristic fusion of the optical image and the radar image, realize accurate estimation of the instantaneous state of the spin space target with a complex geometric form, and improve the accuracy of the attitude estimation of the spin space target.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart illustrating steps of a target attitude estimation method based on optical radar image fusion according to an embodiment of the present invention;
fig. 2 is an analysis diagram of a spatial target structure and a motion model used in a simulation experiment according to an embodiment of the present invention;
fig. 3(a) is a schematic diagram of a target feature structure of a first radar image extracted in a simulation experiment provided in an embodiment of the present invention;
fig. 3(b) is a schematic diagram of a target feature structure of a first optical image extracted in a simulation experiment provided by an embodiment of the present invention;
fig. 4 is a block diagram of a target attitude estimation system based on optical radar image fusion according to an embodiment of the present invention;
fig. 5 is a block diagram of a target attitude estimation apparatus based on optical radar image fusion according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention. The step numbers in the following embodiments are provided only for convenience of illustration, the order between the steps is not limited at all, and the execution order of each step in the embodiments can be adapted according to the understanding of those skilled in the art.
In the description of the present invention, "a plurality" means two or more. Where "first" and "second" are used to distinguish technical features, they are not to be understood as indicating or implying relative importance, the number of the indicated technical features, or the precedence of the indicated technical features. Furthermore, unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by those of ordinary skill in the art.
Referring to fig. 1, an embodiment of the present invention provides a target attitude estimation method based on optical radar image fusion, which specifically includes the following steps:
s101, acquiring a first echo signal and first optical observation data of a target area, and obtaining a first radar image and a first optical image which are time-synchronized according to the first echo signal and the first optical observation data.
Specifically, in the embodiment of the present invention, the ISAR radar system receives echo signals within a preset time period and the optical sensor receives image data within the same period. The optical and radar data are time-synchronized according to the target tracking data of the observation sensors, and range-Doppler processing is then performed on the echo signals to obtain a radar image of the target area that is time-synchronized with the optical observation image; the ISAR imaging geometry and the optical observation geometry can be determined at the same time. Step S101 specifically includes the following steps:
s1011, receiving a first echo signal in a preset time period through an ISAR radar system, and receiving first optical observation data in the preset time period through an optical sensor;
s1012, performing time synchronization processing on the first echo signal and the first optical observation data to obtain a second echo signal and second optical observation data;
s1013, obtaining a first optical image according to the second optical observation data, and performing range-Doppler processing on the second echo signal to obtain a first radar image;
s1014, determining a distance direction vector of the first radar image according to the first target tracking parameter recorded by the ISAR radar system, and determining a horizontal direction vector and a vertical direction vector of the first optical image according to the second target tracking parameter recorded by the optical sensor.
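As an illustration of the range-Doppler processing of step S1013, the imaging step can be sketched as follows. This is a minimal sketch that assumes the echo matrix is already dechirped and translationally compensated; the function name and arguments are hypothetical and not part of the embodiment.

```python
import numpy as np

def range_doppler_image(echo, n_range_fft=None, n_doppler_fft=None):
    """Form a range-Doppler image from motion-compensated ISAR echoes.

    echo: 2-D complex array, shape (slow time pulses, fast time samples),
          assumed dechirped and translationally compensated.
    Returns the magnitude image (Doppler bins x range bins).
    """
    # Range compression: FFT along fast time for each pulse.
    rng_profile = np.fft.fftshift(
        np.fft.fft(echo, n=n_range_fft, axis=1), axes=1)
    # Doppler processing: FFT along slow time for each range cell.
    rd = np.fft.fftshift(
        np.fft.fft(rng_profile, n=n_doppler_fft, axis=0), axes=0)
    return np.abs(rd)
```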
Specifically, the distance direction vector of the first radar image is determined according to the target tracking parameters recorded by the ISAR radar system as follows:
wherein the left-hand side denotes the distance direction vector of the first radar image, t_m is the slow time corresponding to azimuth-dimension signal sampling, the pitch angle θ(t_m) is the angle between the instantaneous radar line-of-sight direction vector and the XOY plane of the body coordinate system, the azimuth angle φ(t_m) is the angle between the projection of the instantaneous radar line-of-sight direction vector on the XOY plane and the Y axis, and t_0 is the central time of the coherent integration period of ISAR imaging. When t_m = t_0, the instantaneous radar line-of-sight direction coincides with the distance axis of the radar image.
Meanwhile, the horizontal direction vector and the vertical direction vector of the first optical image can be determined according to the target tracking parameters recorded by the optical sensor.
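For illustration, the instantaneous line-of-sight direction vector described above can be written out from the pitch and azimuth angles. Since the patent's own formula is not reproduced in the text, the component layout below is one common convention (azimuth measured from the Y axis within the body XOY plane) and is an assumption, as is the function name.

```python
import numpy as np

def radar_los_vector(theta, phi):
    """Instantaneous radar line-of-sight unit vector in body coordinates.

    theta: pitch angle between the LOS and the body XOY plane (rad).
    phi:   azimuth angle between the LOS projection on XOY and the
           Y axis (rad).
    """
    return np.array([
        np.cos(theta) * np.sin(phi),  # X: in-plane, off the Y axis
        np.cos(theta) * np.cos(phi),  # Y: in-plane, along the Y axis
        np.sin(theta),                # Z: out of the XOY plane
    ])
```

At t_m = t_0 this vector coincides with the distance axis of the radar image, consistent with the text above.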
S102, two target feature structures are extracted from the first radar image and the first optical image, and a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image are determined.
Specifically, two target feature structures are extracted from the first radar image and the first optical image respectively, and the projection lengths of each target feature structure along the distance and Doppler dimensions of the first radar image and along the horizontal and vertical directions of the first optical image are recorded.
S103, constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by utilizing a particle swarm algorithm to obtain instantaneous attitude parameters and size parameters of the target feature structure.
Specifically, based on the ISAR imaging geometry and the optical observation geometry determined above, a first optimization function is established by combining the projection lengths of the extracted target feature structures in the horizontal and vertical directions of the optical image with their projection lengths in the distance direction of the radar image, and the instantaneous attitude parameters and size parameters of the two target feature structures are solved by a particle swarm algorithm.
As a further optional implementation, the first optimization model includes a first objective function, and the first objective function is:
min over (α_i, β_i, L_i) of Σ_{i=1,2} [ (L_i |p̂_i · r̂_range| − r_i)² + (L_i |p̂_i · û| − u_i)² + (L_i |p̂_i · v̂| − v_i)² ], with p̂_i = [cos α_i sin β_i, cos α_i cos β_i, sin α_i]^T

where i ∈ {1, 2}; α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure; L_i denotes the size parameter of the i-th target feature structure; r̂_range denotes the distance direction vector of the first radar image; û denotes the horizontal direction vector of the first optical image; v̂ denotes the vertical direction vector of the first optical image; r_i denotes the first projection length of the i-th target feature structure; v_i denotes the third projection length of the i-th target feature structure; and u_i denotes the fourth projection length of the i-th target feature structure.
In particular, α_i is the angle between the i-th target feature structure and the XOY plane of the body coordinate system, β_i is the angle between the projection of the i-th target feature structure on the XOY plane and the Y axis, and L_i is the length of the i-th target feature structure in real space. It will be appreciated that once α_i, β_i and L_i are solved from the first optimization model, the instantaneous pose of the target feature structure can be determined.
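Under the angle conventions just given (α_i measured to the XOY plane, β_i from the Y axis), the forward projection model that the first optimization model fits can be sketched as follows; the helper names are illustrative, and the exact objective in the original appears only as an equation image:

```python
import numpy as np

def feature_direction(alpha, beta):
    """Unit direction of a linear feature: alpha is its angle to the body
    XOY plane, beta the angle between its XOY projection and the Y axis
    (the same convention as the line-of-sight angles)."""
    return np.array([np.cos(alpha) * np.sin(beta),
                     np.cos(alpha) * np.cos(beta),
                     np.sin(alpha)])

def projected_lengths(alpha, beta, L, r_hat, u_hat, v_hat):
    """Predicted projection lengths of a feature of true length L onto
    the radar range axis and the optical horizontal/vertical axes."""
    p = feature_direction(alpha, beta)
    return (L * abs(p @ r_hat),   # range direction (radar image)
            L * abs(p @ u_hat),   # horizontal direction (optical image)
            L * abs(p @ v_hat))   # vertical direction (optical image)
```

The first optimization model then seeks the (α_i, β_i, L_i) whose predicted projections best match the measured r_i, u_i, v_i.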
Further as an optional implementation manner, the step of solving the first optimization model by using a particle swarm algorithm to obtain an instantaneous attitude parameter and a size parameter of the target feature structure specifically includes:
setting the shortest moving distance, and randomly generating a first particle swarm in a feasible region of the first optimization model, wherein the feasible solution of the first particle swarm is a first individual position;
searching according to the first cost function to obtain the current optimal position of the particles and the optimal position of the particle swarm;
after adjusting the position of the particles in the first particle swarm, updating the current optimal position of the particles and the optimal position of the particle swarm until the maximum iteration step number or the minimum change criterion is met;
and determining the instantaneous attitude parameter and the size parameter of the target feature structure according to the current optimal first individual position.
Specifically, the steps of solving the first optimization model by using the particle swarm algorithm to obtain the instantaneous attitude parameter and the size parameter of the target feature structure are as follows:
S1031, setting the shortest moving distance, and randomly generating a first particle swarm in the feasible domain of the first optimization model, wherein each feasible solution is defined as a first individual position X_j = (α, β, L)^T;
S1032, searching the current particle optimal position Pbest and the particle swarm optimal position Gbest according to the following first cost function:
s1033, adjusting the position of the particle in the first particle swarm according to the following updating formula:
V_i(t+1) = A_5 V_i(t) + A_6 rand_1 (Pbest − X_i(t)) + A_7 rand_2 (Gbest − X_i(t))
X_i(t+1) = X_i(t) + V_i(t+1)
where V_i(t) and X_i(t) are the velocity and position of the i-th individual at the t-th iteration, A_5 is an inertia factor, A_6 and A_7 are weight factors for balancing individual experience and group experience, and the random parameters rand_1 and rand_2 obey a uniform distribution on [0, 1].
S1034, recalculating each individual's objective function value according to the first cost function, and updating the current particle optimal position and particle swarm optimal position. If the maximum number of iterations is reached or the minimum change criterion is met, iteration stops and the procedure jumps to step S1035; otherwise, it returns to step S1033. The minimum change criterion is met when, during an update, the change of the particle optimal positions and the particle swarm optimal position falls below the minimum movement threshold.
S1035, stopping iteration and outputting the current optimal first individual position.
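Steps S1031 to S1035 can be sketched as a generic particle-swarm minimiser. The first cost function itself appears only as a figure in the original, so a toy quadratic stands in for it here, and the numeric values chosen for A_5, A_6 and A_7 are conventional PSO choices, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(cost, lower, upper, n_particles=40, max_iter=300,
                 A5=0.7, A6=1.5, A7=1.5, min_change=1e-8, patience=20):
    """Particle-swarm minimiser mirroring steps S1031-S1035."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    # S1031: random first particle swarm inside the feasible domain
    X = rng.uniform(lower, upper, size=(n_particles, lower.size))
    V = np.zeros_like(X)
    # S1032: initial individual bests (Pbest) and swarm best (Gbest)
    pbest = X.copy()
    pbest_cost = np.array([cost(x) for x in X])
    g = int(np.argmin(pbest_cost))
    gbest, gbest_cost = pbest[g].copy(), float(pbest_cost[g])
    stall = 0
    for _ in range(max_iter):
        # S1033: velocity/position update with rand1, rand2 ~ U[0, 1]
        r1, r2 = rng.random((2, n_particles, 1))
        V = A5 * V + A6 * r1 * (pbest - X) + A7 * r2 * (gbest - X)
        X = np.clip(X + V, lower, upper)
        # S1034: re-evaluate costs and refresh Pbest / Gbest
        c = np.array([cost(x) for x in X])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = X[better], c[better]
        g = int(np.argmin(pbest_cost))
        improvement = gbest_cost - float(pbest_cost[g])
        gbest, gbest_cost = pbest[g].copy(), float(pbest_cost[g])
        # minimum-change criterion: stop after `patience` stagnant steps
        stall = stall + 1 if improvement < min_change else 0
        if stall >= patience:
            break
    return gbest, gbest_cost  # S1035: output the best individual position

# toy cost with a known minimiser, standing in for the first cost function
best, val = pso_minimize(lambda x: float(np.sum((x - [1.0, 2.0, 3.0]) ** 2)),
                         lower=[-5, -5, -5], upper=[5, 5, 5])
```

In the method itself, X would be the (α, β, L) triple of a feature structure and `cost` the first cost function.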
And S104, constructing a second optimization model according to the second projection length, the third projection length and the fourth projection length, and solving the second optimization model by using a global search algorithm to obtain the Doppler direction vector of the first radar image.
As a further optional implementation manner, the second optimization model includes a second objective function, and the second objective function is:
min over θ_optical of Σ_{i=1,2} ( |u_i sin θ_optical + v_i cos θ_optical| − d_i )²

where θ_optical denotes the angle between the Doppler direction of the first radar image and the vertical direction of the first optical image, with −π ≤ θ_optical ≤ π; d_i denotes the second projection length of the i-th target feature structure; v_i denotes the third projection length of the i-th target feature structure; u_i denotes the fourth projection length of the i-th target feature structure; and i ∈ {1, 2};
the Doppler direction vector of the first radar image is:

d̂_doppler = sin θ_optical · û + cos θ_optical · v̂

where d̂_doppler denotes the Doppler direction vector of the first radar image, û denotes the horizontal direction vector of the first optical image, and v̂ denotes the vertical direction vector of the first optical image.
Specifically, once the estimate θ_optical is solved by a global search algorithm, the Doppler direction vector of the first radar image can be determined.
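A minimal sketch of the global (grid) search for θ_optical follows. It assumes the second objective penalises the mismatch between each feature's Doppler projection d_i and the projection of its optical footprint (u_i, v_i) onto an axis at angle θ_optical from the vertical; this residual form is an assumption consistent with the listed variables, not the patent's exact formula:

```python
import numpy as np

def solve_theta_optical(d, u, v, n_grid=720001):
    """Exhaustive 1-D search over [-pi, pi]. Because of the absolute
    value, theta and theta +/- pi fit equally well, i.e. the Doppler
    axis is recovered up to sign."""
    thetas = np.linspace(-np.pi, np.pi, n_grid)
    J = np.zeros_like(thetas)
    for d_i, u_i, v_i in zip(d, u, v):
        J += (np.abs(u_i * np.sin(thetas) + v_i * np.cos(thetas)) - d_i) ** 2
    return thetas[np.argmin(J)]

def doppler_direction(theta_optical, u_hat, v_hat):
    # Doppler direction vector expressed in the optical image plane
    return np.sin(theta_optical) * u_hat + np.cos(theta_optical) * v_hat
```

Using two feature structures, as the method does, is what makes the angle identifiable up to the sign ambiguity.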
And S105, constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain radar equivalent rotation parameters, and calculating according to the radar equivalent rotation parameters to obtain target spin rotation parameters.
As a further optional implementation, the third optimization model includes a third objective function, and the third objective function is:
where i ∈ {1, 2}; α_i and β_i denote the instantaneous attitude parameters of the i-th target feature structure; L_i denotes the size parameter of the i-th target feature structure; d_i denotes the second projection length of the i-th target feature structure; ω_eff denotes the radar equivalent rotation parameter; and d̂_doppler denotes the Doppler direction vector of the first radar image.
Further as an optional implementation mode, the target spin rotation parameter is calculated according to the following formula:
ω̂_spin = ω_eff ê_eff − ω̂_LOS, with ê_eff = r̂_range × d̂_doppler

where ω̂_spin denotes the target spin rotation parameter, ω_eff denotes the radar equivalent rotation parameter, d̂_doppler denotes the Doppler direction vector of the first radar image, r̂_range denotes the distance direction vector of the first radar image, and ω̂_LOS denotes the radar line-of-sight rotation component caused by the target orbital motion.
Specifically, the estimate ω_eff is obtained by solving within the feasible domain of the third optimization model with a global search algorithm, after which the target spin rotation parameters can be determined. The radar line-of-sight rotation component caused by the target orbital motion can be obtained from the cross product of the instantaneous radar line-of-sight direction vector at the start time and the instantaneous radar line-of-sight direction vector at the stop time.
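A hedged sketch of this final step, assuming the spin rotation vector is the effective rotation (directed along the imaging-plane normal r̂ × d̂) minus the orbital line-of-sight rotation derived from the start/stop line-of-sight cross product; the decomposition and sign convention are my assumptions, not quoted from the original:

```python
import numpy as np

def spin_rotation_vector(omega_eff, r_hat, d_hat, r_start, r_stop, dT):
    """Assumed decomposition of the target spin rotation parameter:
    effective rotation estimated from the image, directed along the
    imaging-plane normal r_hat x d_hat, minus the line-of-sight rotation
    caused by orbital motion over the observation interval dT."""
    e_eff = np.cross(r_hat, d_hat)
    e_eff = e_eff / np.linalg.norm(e_eff)
    # orbital LOS rotation: axis from the cross product of the LOS at the
    # start and stop instants, magnitude from the swept angle over dT
    c = np.cross(r_start, r_stop)
    swept = np.arcsin(np.linalg.norm(c))
    omega_los = c / np.linalg.norm(c) * (swept / dT)
    return omega_eff * e_eff - omega_los
```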
According to the method, radar echo signals and optical observation data of a target area are first acquired, and a time-synchronized first radar image and first optical image are obtained. The first and second projection lengths of the target feature structures in the distance and Doppler directions are then extracted from the first radar image, and the third and fourth projection lengths in the vertical and horizontal directions are extracted from the first optical image. From these projection lengths, a first optimization model for the instantaneous attitude of the target feature structures, a second optimization model for the Doppler direction vector of the radar image, and a third optimization model for the target spin rotation parameters are constructed in sequence and solved with a particle swarm algorithm and a global search algorithm, thereby estimating the instantaneous state of the spinning space target. Compared with the prior art, the embodiment of the invention has the following advantages:
1) By exploiting the geometric complementarity between the ISAR image and the optical image of a space target under the same viewing angle, together with a distributed optimization estimation method, the invention solves the core problem that the ISAR imaging plane is indeterminate in the spinning state, and achieves accurate inversion of the instantaneous state of the spinning space target.
2) The invention adopts joint optimization estimation over different feature structures of the same target, which reduces the influence of structure extraction errors on the overall estimation result in practical applications, and enables accurate estimation of the instantaneous state of spinning space targets with complex geometric forms.
In order to further verify the accuracy of the embodiment of the present invention, the effect of the embodiment of the present invention is further described below with reference to simulation experiments.
The structure of the space target adopted in the simulation experiments of the embodiment of the invention is shown in figure 2, where θ denotes the radar observation pitch angle, i.e., the angle between the radar line of sight and the XOY plane of the body coordinate system, φ denotes the radar observation azimuth angle, i.e., the angle between the projection of the radar line of sight on the XOY plane and the Y axis, and LOS denotes the radar line-of-sight direction, with the corresponding radar line-of-sight direction vector. The solar wing boundaries of the target exhibit obvious linear structures. The main parameters of the simulation experiments of the embodiment of the invention are shown in table 1 below.
Image size: 512 × 512
Radar image range resolution: 0.05 m
Transmitted signal center frequency: 10 GHz
Pulse repetition frequency: 100 Hz
Optical image resolution: 0.05 m × 0.05 m
TABLE 1
Simulation experiment 1: the method of the invention is used to extract target feature structures from two synchronous observation images (a radar image and an optical image) of the space target in fig. 2; as shown in fig. 3(a) and fig. 3(b), the two target feature structures are denoted feature structure 1 and feature structure 2. Three optimization models are then constructed in sequence from the projection lengths of the target feature structures in the horizontal and vertical directions of the optical image, their projection lengths in the distance and Doppler directions of the radar image, and the optical and radar observation geometric parameters, and the state parameters of the target are solved; the results are shown in table 2 below.
TABLE 2
As can be seen from fig. 3(a) and fig. 3(b), stable extraction of the feature structures of the space target can be substantially ensured. As can be seen from table 2, the estimated feature orientations are substantially consistent with the real orientations, with an average error within 3 degrees; the estimated feature sizes are substantially consistent with the real sizes; the estimated target spin direction is substantially consistent with the real spin direction; and the estimated target spin speed is numerically close to the real spin speed, so the on-orbit state of the space target can be determined.
Simulation experiment 2: the method is tested with the target in a precession state, i.e., the target rotates at a constant speed around a fixed axis in the body coordinate system. The target rotation speed is set to 0.014 rad/s and the rotation speed of the rotation axis is set to 0 rad/s. Six instantaneous state parameters of the target are estimated from 60 seconds of continuous observation, and the target state estimation results are shown in table 3 below.
TABLE 3
As can be seen from table 3, in the on-orbit precession state of the target, the estimated feature orientations are substantially consistent with the real orientations, with an average error within 3 degrees; the estimated feature sizes are substantially consistent with the real sizes; the estimated target spin direction is substantially consistent with the real spin direction; and the estimated target spin speed is numerically close to the real spin speed, so the on-orbit state of the space target can be determined.
Simulation experiment 3: the method is tested with the target in a nutation state, i.e., in addition to the basic precession, the target rotates at a constant speed around a fixed axis. The target rotation speed is set to 0.012 rad/s and the rotation speed of the rotation axis is set to 0.004 rad/s. Six instantaneous state parameters of the target are estimated from 60 seconds of continuous observation, and the target state estimation results are shown in table 4 below.
TABLE 4
As can be seen from table 4, in the on-orbit nutation state of the target, the estimated feature orientations are substantially consistent with the real orientations, with an average error within 3 degrees; the estimated feature sizes are substantially consistent with the real sizes; the estimated target spin direction is substantially consistent with the real spin direction; and the estimated target spin speed is numerically close to the real spin speed, so the on-orbit state of the space target can be determined.
Simulation experiment 4: the method is tested with the target in a rolling (tumbling) state, i.e., the rotation axis and rotation speed of the target change continuously. Six instantaneous state parameters of the target are estimated from 60 seconds of continuous observation, and the target state estimation results are shown in table 5 below.
TABLE 5
As can be seen from table 5, in the rolling state of the target, the estimated feature orientations are substantially consistent with the real orientations, with an average error within 3 degrees; the estimated feature sizes are substantially consistent with the real sizes; the estimated target spin direction is substantially consistent with the real spin direction; and the estimated target spin speed is numerically close to the real spin speed, so the on-orbit state of the space target can be determined.
Referring to fig. 4, an embodiment of the present invention provides a target attitude estimation system based on optical radar image fusion, including:
the first radar image and first optical image acquisition module is used for acquiring a first echo signal and first optical observation data of a target area and obtaining a first radar image and a first optical image which are time-synchronized according to the first echo signal and the first optical observation data;
the projection length determining module is used for extracting two target feature structures from the first radar image and the first optical image, and determining a first projection length of the target feature structures in the distance direction of the first radar image, a second projection length of the target feature structures in the Doppler direction of the first radar image, a third projection length of the target feature structures in the vertical direction of the first optical image and a fourth projection length of the target feature structures in the horizontal direction of the first optical image;
the instantaneous attitude parameter and size parameter determining module is used for constructing a first optimization model according to the first projection length, the third projection length and the fourth projection length, and solving the first optimization model by utilizing a particle swarm algorithm to obtain instantaneous attitude parameters and size parameters of the target characteristic structure;
the Doppler direction vector determining module is used for constructing a second optimization model according to the second projection length, the third projection length and the fourth projection length, and solving the second optimization model by using a global search algorithm to obtain a Doppler direction vector of the first radar image;
and the target spin rotation parameter determining module is used for constructing a third optimization model according to the second projection length, the instantaneous attitude parameter, the size parameter and the Doppler direction vector, solving the third optimization model by using a global search algorithm to obtain radar equivalent rotation parameters, and calculating according to the radar equivalent rotation parameters to obtain the target spin rotation parameters.
The contents in the above method embodiments are all applicable to the present system embodiment, the functions specifically implemented by the present system embodiment are the same as those in the above method embodiment, and the beneficial effects achieved by the present system embodiment are also the same as those achieved by the above method embodiment.
Referring to fig. 5, an embodiment of the present invention provides a target attitude estimation apparatus based on optical radar image fusion, including:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one processor is enabled to implement the target attitude estimation method based on optical radar image fusion.
The contents in the above method embodiments are all applicable to the present apparatus embodiment, the functions specifically implemented by the present apparatus embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present apparatus embodiment are also the same as those achieved by the above method embodiments.
An embodiment of the present invention further provides a computer-readable storage medium, in which a processor-executable program is stored, and the processor-executable program is configured to execute the above-mentioned target attitude estimation method based on optical radar image fusion when executed by a processor.
The computer-readable storage medium of the embodiment of the invention can execute the target attitude estimation method based on the optical radar image fusion provided by the embodiment of the invention, can execute any combination implementation steps of the embodiment of the method, and has corresponding functions and beneficial effects of the method.
The embodiment of the invention also discloses a computer program product or a computer program, which comprises computer instructions, and the computer instructions are stored in a computer readable storage medium. The computer instructions may be read by a processor of a computer device from a computer-readable storage medium, and executed by the processor to cause the computer device to perform the method illustrated in fig. 1.
In alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flow charts of the present invention are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed and in which sub-operations described as part of larger operations are performed independently.
Furthermore, although the present invention is described in the context of functional modules, it should be understood that, unless otherwise stated to the contrary, one or more of the above-described functions and/or features may be integrated in a single physical device and/or software module, or one or more of the functions and/or features may be implemented in a separate physical device or software module. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an understanding of the present invention. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be understood within the ordinary skill of an engineer, given the nature, function, and internal relationship of the modules. Accordingly, those skilled in the art can, using ordinary skill, practice the invention as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative of and not intended to limit the scope of the invention, which is defined by the appended claims and their full scope of equivalents.
The above functions, if implemented in the form of software functional units and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Further, the computer readable medium could even be paper or another suitable medium upon which the above described program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the foregoing description of the specification, reference to the description of "one embodiment/example," "another embodiment/example," or "certain embodiments/examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.