Unmanned aerial vehicle three-dimensional map modeling method
1. An unmanned aerial vehicle three-dimensional map modeling method is characterized by comprising the following steps:
step one, a data acquisition module senses and scans the surrounding environment by using sensing equipment carried on the unmanned aerial vehicle to acquire the corresponding environment information; the data processing module corrects the related data acquired by the sensing equipment;
step two, the image processing module carries out denoising and enhancement processing on the acquired image data; the central control module controls the image feature extraction module to extract features of the processed image by using a single chip microcomputer or a controller, and determines feature points of a scanning environment;
step three, the three-dimensional map generation module constructs an initial three-dimensional map based on the acquired environmental information; the three-dimensional environment construction module effectively extracts, stores, edits and memorizes the acquired three-dimensional map image characteristics and constructs a reasonable and accurately represented spatial three-dimensional environment;
step four, the positioning module calculates the current position and attitude of the unmanned aerial vehicle by using the extracted environmental feature points, the navigation positioning data, the inertial measurement data and the motion sensing data; the three-dimensional map construction module builds a three-dimensional map based on the generated initial three-dimensional map and the spatial three-dimensional environment;
step five, the track marking module marks the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map; the three-dimensional map updating module judges whether the surrounding environment encountered during task execution is consistent with the constructed three-dimensional map model and the spatial three-dimensional environment; if not, the environment information continues to be acquired and the three-dimensional map is updated;
and step six, the display module displays the real-time three-dimensional map marked with the position and attitude data of the unmanned aerial vehicle by using display equipment.
2. The three-dimensional map modeling method for unmanned aerial vehicle as claimed in claim 1, wherein in step one, the data processing module correcting the related data acquired by the sensing device comprises:
(1) acquiring a test data sequence corresponding to each sensor of the sensing equipment and a preset test time sequence;
(2) determining calibration data of each sensor according to the test data sequence corresponding to each sensor; acquiring first data to be processed from at least one sensor of a sensing device;
(3) correcting the first data according to prestored calibration data corresponding to each sensor to generate second data; and the second data is the corrected data.
3. The three-dimensional map modeling method for unmanned aerial vehicle as claimed in claim 2, wherein in step (2), said determining calibration data for each sensor according to the test data sequence corresponding to each sensor comprises:
acquiring an average value of the test data sequence corresponding to each sensor, and determining calibration data corresponding to each sensor; or filtering the acquired test data sequence by using a preset data range, reserving the test data in the data range, carrying out averaging processing on the reserved test data, and determining the calibration data corresponding to each sensor.
4. The three-dimensional map modeling method for unmanned aerial vehicle as claimed in claim 3, wherein in step (3), said generating second data by correcting said first data according to pre-stored calibration data corresponding to each of said sensors comprises:
comparing the first data with the preset reference data; if the first data is greater than or equal to the reference data, subtracting the calibration data from the first data to generate the second data; and if the first data is smaller than the reference data, adding the calibration data to the first data to generate the second data.
5. The three-dimensional map modeling method for unmanned aerial vehicle as claimed in claim 1, wherein in step two, the image processing module denoising and enhancing the acquired image data comprises:
firstly, performing wavelet decomposition on an acquired image to obtain a low-frequency image and three high-frequency images;
secondly, carrying out self-adaptive Gamma correction on the low-frequency image to obtain a low-frequency image with enhanced contrast;
then, denoising the three high-frequency images respectively to obtain denoised high-frequency images;
and finally, performing wavelet inverse transformation on the obtained contrast-enhanced low-frequency image and the obtained de-noised high-frequency image to obtain an enhanced image.
6. The three-dimensional map modeling method for unmanned aerial vehicle as claimed in claim 1, wherein in step three, the three-dimensional map generation module constructing an initial three-dimensional map based on the acquired environmental information comprises:
1) acquiring two-dimensional vector map data, unmanned aerial vehicle position information and acquired environment information of the same area;
2) rendering the acquired two-dimensional vector map data to generate a two-dimensional map texture; triangulating the area according to the acquired unmanned aerial vehicle position information and the acquired related environment information to generate a three-dimensional terrain;
3) mapping the two-dimensional map texture onto the three-dimensional terrain to generate a textured three-dimensional terrain;
4) and rendering the textured three-dimensional terrain to obtain the three-dimensional map corresponding to the area.
7. A three-dimensional map modeling system for unmanned aerial vehicle implementing the three-dimensional map modeling method for unmanned aerial vehicle according to any one of claims 1 to 6, the three-dimensional map modeling system for unmanned aerial vehicle comprising:
the system comprises a data acquisition module, a data processing module, an image processing module, a central control module, an image feature extraction module, a three-dimensional map generation module, a three-dimensional environment construction module, a positioning module, a three-dimensional map construction module, a track marking module, a three-dimensional map updating module and a display module;
the data acquisition module is connected with the central control module and used for sensing and scanning the surrounding environment by using sensing equipment carried on the unmanned aerial vehicle to acquire corresponding environment information;
the data processing module is connected with the central control module and used for correcting the related data acquired by the sensing equipment;
the image processing module is connected with the central control module and is used for carrying out denoising and enhancement processing on the acquired image data;
the central control module is connected with the data acquisition module, the data processing module, the image feature extraction module, the three-dimensional map generation module, the three-dimensional environment construction module, the positioning module, the three-dimensional map construction module, the track marking module, the three-dimensional map updating module and the display module; the system is used for controlling each module to normally operate by utilizing a single chip microcomputer or a controller;
the image feature extraction module is connected with the central control module and used for extracting features of the processed image and determining feature points of a scanning environment;
the three-dimensional map generation module is connected with the central control module and used for constructing an initial three-dimensional map based on the acquired environmental information;
the three-dimensional environment construction module is connected with the central control module and is used for effectively extracting, storing, editing and memorizing the acquired three-dimensional map image characteristics and constructing a reasonable and accurately represented spatial three-dimensional environment;
the positioning module is connected with the central control module and used for calculating the current position and the attitude of the unmanned aerial vehicle by utilizing the extracted environmental characteristic points, the navigation positioning data, the inertial measurement data and the motion sensing data;
the three-dimensional map construction module is connected with the central control module and is used for building a three-dimensional map based on the generated initial three-dimensional map and the spatial three-dimensional environment;
the track marking module is connected with the central control module and used for marking the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map;
the three-dimensional map updating module is connected with the central control module and is used for judging whether the surrounding environment encountered during task execution is consistent with the constructed three-dimensional map model and the spatial three-dimensional environment; if not, the environment information continues to be acquired and the three-dimensional map is updated;
and the display module is connected with the central control module and used for displaying the three-dimensional map marked with the position and the attitude data of the unmanned aerial vehicle by using the display equipment.
8. The three-dimensional map modeling system for unmanned aerial vehicle as claimed in claim 7, wherein the sensing device comprises: camera devices, satellite navigation positioning devices, motion sensors, inertial measurement unit sensors, radar, infrared or optical flow sensors, and other sensors.
9. A computer program product stored on a computer-readable medium, comprising a computer-readable program which, when executed on an electronic device, provides a user input interface to implement the three-dimensional map modeling method for unmanned aerial vehicle according to any one of claims 1 to 6.
10. A computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform the three-dimensional map modeling method for unmanned aerial vehicle according to any one of claims 1 to 6.
Background
At present, the construction of urban three-dimensional map models has long been a research hotspot of the international remote sensing and geographic information system communities and related disciplines. In recent years, with the rapid development and growing popularity of smart cities, the establishment of three-dimensional city landscapes with real textures has become particularly important. The existing urban three-dimensional map modeling technologies mainly fall into the following four types:
firstly, software-based modeling builds models from large-scale topographic data using three-dimensional production software (such as 3ds Max); it can vividly represent the fine structures and material characteristics of a city, but suffers from high production cost, long cycles and low efficiency;
secondly, three-dimensional modeling based on traditional photogrammetry acquires up-to-date two-dimensional or three-dimensional vector data of a Digital Line Graphic (DLG) by aerial photogrammetry and then models it with software; this approach offers a high degree of automation, fast modeling and easy real-time updating, but the texture information of building side surfaces has to be acquired and constructed by other means;
thirdly, laser scanning-based modeling can quickly establish a three-dimensional model of an object from laser point clouds, but the texture information still has to be obtained by other means and attached manually;
fourthly, oblique photogrammetry applied to urban three-dimensional modeling saves the time spent on detailed model analysis, simplifies texture acquisition and processing, improves working efficiency and reduces cost, providing a new way to quickly acquire real three-dimensional models of large urban areas. However, because an oblique photogrammetry system is bulky and heavy, data acquisition requires a small airplane, a powered delta-wing aircraft or even a transport aircraft as the platform, and the equipment is expensive, so the data production cost is extremely high, which greatly limits the wide application of this technology.
To varying degrees, these methods share the defect that the established three-dimensional map is not faithful to the real environment and is not comprehensive, detailed or accurate, which restricts the development and popularization of urban three-dimensional modeling.
Through the above analysis, the problems and defects of the prior art are as follows:
(1) the existing three-dimensional map modeling method has high production cost, long period and low efficiency;
(2) the texture information of the building side surfaces of the model has to be acquired by other means and attached manually, and the data production and acquisition cost is extremely high;
(3) the constructed model is not faithful to the environment and is not comprehensive, detailed or accurate.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a three-dimensional map modeling method for an unmanned aerial vehicle.
The invention is realized in such a way that an unmanned aerial vehicle three-dimensional map modeling method comprises the following steps:
step one, a data acquisition module senses and scans the surrounding environment by using sensing equipment carried on the unmanned aerial vehicle to acquire the corresponding environment information; the data processing module corrects the related data acquired by the sensing equipment;
step two, the image processing module carries out denoising and enhancement processing on the acquired image data; the central control module controls the image feature extraction module to extract features of the processed image by using a single chip microcomputer or a controller, and determines feature points of a scanning environment;
step three, the three-dimensional map generation module constructs an initial three-dimensional map based on the acquired environmental information; the three-dimensional environment construction module effectively extracts, stores, edits and memorizes the acquired three-dimensional map image characteristics and constructs a reasonable and accurately represented spatial three-dimensional environment;
step four, the positioning module calculates the current position and attitude of the unmanned aerial vehicle by using the extracted environmental feature points, the navigation positioning data, the inertial measurement data and the motion sensing data; the three-dimensional map construction module builds a three-dimensional map based on the generated initial three-dimensional map and the spatial three-dimensional environment;
step five, the track marking module marks the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map; the three-dimensional map updating module judges whether the surrounding environment encountered during task execution is consistent with the constructed three-dimensional map model and the spatial three-dimensional environment; if not, the environment information continues to be acquired and the three-dimensional map is updated;
and step six, the display module displays the real-time three-dimensional map marked with the position and attitude data of the unmanned aerial vehicle by using display equipment.
Further, in step one, the data processing module correcting the related data acquired by the sensing equipment comprises:
(1) acquiring a test data sequence corresponding to each sensor of the sensing equipment and a preset test time sequence;
(2) determining calibration data of each sensor according to the test data sequence corresponding to each sensor; acquiring first data to be processed from at least one sensor of a sensing device;
(3) correcting the first data according to prestored calibration data corresponding to each sensor to generate second data; and the second data is the corrected data.
Further, in the step (2), the determining calibration data of each sensor according to the test data sequence corresponding to each sensor includes:
acquiring an average value of the test data sequence corresponding to each sensor, and determining calibration data corresponding to each sensor; or filtering the acquired test data sequence by using a preset data range, reserving the test data in the data range, carrying out averaging processing on the reserved test data, and determining the calibration data corresponding to each sensor.
Further, in the step (3), the generating second data by correcting the first data according to the pre-stored calibration data corresponding to each sensor includes:
comparing the first data with the preset reference data; if the first data is greater than or equal to the reference data, subtracting the calibration data from the first data to generate the second data; and if the first data is smaller than the reference data, adding the calibration data to the first data to generate the second data.
Further, in the second step, the image processing module denoising and enhancing the acquired image data includes:
firstly, performing wavelet decomposition on an acquired image to obtain a low-frequency image and three high-frequency images;
secondly, carrying out self-adaptive Gamma correction on the low-frequency image to obtain a low-frequency image with enhanced contrast;
then, denoising the three high-frequency images respectively to obtain denoised high-frequency images;
and finally, performing wavelet inverse transformation on the obtained contrast-enhanced low-frequency image and the obtained de-noised high-frequency image to obtain an enhanced image.
Further, in the third step, the constructing an initial three-dimensional map based on the acquired environmental information by the three-dimensional map generating module includes:
1) acquiring two-dimensional vector map data, unmanned aerial vehicle position information and acquired environment information of the same area;
2) rendering the acquired two-dimensional vector map data to generate a two-dimensional map texture; triangulating the area according to the acquired unmanned aerial vehicle position information and the acquired related environment information to generate a three-dimensional terrain;
3) mapping the two-dimensional map texture onto the three-dimensional terrain to generate a textured three-dimensional terrain;
4) and rendering the textured three-dimensional terrain to obtain the three-dimensional map corresponding to the area.
Another object of the present invention is to provide a three-dimensional map modeling system for an unmanned aerial vehicle, which implements the three-dimensional map modeling method for an unmanned aerial vehicle, the three-dimensional map modeling system for an unmanned aerial vehicle including:
the system comprises a data acquisition module, a data processing module, an image processing module, a central control module, an image feature extraction module, a three-dimensional map generation module, a three-dimensional environment construction module, a positioning module, a three-dimensional map construction module, a track marking module, a three-dimensional map updating module and a display module;
the data acquisition module is connected with the central control module and used for sensing and scanning the surrounding environment by using sensing equipment carried on the unmanned aerial vehicle to acquire corresponding environment information;
the data processing module is connected with the central control module and used for correcting the related data acquired by the sensing equipment;
the image processing module is connected with the central control module and is used for carrying out denoising and enhancement processing on the acquired image data;
the central control module is connected with the data acquisition module, the data processing module, the image feature extraction module, the three-dimensional map generation module, the three-dimensional environment construction module, the positioning module, the three-dimensional map construction module, the track marking module, the three-dimensional map updating module and the display module; the system is used for controlling each module to normally operate by utilizing a single chip microcomputer or a controller;
the image feature extraction module is connected with the central control module and used for extracting features of the processed image and determining feature points of a scanning environment;
the three-dimensional map generation module is connected with the central control module and used for constructing an initial three-dimensional map based on the acquired environmental information;
the three-dimensional environment construction module is connected with the central control module and is used for effectively extracting, storing, editing and memorizing the acquired three-dimensional map image characteristics and constructing a reasonable and accurately represented spatial three-dimensional environment;
the positioning module is connected with the central control module and used for calculating the current position and the attitude of the unmanned aerial vehicle by utilizing the extracted environmental characteristic points, the navigation positioning data, the inertial measurement data and the motion sensing data;
the three-dimensional map construction module is connected with the central control module and is used for building a three-dimensional map based on the generated initial three-dimensional map and the spatial three-dimensional environment;
the track marking module is connected with the central control module and used for marking the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map;
the three-dimensional map updating module is connected with the central control module and is used for judging whether the surrounding environment encountered during task execution is consistent with the constructed three-dimensional map model and the spatial three-dimensional environment; if not, the environment information continues to be acquired and the three-dimensional map is updated;
and the display module is connected with the central control module and used for displaying the three-dimensional map marked with the position and the attitude data of the unmanned aerial vehicle by using the display equipment.
Further, the sensing device includes: camera devices, satellite navigation positioning devices, motion sensors, inertial measurement unit sensors, radar, infrared or optical flow sensors, and other sensors.
It is another object of the present invention to provide a computer program product stored on a computer-readable medium, comprising a computer-readable program which, when executed on an electronic device, provides a user input interface to implement the unmanned aerial vehicle three-dimensional map modeling method.
Another object of the present invention is to provide a computer-readable storage medium storing instructions which, when executed on a computer, cause the computer to perform the unmanned aerial vehicle three-dimensional map modeling method.
In combination with all of the above technical solutions, the invention has the following advantages and positive effects: the method can rapidly process the acquired data and, by combining it with the two-dimensional data, improves the speed and efficiency of generating the three-dimensional map; various kinds of information are acquired by the multiple devices carried on the unmanned aerial vehicle and the acquired information is corrected, which guarantees the accuracy of the data and therefore improves the accuracy of the three-dimensional map; and because the three-dimensional map is constructed jointly from multiple sources of information, it can be built comprehensively, in detail and accurately.
The three-dimensional model of the invention can be continuously updated, yielding a three-dimensional map that is increasingly faithful to the environment, comprehensive, detailed and accurate, and that can be used by all unmanned aerial vehicles.
Each time a task is executed, the three-dimensional model established by the invention can update the previous version of the three-dimensional map, and the resulting latest three-dimensional map is closer to the environment and more comprehensive, detailed and accurate.
Compared with the prior art, in which three-dimensional models are established by software-based modeling, traditional photogrammetry-based three-dimensional modeling, laser scanning-based modeling from laser point clouds, or unmanned aerial vehicle oblique photogrammetry, the invention adopts a continuous updating method, so that during use the three-dimensional map becomes ever closer to the environment and ever more comprehensive, detailed and accurate.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained from the drawings without creative efforts.
Fig. 1 is a flowchart of a three-dimensional map modeling method for an unmanned aerial vehicle according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating that a data processing module corrects related data acquired by a sensing device according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating that an image processing module provided in the embodiment of the present invention performs denoising and enhancement processing on acquired image data.
Fig. 4 is a flowchart of constructing an initial three-dimensional map based on the acquired environmental information by the three-dimensional map generation module according to the embodiment of the present invention.
Fig. 5 is a schematic structural diagram of a three-dimensional map modeling system of an unmanned aerial vehicle according to an embodiment of the present invention;
in the figure: 1. a data acquisition module; 2. a data processing module; 3. an image processing module; 4. a central control module; 5. an image feature extraction module; 6. a three-dimensional map generation module; 7. a three-dimensional environment construction module; 8. a positioning module; 9. a three-dimensional map construction module; 10. a track marking module; 11. a three-dimensional map updating module; 12. and a display module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the problems in the prior art, the present invention provides a three-dimensional map modeling method for an unmanned aerial vehicle, which is described in detail below with reference to the accompanying drawings.
As shown in fig. 1, the method for modeling the three-dimensional map of the unmanned aerial vehicle provided by the embodiment of the invention comprises the following steps:
s101, sensing and scanning the surrounding environment by a data acquisition module through sensing equipment carried on an unmanned aerial vehicle to acquire corresponding environment information; the data processing module corrects the related data acquired by the sensing equipment;
s102, the image processing module carries out denoising and enhancement processing on the acquired image data; the central control module controls the image feature extraction module to extract features of the processed image by using a single chip microcomputer or a controller, and determines feature points of a scanning environment;
s103, the three-dimensional map generation module constructs an initial three-dimensional map based on the acquired environmental information; the three-dimensional environment construction module effectively extracts, stores, edits and memorizes the acquired three-dimensional map image characteristics and constructs a reasonable and accurately represented spatial three-dimensional environment;
s104, calculating the current position and the attitude of the unmanned aerial vehicle by the positioning module by using the extracted environmental characteristic points, the navigation positioning data, the inertial measurement data and the motion sensing data; the three-dimensional map building module is used for building a three-dimensional map based on the generated initial three-dimensional map and a space three-dimensional environment;
s105, the track marking module marks the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map; the three-dimensional map updating module judges whether the surrounding environment in the execution task is consistent with the constructed three-dimensional map model and the space three-dimensional environment; if not, continuously acquiring the environmental information, and updating the three-dimensional map;
and S106, the display module displays the real-time three-dimensional map marked with the position and attitude data of the unmanned aerial vehicle by using the display equipment.
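The specification does not fix how the measurements listed in S104 are fused into a position and attitude estimate. The following Python sketch shows one common possibility, a complementary filter over a simplified planar state; the function name, state variables and blend weights are illustrative assumptions and are not part of the claimed method.

import numpy as np

def fuse_pose(prev_pos, prev_vel, prev_yaw, accel, gyro_z, gnss_pos, yaw_obs,
              dt, w_pos=0.98, w_yaw=0.95):
    """Return a fused (position, velocity, yaw) estimate for one time step."""
    # Dead-reckoned prediction from the inertial / motion-sensing data.
    vel = prev_vel + accel * dt
    pred_pos = prev_pos + vel * dt
    pred_yaw = prev_yaw + gyro_z * dt
    # Complementary blend with the absolute observations (GNSS position and a
    # yaw observation, e.g. derived from matched environment feature points).
    pos = w_pos * pred_pos + (1.0 - w_pos) * gnss_pos
    yaw = w_yaw * pred_yaw + (1.0 - w_yaw) * yaw_obs
    return pos, vel, yaw

# Illustrative single update with made-up values.
pos, vel, yaw = fuse_pose(prev_pos=np.array([0.0, 0.0]),
                          prev_vel=np.array([1.0, 0.0]), prev_yaw=0.10,
                          accel=np.array([0.05, 0.0]), gyro_z=0.01,
                          gnss_pos=np.array([0.11, 0.0]), yaw_obs=0.12, dt=0.1)

In practice a Kalman-type filter over the full six-degree-of-freedom state would be a more usual choice; the complementary filter is shown only because it makes the blending of dead-reckoned and absolute measurements explicit.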
As shown in fig. 2, in step S101, the data processing module according to the embodiment of the present invention correcting the related data acquired by the sensing equipment comprises:
s201, acquiring a test data sequence corresponding to each sensor of the sensing equipment and a preset test time sequence;
s202, determining calibration data of each sensor according to the test data sequence corresponding to each sensor; acquiring first data to be processed from at least one sensor of a sensing device;
s203, correcting the first data according to pre-stored calibration data corresponding to each sensor to generate second data; and the second data is the corrected data.
In step S202, determining calibration data of each sensor according to the test data sequence corresponding to each sensor provided in the embodiment of the present invention includes:
acquiring an average value of the test data sequence corresponding to each sensor, and determining calibration data corresponding to each sensor; or filtering the acquired test data sequence by using a preset data range, reserving the test data in the data range, carrying out averaging processing on the reserved test data, and determining the calibration data corresponding to each sensor.
In step S203, the correcting of the first data according to the pre-stored calibration data corresponding to each sensor to generate the second data, provided by the embodiment of the present invention, comprises:
comparing the first data with the preset reference data; if the first data is greater than or equal to the reference data, subtracting the calibration data from the first data to generate the second data; and if the first data is smaller than the reference data, adding the calibration data to the first data to generate the second data.
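As a concrete illustration of steps S201 to S203, the following Python sketch derives calibration data from a test data sequence, either by plain averaging or by range-filtered averaging (the two options described above), and then corrects a raw measurement against a reference value. The sensor readings, data range and reference value are made-up examples, not values from the specification.

import numpy as np

def compute_calibration(test_sequence, data_range=None):
    """Determine calibration data from a sensor's test data sequence.
    Either average the whole sequence, or first discard samples outside a
    preset range and average only the retained test data."""
    samples = np.asarray(test_sequence, dtype=float)
    if data_range is not None:
        lo, hi = data_range
        samples = samples[(samples >= lo) & (samples <= hi)]
    return samples.mean()

def correct_measurement(first_data, calibration, reference):
    """Generate the corrected (second) data from the first data: at or above
    the reference the calibration is subtracted, below it the calibration is
    added, as described above."""
    if first_data >= reference:
        return first_data - calibration
    return first_data + calibration

# Illustrative usage with made-up readings; the preset range drops the outlier.
test_seq = [101.2, 101.4, 150.0, 101.3]
cal = compute_calibration(test_seq, data_range=(100.0, 102.0))
second_data = correct_measurement(101.8, cal, reference=50.0)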
As shown in fig. 3, in step S102, the denoising and enhancing processing performed on the acquired image data by the image processing module according to the embodiment of the present invention includes:
s301, performing wavelet decomposition on the acquired image to obtain a low-frequency image and three high-frequency images;
s302, carrying out self-adaptive Gamma correction on the low-frequency image to obtain a low-frequency image with enhanced contrast;
s303, denoising the three high-frequency images respectively to obtain denoised high-frequency images;
s304, performing wavelet inverse transformation on the obtained contrast-enhanced low-frequency image and the obtained de-noised high-frequency image to obtain an enhanced image.
As shown in fig. 4, in step S103, the constructing an initial three-dimensional map by the three-dimensional map generating module according to the embodiment of the present invention based on the acquired environment information includes:
s401, acquiring two-dimensional vector map data, unmanned aerial vehicle position information and acquired environment information of the same area;
s402, rendering according to the acquired two-dimensional vector map data to generate two-dimensional map textures; triangularizing the area according to the acquired unmanned aerial vehicle position information and the acquired related environment information to generate a three-dimensional terrain;
s403, mapping the two-dimensional map texture to the three-dimensional terrain to generate a textured three-dimensional terrain;
s404, rendering the three-dimensional terrain with the texture to obtain a three-dimensional map corresponding to the area to be generated.
As shown in fig. 5, the three-dimensional map modeling system for an unmanned aerial vehicle provided by the embodiment of the present invention comprises:
the data acquisition module 1 is connected with the central control module 4 and used for sensing and scanning the surrounding environment by using sensing equipment carried on the unmanned aerial vehicle to acquire corresponding environmental information;
the data processing module 2 is connected with the central control module 4 and used for correcting the related data acquired by the sensing equipment;
the image processing module 3 is connected with the central control module 4 and is used for carrying out denoising and enhancement processing on the acquired image data;
the central control module 4 is connected with the data acquisition module 1, the data processing module 2, the image processing module 3, the image feature extraction module 5, the three-dimensional map generation module 6, the three-dimensional environment construction module 7, the positioning module 8, the three-dimensional map construction module 9, the track marking module 10, the three-dimensional map updating module 11 and the display module 12, and is used for controlling each module to operate normally by using a single chip microcomputer or a controller;
the image feature extraction module 5 is connected with the central control module 4 and is used for extracting features of the processed image and determining feature points of a scanning environment;
the three-dimensional map generation module 6 is connected with the central control module 4 and used for constructing an initial three-dimensional map based on the acquired environmental information;
the three-dimensional environment construction module 7 is connected with the central control module 4 and is used for effectively extracting, storing, editing and memorizing the acquired three-dimensional map image characteristics and constructing a reasonable and accurately represented spatial three-dimensional environment;
the positioning module 8 is connected with the central control module 4 and used for calculating the current position and the attitude of the unmanned aerial vehicle by using the extracted environmental characteristic points, the navigation positioning data, the inertial measurement data and the motion sensing data;
the three-dimensional map construction module 9 is connected with the central control module 4 and is used for building a three-dimensional map based on the generated initial three-dimensional map and the spatial three-dimensional environment;
the track marking module 10 is connected with the central control module 4 and is used for marking the position and attitude data of the unmanned aerial vehicle in the constructed three-dimensional map;
the three-dimensional map updating module 11 is connected with the central control module 4 and is used for judging whether the surrounding environment encountered during task execution is consistent with the constructed three-dimensional map model and the spatial three-dimensional environment; if not, the environment information continues to be acquired and the three-dimensional map is updated (one possible consistency check is sketched after this module list);
and the display module 12 is connected with the central control module 4 and is used for displaying the three-dimensional map marked with the position and the attitude data of the unmanned aerial vehicle by using display equipment.
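The specification does not state how the three-dimensional map updating module measures consistency between the scanned environment and the stored map. One simple possibility, sketched below in Python purely as an assumption, is to compare newly acquired environment points against the map points with a nearest-neighbour search and to trigger an update when too many of them lie far from the map; the distance threshold and mismatch ratio are made-up parameters.

import numpy as np
from scipy.spatial import cKDTree

def map_needs_update(map_points, new_points, dist_thresh=0.5, ratio=0.2):
    """Return True if the scanned environment no longer matches the map.
    map_points, new_points: (N, 3) and (M, 3) arrays of x, y, z coordinates."""
    tree = cKDTree(np.asarray(map_points, dtype=float))
    d, _ = tree.query(np.asarray(new_points, dtype=float))
    mismatch = np.mean(d > dist_thresh)   # fraction of points far from the map
    return mismatch > ratio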
The sensing device provided by the embodiment of the invention comprises: camera devices, satellite navigation positioning devices, motion sensors, inertial measurement unit sensors, radar, infrared or optical flow sensors, and other sensors.
In the description of the present invention, "a plurality" means two or more unless otherwise specified; the terms "upper", "lower", "left", "right", "inner", "outer", "front", "rear", "head", "tail", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are only for convenience in describing and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, should not be construed as limiting the invention. Furthermore, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The above description is only a specific embodiment of the present invention and is not intended to limit the scope of protection of the present invention, which is defined by the appended claims; any modifications, equivalents and improvements made within the spirit and principle of the present invention shall fall within the scope of protection of the present invention.