Unmanned aerial vehicle system for detecting fan blade and control method
1. An unmanned aerial vehicle system for fan blade detection, characterized in that: the system comprises a control unit, a communication unit, a first unmanned aerial vehicle and a second unmanned aerial vehicle, wherein the first unmanned aerial vehicle is used for sampling thermal imaging images of a detected fan blade, and the second unmanned aerial vehicle is used for sampling high-definition images of the detected fan blade;
the first unmanned aerial vehicle comprises a controller and a thermal imaging camera in control connection with the controller; the controller receives an unmanned aerial vehicle control signal and a thermal imaging sampling signal from the control unit through the communication unit, and controls the first unmanned aerial vehicle according to the unmanned aerial vehicle control signal and the thermal imaging sampling signal to fly to an appointed position and perform thermal imaging image sampling of the detected fan blade;
the second unmanned aerial vehicle comprises a controller and a high-definition imaging camera which is in control connection with the controller, the controller receives the unmanned aerial vehicle control signal and the high-definition imaging sampling signal of the control unit through the communication unit, and controls the second unmanned aerial vehicle to fly to the designated position according to the unmanned aerial vehicle control signal and the high-definition imaging sampling signal so as to perform high-definition image sampling on the detected fan blade;
the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, identifies the defect position of the fan blade, and sends a control instruction to the controller of the second unmanned aerial vehicle to control the second unmanned aerial vehicle to fly to the designated position and perform high-definition image sampling of the defect position of the detected fan blade; the control unit further processes the high-definition image and marks the fan blade defect position identified from the thermal imaging image on the high-definition image.
2. The drone system for fan blade detection of claim 1, wherein: the first unmanned aerial vehicle flies and shoots along a direction parallel to the length of the fan blade, and keeps the distance between the thermal imaging camera and the fan blade at 1 to 3 meters.
3. The drone system for fan blade detection of claim 1, wherein: the first unmanned aerial vehicle and the second unmanned aerial vehicle both further comprise RTK positioning modules, and the position information of the first unmanned aerial vehicle and the position information of the second unmanned aerial vehicle can be obtained through the RTK positioning modules.
4. The drone system for fan blade detection of claim 1, wherein: the high-definition imaging camera is a high-definition camera or a dual-light camera mounted on a gimbal.
5. An unmanned aerial vehicle control method for detecting fan blades is characterized by comprising the following steps:
step 1, a control unit sends a control instruction to a controller of a first unmanned aerial vehicle through a communication unit to control the first unmanned aerial vehicle to fly to an appointed height position; a thermal imaging sampling signal is then sent to the controller of the first unmanned aerial vehicle through the communication unit, a thermal imaging camera of the first unmanned aerial vehicle is aimed at a detected fan blade to carry out thermal imaging image sampling, and the collected thermal imaging image of the fan blade is sent to the control unit through the controller;
step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, identifies the defect position of the fan blade and generates a control instruction of a second unmanned aerial vehicle according to the defect position of the fan blade;
step 3, the control unit sends the control instruction generated in the step 2 to a controller of a second unmanned aerial vehicle through a communication unit to control the second unmanned aerial vehicle to fly to an appointed position, then a high-definition imaging sampling signal is sent to the controller of the second unmanned aerial vehicle through the communication unit, a high-definition imaging camera of the second unmanned aerial vehicle aims at the defect position of the detected fan blade to perform high-definition image sampling, and the acquired high-definition imaging image of the fan blade is sent to the control unit through the controller;
step 4, processing, by the control unit, the high-definition image acquired in step 3, and marking the fan blade defect position identified from the thermal imaging image in step 2 on the high-definition image of the fan blade.
6. The drone controlling method for fan blade detection according to claim 5, characterized in that: in step 1, when the first unmanned aerial vehicle samples the thermal imaging image of the detected fan blade, the control unit controls the second unmanned aerial vehicle to simultaneously acquire an overall image of the fan blade, the overall image including an image of the first unmanned aerial vehicle performing thermal imaging on the fan blade.
7. The drone controlling method for fan blade detection according to claim 5, characterized in that: in the step 1, the first unmanned aerial vehicle flies and shoots along the length direction parallel to the fan blade, and controls the distance between the thermal imaging camera and the fan blade to be 1-3 meters.
8. A drone control method for fan blade detection according to claim 5 or 6, characterised in that: the first unmanned aerial vehicle and the second unmanned aerial vehicle each further comprise an RTK positioning module, through which the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle can be obtained; when the first unmanned aerial vehicle and the second unmanned aerial vehicle perform thermal imaging image sampling and high-definition image sampling, the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle is marked in the thermal imaging image and the high-definition image; and in step 4, marking the fan blade defect position identified from the thermal imaging image in step 2 on the high-definition image of the fan blade further comprises uniformly adjusting and registering the scales of the thermal imaging image and the high-definition image according to the position information in the thermal imaging image and the high-definition image.
9. The drone control method for fan blade detection according to claim 5, wherein in step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first drone, which in particular comprises the following steps:
step 21, carrying out image augmentation processing on the thermal imaging image to increase the data volume of the thermal imaging image;
step 22, performing binarization processing on the thermal imaging image subjected to the image augmentation processing in step 21, and then eliminating Gaussian noise in the thermal imaging image through Gaussian filtering processing; specifically, the two-dimensional Gaussian distribution G(x, y) = (1/(2πσ²))·exp(−(x² + y²)/(2σ²)) is used to perform Gaussian filtering processing on the characteristic variables x and y in the thermal imaging image subjected to the image augmentation processing in step 21, wherein σ is an adjustable parameter whose size determines the width of the Gaussian function, G(x, y) represents the Gaussian function of x and y, x represents the gray value of a pixel point, and y represents the gray value calculated through the Gaussian function; linear gray scale enhancement is then performed on the image after Gaussian filtering, specifically by using the gray scale linear transformation function g(x, y) = ((d − c)/(b − a))·[f(x, y) − a] + c to perform linear gray scale enhancement processing on the characteristic variables x and y in the image after Gaussian filtering, wherein g(x, y) represents the linear transformation function of (x, y) and its gray scale range is [c, d]; the Gaussian function processes the noise of the image, the linear transformation function linearly stretches the pixels in the image, f(x, y) is the gray value of the thermal imaging image before conversion by the linear transformation function, and its gray scale range is [a, b];
step 23, further performing image sharpening on the image processed in step 22 by using the Scharr operator to obtain an image enhancement model, wherein the edge detection filter of the Scharr operator has a size of 3 × 3, so that the Scharr operator is also called the Scharr filter; the difference between pixel values can be increased by amplifying the weight coefficients in the Scharr operator, which is the idea adopted by the Scharr operator; the edge detection operators used in the X direction and the Y direction of the binarized image are Gx = [−3 0 3; −10 0 10; −3 0 3] and Gy = [−3 −10 −3; 0 0 0; 3 10 3].
10. The drone control method for fan blade detection according to claim 5, wherein in step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first drone, which in particular comprises the following steps:
a fan blade extraction step: acquiring a thermal imaging image of the fan blade to be detected, performing binarization processing on the thermal imaging image, and dividing the binarized thermal imaging image into N × N grids, wherein N can be adjusted according to design requirements and precision requirements, the principle being that the size of each divided grid is not smaller than one pixel point; then comparing the middle gray value of each grid with a gray judgment threshold, set according to a prior threshold, for grids belonging to the fan blade edge region, so as to determine all grids belonging to the fan blade edge region in the thermal imaging image, wherein the middle gray value of a grid refers to the gray value that is the median of, or has the largest proportion among, all the gray values contained in the grid; and removing the background outside the edge region in the thermal imaging image according to the position of the fan blade edge region, so as to obtain the fan blade image in the image;
a fan blade edge determining step: performing gray value derivation on the edge region of the fan blade image extracted in the fan blade extraction step, determining pixel points at the fan blade edge according to the gray value derivation result, and extracting the fan blade edge; specifically, gray value derivation processing is performed on the fan blade image pixel by pixel, and the gray value derivation result is compared with a gray value derivation judgment threshold for fan blade edge pixel points set according to a prior threshold, so as to determine all pixel points belonging to the fan blade edge in the thermal imaging image, which together form the fan blade edge in the thermal imaging image; a two-dimensional coordinate system is established for the fan blade image and each pixel point in the thermal imaging image is expressed by two-dimensional coordinates (x, y); since an image has an x direction and a y direction (also called (x, y) coordinates) and a derivative can only be taken in one direction, derivatives in the two directions (x, y) are established for the binarized pixel points, and a point is judged as an edge point when the derivative in either direction exceeds a set value; the gradient formula M(x, y) = |∇f| = √((∂f/∂x)² + (∂f/∂y)²) is then used to calculate the modulus of the gradient of each pixel point, which correspondingly gives the rate of change of the gray value of each pixel point, wherein M(x, y) is the length of the overall gradient ∇f, and ∂f/∂x and ∂f/∂y represent derivation in the x direction and in the y direction respectively; the rate of change of the gray value is the result of the gray value derivation, and correspondingly, the gray value derivation judgment threshold is a set rate-of-change value;
an image enhancement step: performing neighborhood convolution processing by means of the Prewitt operator on the fan blade edge of the thermal imaging image extracted in the fan blade extraction step, so as to perform extreme value detection on the fan blade edge and remove false edges in the fan blade edge of the thermal imaging image extracted in the fan blade extraction step, specifically: each pixel point at the fan blade edge of the thermal imaging image extracted in the fan blade extraction step is processed according to the gradient value calculation formula P(x_i, y_j) = max{G(x_i), G(y_j)} or P(x_i, y_j) = G(x_i) + G(y_j), the gray difference between each pixel point and its upper, lower, left and right adjacent pixel points is calculated, and the extreme value detection of the edge pixel points is completed, wherein
G(x_i) = |[f(x_{i-1}, y_{j-1}) + f(x_{i-1}, y_j) + f(x_{i-1}, y_{j+1})] − [f(x_{i+1}, y_{j-1}) + f(x_{i+1}, y_j) + f(x_{i+1}, y_{j+1})]|
G(y_j) = |[f(x_{i-1}, y_{j+1}) + f(x_i, y_{j+1}) + f(x_{i+1}, y_{j+1})] − [f(x_{i-1}, y_{j-1}) + f(x_i, y_{j-1}) + f(x_{i+1}, y_{j-1})]|
i represents the ith row in the two-dimensional coordinate system, and j represents the jth column in the two-dimensional coordinate system;
and obtaining the image enhancement model.
Background
Wind energy is an important renewable energy source, and with the expansion of the wind energy market in China, the fan manufacturing industry has gradually entered a period of high-speed development. The service life and safety of wind turbines affect the pace of wind power utilization and development, and the fan blade is a core component of the wind turbine, so its service life and safety directly affect the service life and safety of the whole wind turbine unit. Because the operating environment of a wind farm is complex and the fan blades operate at high altitude all day long, they are exposed for long periods to factors such as wind-blown sand, pollution, lightning strikes and typhoons; defects therefore arise easily in the fan blades and gradually expand. If these defects cannot be found in time, the load and the stiffness matrix of the blade are directly affected, which finally reduces the service life and operation safety of the blade.
In the prior art, fan blades are inspected by flying an unmanned aerial vehicle above a designated wind turbine and taking photographs, and the images shot by the unmanned aerial vehicle are then exported for checking and analysis. This mode has a high labor cost for processing and reduces detection efficiency and identification accuracy. In addition, because of the manual control, the unmanned aerial vehicle easily strikes the wind turbine and is damaged, and the flight of the unmanned aerial vehicle may increase the degree of damage to the fan blade surface.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to provide an unmanned aerial vehicle system for fan blade detection which can improve detection efficiency and identification accuracy, reduce the risk of collision between the unmanned aerial vehicle and the fan blade, and prolong the service lives of the unmanned aerial vehicle and the fan blade.
The invention provides an unmanned aerial vehicle system for fan blade detection, which comprises a control unit, a communication unit, a first unmanned aerial vehicle and a second unmanned aerial vehicle, wherein the first unmanned aerial vehicle is used for sampling a thermal imaging image of a detected fan blade and the second unmanned aerial vehicle is used for sampling a high-definition image of the detected fan blade;
the first unmanned aerial vehicle comprises a controller and a thermal imaging camera in control connection with the controller; the controller receives an unmanned aerial vehicle control signal and a thermal imaging sampling signal from the control unit through the communication unit, and controls the first unmanned aerial vehicle according to the unmanned aerial vehicle control signal and the thermal imaging sampling signal to fly to an appointed position and perform thermal imaging image sampling of the detected fan blade;
the second unmanned aerial vehicle comprises a controller and a high-definition imaging camera which is in control connection with the controller, the controller receives the unmanned aerial vehicle control signal and the high-definition imaging sampling signal of the control unit through the communication unit, and controls the second unmanned aerial vehicle to fly to the designated position according to the unmanned aerial vehicle control signal and the high-definition imaging sampling signal so as to perform high-definition image sampling on the detected fan blade;
the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, identifies the defect position of the fan blade, and sends a control instruction to the controller of the second unmanned aerial vehicle to control the second unmanned aerial vehicle to fly to the designated position and perform high-definition image sampling of the defect position of the detected fan blade; the control unit further processes the high-definition image and marks the fan blade defect position identified from the thermal imaging image on the high-definition image.
Preferably, the first unmanned aerial vehicle flies and shoots along the length direction parallel to the fan blade, and controls the thermal imaging camera and the fan blade to be 1-3 meters apart from each other.
Preferably, the first unmanned aerial vehicle and the second unmanned aerial vehicle each further include an RTK positioning module, and the position information of the first unmanned aerial vehicle and the second unmanned aerial vehicle can be obtained by the RTK positioning module.
Preferably, the high-definition imaging camera is a high-definition camera or a dual-light camera mounted on a gimbal.
Corresponding to the above unmanned aerial vehicle system, the invention also provides an unmanned aerial vehicle control method for fan blade detection, which comprises the following steps:
step 1, the control unit sends a control instruction to the controller of the first unmanned aerial vehicle through the communication unit to control the first unmanned aerial vehicle to fly to an appointed height position; a thermal imaging sampling signal is then sent to the controller of the first unmanned aerial vehicle through the communication unit, the thermal imaging camera of the first unmanned aerial vehicle is aimed at the detected fan blade to carry out thermal imaging image sampling, and the collected thermal imaging image of the fan blade is sent to the control unit through the controller;
step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, identifies the defect position of the fan blade and generates a control instruction of a second unmanned aerial vehicle according to the defect position of the fan blade;
step 3, the control unit sends the control instruction generated in the step 2 to a controller of a second unmanned aerial vehicle through a communication unit to control the second unmanned aerial vehicle to fly to an appointed position, then a high-definition imaging sampling signal is sent to the controller of the second unmanned aerial vehicle through the communication unit, a high-definition imaging camera of the second unmanned aerial vehicle aims at the defect position of the detected fan blade to perform high-definition image sampling, and the acquired high-definition imaging image of the fan blade is sent to the control unit through the controller;
step 4, the control unit processes the high-definition image acquired in step 3 and marks the fan blade defect position identified from the thermal imaging image in step 2 on the high-definition image of the fan blade.
Preferably, in step 1, when the first unmanned aerial vehicle samples the thermal imaging image of the detected fan blade, the control unit controls the second unmanned aerial vehicle to simultaneously acquire an overall image of the fan blade, the overall image including an image of the first unmanned aerial vehicle performing thermal imaging on the fan blade.
Furthermore, the first unmanned aerial vehicle and the second unmanned aerial vehicle each further comprise an RTK positioning module, through which the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle can be obtained. When the first unmanned aerial vehicle and the second unmanned aerial vehicle perform thermal imaging image sampling and high-definition image sampling, the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle is marked in the thermal imaging image and the high-definition image. In step 4, marking the fan blade defect position identified from the thermal imaging image in step 2 on the high-definition image of the fan blade further comprises uniformly adjusting and registering the scales of the thermal imaging image and the high-definition image according to the position information in the thermal imaging image and the high-definition image.
More specifically, in step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, which specifically includes the following steps:
Step 1, performing image augmentation processing on the thermal imaging image to increase the data volume of the thermal imaging image. Specifically, in step 1, the image augmentation processing of the thermal imaging image includes flipping, cropping and color transformation of the thermal imaging image, which improves image quality, facilitates calculation and analysis, and improves the generalization ability of the model while preventing the overfitting that would otherwise reduce detection precision. Preferably, the flipping of the thermal imaging image means that the thermal imaging image is flipped left-right and up-down; the color transformation processing transforms the color saturation of the same thermal imaging image to form a plurality of thermal imaging images with different color saturations; and the cropping crops the thermal imaging image at different locations, such as the lower-right or upper-left portion.
Step 2, performing binarization processing on the thermal imaging image subjected to the image augmentation processing in step 1, then eliminating Gaussian noise in the thermal imaging image through Gaussian filtering processing, and then performing linear gray level enhancement processing on the image subjected to the Gaussian filtering processing. Gaussian filtering is a linear smoothing filter, is suitable for eliminating Gaussian noise, and is widely applied in the noise reduction stage of image processing; in plain terms, Gaussian filtering performs a weighted average over the whole image, and the value of each pixel point is obtained as the weighted average of its own value and the values of the other pixels in its neighborhood. Given a black-and-white picture, the picture may not span the full black-and-white range: the pixel value of the blackest pixel of the whole picture may not reach 0, and the pixel value of the whitest pixel may not reach 255. In other words, the pixel values of the black-and-white picture are not spread out but concentrated in a certain range, so linear stretching of the pixel values is required. Therefore, in step 2, the linear gray scale enhancement processing adjusts the pixel value of the blackest pixel in the image after Gaussian filtering to 0, adjusts the pixel value of the whitest pixel to 255, and adjusts the pixel values of the other pixels linearly according to the ratio of their darkness to that of the blackest and/or whitest pixels, where the ratio and the linearity can be set and adjusted according to a prior threshold.
Preferably, the Gaussian filtering in step 2 is performed specifically by using the two-dimensional Gaussian distribution G(x, y) = (1/(2πσ²))·exp(−(x² + y²)/(2σ²)) to carry out Gaussian filtering processing on the characteristic variables x and y in the thermal imaging image subjected to the image augmentation processing in step 1, wherein σ is an adjustable parameter whose size determines the width of the Gaussian function, G(x, y) represents the Gaussian function of x and y, x represents the gray value of a pixel point, and y represents the gray value calculated through the Gaussian function.
Further, the linear gray scale enhancement processing in step 2 is performed specifically by using the gray scale linear transformation function g(x, y) = ((d − c)/(b − a))·[f(x, y) − a] + c to carry out linear gray scale enhancement processing on the characteristic variables x and y in the image after Gaussian filtering, wherein g(x, y) represents the linear transformation function of (x, y) and its gray scale range is [c, d]; the Gaussian function processes the noise of the image, the linear transformation function linearly stretches the pixels in the image, f(x, y) is the gray value of the thermal imaging image before conversion by the linear transformation function, and its gray scale range is [a, b].
Step 3, further performing image sharpening on the image processed in step 2 to obtain an image enhancement model. Preferably, in step 3 the Scharr operator is used for image sharpening, which highlights the edge details of the image more strongly and thereby improves calculation accuracy. In order to extract weak edges effectively, the difference between pixel values needs to be increased, and the Scharr operator is therefore introduced; the Scharr operator enhances the difference used by the Sobel operator, so the principle of detecting image edges and the manner of use are the same for the two. The edge detection filter of the Scharr operator has a size of 3 × 3, so the Scharr operator is also called the Scharr filter; the difference between pixel values can be increased by amplifying the weight coefficients in the Scharr operator, which is the idea adopted by the Scharr operator. The edge detection operators used in the X direction and the Y direction of the binarized image are Gx = [−3 0 3; −10 0 10; −3 0 3] and Gy = [−3 −10 −3; 0 0 0; 3 10 3].
Alternatively, in step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle as follows:
A fan blade extraction step: acquiring a thermal imaging image of the fan blade to be detected, in which the fan blade must be present and clear; segmenting the thermal imaging image, extracting the fan blade edge region in the thermal imaging image, and extracting the fan blade image in the thermal imaging image according to the fan blade edge region;
preferably, in the fan blade extracting step, the fan blade in the thermal imaging image is extracted according to the edge area of the fan blade, and the background except the edge area in the thermal imaging image is removed according to the position of the edge area of the fan blade, so that the fan blade image in the image is obtained.
Specifically, in the fan blade extraction step, after the thermal imaging image of the fan blade to be detected is obtained, binarization processing is performed on the thermal imaging image and the binarized thermal imaging image is divided into N × N grids (this step can also be performed first), where N can be adjusted according to design requirements and precision requirements, the principle being that the size of each divided grid is not smaller than one pixel point; then the middle gray value of each grid is compared with a gray judgment threshold, set according to a prior threshold, for grids belonging to the fan blade edge region, so as to determine all grids belonging to the fan blade edge region in the thermal imaging image, where the middle gray value of a grid refers to the gray value that is the median of, or has the largest proportion among, all the gray values contained in the grid.
And determining the edge of the fan blade, namely performing gray value derivation on the edge area of the fan blade image extracted in the fan blade extraction step, determining pixel points of the edge of the fan blade according to a gray value derivation result, and extracting the edge of the fan blade.
Specifically, in the fan blade edge determining step, firstly, binarization processing is performed on a fan blade image, then gray value derivation processing is performed on pixels of the binarized image one by one, and a gray value derivation result is compared with a gray value derivation judgment threshold of fan blade edge pixels set according to a priori threshold, so that all pixels belonging to the fan blade edge in the thermal imaging image are judged, and the fan blade edge in the thermal imaging image is formed.
Furthermore, the derivation of the gray value of the thermal imaging image in the fan blade edge determining step is performed specifically as follows: after binarization processing is performed on the thermal imaging image, a two-dimensional coordinate system is established for the binarized thermal imaging image, and each pixel point in the thermal imaging image is represented by two-dimensional coordinates (x, y); since an image has an x direction and a y direction (also called (x, y) coordinates) and a derivative can only be taken in one direction, derivatives in the two directions (x, y) are established for the binarized pixel points, and a point can be judged as an edge point when the derivative in either direction exceeds a set value; the gradient formula M(x, y) = |∇f| = √((∂f/∂x)² + (∂f/∂y)²) is then used to calculate the modulus of the gradient of each pixel point, which correspondingly gives the rate of change of the gray value of each pixel point, wherein M(x, y) is the length of the overall gradient ∇f, and ∂f/∂x and ∂f/∂y represent derivation in the x direction and in the y direction respectively; the rate of change of the gray value is the result of the gray value derivation, and correspondingly, the gray value derivation judgment threshold is a set rate-of-change value.
Of course, in order to save computing power, reduce implementation cost, or for use scenarios with less demanding precision requirements, in the fan blade edge determining step the binarization processing may instead be performed on the fan blade image and the binarized thermal imaging image divided into N × N grids (this step can also be performed first), where N can be adjusted according to design requirements and precision requirements, the principle being that the size of each divided grid is not smaller than one pixel point; then the middle gray value of each grid is derived, and the gray value derivation result is compared with a gray derivative judgment threshold for fan blade edge grids set according to a prior threshold, so as to determine all grids belonging to the fan blade edge in the thermal imaging image; all the determined grids together enclose the fan blade edge in the thermal imaging image, and the middle gray value of a grid refers to the gray value that is the median of, or has the largest proportion among, all the gray values contained in the grid.
And an image enhancement step, namely performing neighborhood convolution processing on the fan blade edge of the thermal imaging image extracted in the fan blade edge determination step, calculating the gray difference between each position of the fan blade edge and the upper, lower, left and right adjacent positions of the fan blade edge, realizing the extreme value detection of the fan blade edge, and removing the false edge in the fan blade edge of the thermal imaging image extracted in the fan blade extraction step to obtain an image enhancement model.
In the image enhancement step, neighborhood convolution processing is performed by means of the Prewitt operator on the fan blade edge of the thermal imaging image extracted in the fan blade extraction step. The Prewitt operator is a first-order differential edge detection operator: it uses the gray differences between a pixel point and its upper, lower, left and right adjacent points to perform extreme value detection at the edge, removes part of the false edges, and has a smoothing effect on noise. Its principle is to perform neighborhood convolution on the image in image space using two direction templates, one of which detects horizontal edges and the other vertical edges, as follows:
each pixel point at the fan blade edge of the thermal imaging image extracted in the fan blade extraction step is processed according to the gradient value calculation formula P(x_i, y_j) = max{G(x_i), G(y_j)} or P(x_i, y_j) = G(x_i) + G(y_j); the gray difference between each pixel point and its upper, lower, left and right adjacent pixel points is calculated, and the extreme value detection of the edge pixel points is completed, wherein
G(x_i) = |[f(x_{i-1}, y_{j-1}) + f(x_{i-1}, y_j) + f(x_{i-1}, y_{j+1})] − [f(x_{i+1}, y_{j-1}) + f(x_{i+1}, y_j) + f(x_{i+1}, y_{j+1})]|
G(y_j) = |[f(x_{i-1}, y_{j+1}) + f(x_i, y_{j+1}) + f(x_{i+1}, y_{j+1})] − [f(x_{i-1}, y_{j-1}) + f(x_i, y_{j-1}) + f(x_{i+1}, y_{j-1})]|
i denotes the ith row in the two-dimensional coordinate system, and j denotes the jth column in the two-dimensional coordinate system.
Compared with the prior art, the technical scheme of the invention at least has the following advantages:
In the above scheme, the first unmanned aerial vehicle and the second unmanned aerial vehicle cooperate with each other, which ensures flight safety during flight, reduces the risk of collision between an unmanned aerial vehicle and the fan blade, and prolongs the service lives of the unmanned aerial vehicles and the fan blade. After the first unmanned aerial vehicle shoots images of the fan blade, the acquired images are transmitted through the communication connection to the control unit for analysis and storage, and corresponding control instructions for the second unmanned aerial vehicle are generated; this ensures that no data are lost even if the first unmanned aerial vehicle fails or is damaged, and also allows the second unmanned aerial vehicle to be controlled to perform targeted high-definition image sampling of the defect position.
According to the scheme, the thermal imaging image of the fan blade shot by the first unmanned aerial vehicle is analyzed and processed by the control unit; when a defect is found in the image sent by the first unmanned aerial vehicle, the control unit further sends a control instruction to the second unmanned aerial vehicle through the communication unit and controls its high-definition camera to acquire a high-definition image of the same position of the fan blade, so that the defect position is confirmed again by repeated shooting and the detection accuracy of the fan blade is improved.
Drawings
The foregoing and following detailed description of the invention will be apparent when read in conjunction with the following drawings, in which:
fig. 1 is a schematic structural diagram of an unmanned aerial vehicle system for detecting a fan blade according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings of the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention.
In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The present invention will be described in detail with reference to the following examples.
The invention provides an unmanned aerial vehicle system for detecting a fan blade, which can improve the detection efficiency and the identification accuracy, reduce the risk of collision between an unmanned aerial vehicle and the fan blade and prolong the service life of the unmanned aerial vehicle and the fan blade.
Referring to fig. 1, a schematic structural diagram of an unmanned aerial vehicle system for detecting a fan blade according to this embodiment is shown.
In this embodiment, the unmanned aerial vehicle system includes:
the system comprises a control unit, a communication unit, a first unmanned aerial vehicle and a second unmanned aerial vehicle, wherein the first unmanned aerial vehicle is used for sampling thermal imaging images of the detected fan blade and the second unmanned aerial vehicle is used for sampling high-definition images of the detected fan blade.
The first unmanned aerial vehicle comprises a controller and a thermal imaging camera connected with the controller in a control mode, the controller receives an unmanned aerial vehicle control signal and a thermal imaging sampling signal of the control unit through the communication unit, and controls the first unmanned aerial vehicle to fly to an appointed position to carry out thermal imaging image sampling on a detected fan blade according to the unmanned aerial vehicle control signal and the thermal imaging sampling signal.
The second unmanned aerial vehicle comprises a controller and a high-definition imaging camera in control connection with the controller; the controller receives the unmanned aerial vehicle control signal and the high-definition imaging sampling signal of the control unit through the communication unit, and controls the second unmanned aerial vehicle according to these signals to fly to the designated position and perform high-definition image sampling of the detected fan blade.
The control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, identifies the defect position of the fan blade, and sends a control instruction to the controller of the second unmanned aerial vehicle to control the second unmanned aerial vehicle to fly to the designated position and perform high-definition image sampling of the defect position of the detected fan blade; the control unit further processes the high-definition image and marks the fan blade defect position identified from the thermal imaging image on the high-definition image.
The control unit detects the thermal imaging image according to a preset defect detection model; when a defect is detected in the thermal imaging image, the control processing module further sends a control instruction to the second unmanned aerial vehicle through the communication unit and controls the high-definition camera to acquire a high-definition image of the fan blade at a specified position, or at the same position as that sampled by the first unmanned aerial vehicle. The control unit is further used for marking the thermal imaging image and the corresponding high-definition image and storing them in the memory.
The control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle, which specifically includes the following steps:
Step 1, performing image augmentation processing on the thermal imaging image to increase the data volume of the thermal imaging image. Specifically, in step 1, the image augmentation processing of the thermal imaging image includes flipping, cropping and color transformation of the thermal imaging image, which improves image quality, facilitates calculation and analysis, and improves the generalization ability of the model while preventing the overfitting that would otherwise reduce detection precision. Preferably, the flipping of the thermal imaging image means that the thermal imaging image is flipped left-right and up-down; the color transformation processing transforms the color saturation of the same thermal imaging image to form a plurality of thermal imaging images with different color saturations; and the cropping crops the thermal imaging image at different locations, such as the lower-right or upper-left portion.
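For illustration only, a minimal Python sketch of this augmentation step is given below; it is not part of the claimed method. The use of NumPy, the particular crop positions and the gain values used to emulate different color saturations are assumptions of the sketch.

```python
import numpy as np

def augment_thermal(image: np.ndarray) -> list[np.ndarray]:
    """Return flipped, cropped and saturation-scaled variants of one thermal image."""
    h, w = image.shape[:2]
    variants = [
        image,
        np.fliplr(image),               # left-right flip
        np.flipud(image),               # up-down flip
        image[: h // 2, : w // 2],      # upper-left crop (example position)
        image[h // 2 :, w // 2 :],      # lower-right crop (example position)
    ]
    # Color (intensity) transform: scale gray levels to emulate different saturations.
    for gain in (0.8, 1.2):             # example gains, not prescribed by the disclosure
        scaled = np.clip(image.astype(np.float32) * gain, 0, 255)
        variants.append(scaled.astype(np.uint8))
    return variants
```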
Step 2, performing binarization processing on the thermal imaging image subjected to the image augmentation processing in step 1, then eliminating Gaussian noise in the thermal imaging image through Gaussian filtering processing, and then performing linear gray level enhancement processing on the image subjected to the Gaussian filtering processing. Gaussian filtering is a linear smoothing filter, is suitable for eliminating Gaussian noise, and is widely applied in the noise reduction stage of image processing; in plain terms, Gaussian filtering performs a weighted average over the whole image, and the value of each pixel point is obtained as the weighted average of its own value and the values of the other pixels in its neighborhood. Given a black-and-white picture, the picture may not span the full black-and-white range: the pixel value of the blackest pixel of the whole picture may not reach 0, and the pixel value of the whitest pixel may not reach 255. In other words, the pixel values of the black-and-white picture are not spread out but concentrated in a certain range, so linear stretching of the pixel values is required. Therefore, in step 2, the linear gray scale enhancement processing adjusts the pixel value of the blackest pixel in the image after Gaussian filtering to 0, adjusts the pixel value of the whitest pixel to 255, and adjusts the pixel values of the other pixels linearly according to the ratio of their darkness to that of the blackest and/or whitest pixels, where the ratio and the linearity can be set and adjusted according to a prior threshold.
Preferably, the Gaussian filtering in step 2 is performed specifically by using the two-dimensional Gaussian distribution G(x, y) = (1/(2πσ²))·exp(−(x² + y²)/(2σ²)) to carry out Gaussian filtering processing on the characteristic variables x and y in the thermal imaging image subjected to the image augmentation processing in step 1, wherein σ is an adjustable parameter whose size determines the width of the Gaussian function, G(x, y) represents the Gaussian function of x and y, x represents the gray value of a pixel point, and y represents the gray value calculated through the Gaussian function.
Further, the linear gray scale enhancement processing in step 2 is performed specifically by using the gray scale linear transformation function g(x, y) = ((d − c)/(b − a))·[f(x, y) − a] + c to carry out linear gray scale enhancement processing on the characteristic variables x and y in the image after Gaussian filtering, wherein g(x, y) represents the linear transformation function of (x, y) and its gray scale range is [c, d]; the Gaussian function processes the noise of the image, the linear transformation function linearly stretches the pixels in the image, f(x, y) is the gray value of the thermal imaging image before conversion by the linear transformation function, and its gray scale range is [a, b].
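For illustration only, a minimal Python sketch of step 2 is given below, assuming NumPy and SciPy; the kernel size, σ, the binarization threshold and the output range [c, d] = [0, 255] are example values, not values prescribed by this disclosure, and the processing order follows the description above.

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Sample G(x, y) = 1/(2*pi*sigma^2) * exp(-(x^2 + y^2)/(2*sigma^2)) on a size x size grid."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)
    return g / g.sum()                   # normalize so the weighted average preserves brightness

def enhance(augmented: np.ndarray, bin_threshold: int = 128,
            c: float = 0.0, d: float = 255.0) -> np.ndarray:
    binary = np.where(augmented >= bin_threshold, 255.0, 0.0)        # binarization
    smoothed = convolve(binary, gaussian_kernel(), mode="nearest")   # Gaussian filtering
    a, b = float(smoothed.min()), float(smoothed.max())              # input gray range [a, b]
    if b == a:                                                        # uniform image: nothing to stretch
        return smoothed.astype(np.uint8)
    stretched = (d - c) / (b - a) * (smoothed - a) + c               # g(x, y) = (d-c)/(b-a)*(f(x, y)-a)+c
    return stretched.astype(np.uint8)
```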
Step 3, further performing image sharpening on the image processed in step 2 to obtain an image enhancement model. Preferably, in step 3 the Scharr operator is used for image sharpening, which highlights the edge details of the image more strongly and thereby improves calculation accuracy. In order to extract weak edges effectively, the difference between pixel values needs to be increased, and the Scharr operator is therefore introduced; the Scharr operator enhances the difference used by the Sobel operator, so the principle of detecting image edges and the manner of use are the same for the two. The edge detection filter of the Scharr operator has a size of 3 × 3, so the Scharr operator is also called the Scharr filter; the difference between pixel values can be increased by amplifying the weight coefficients in the Scharr operator, which is the idea adopted by the Scharr operator. The edge detection operators used in the X direction and the Y direction of the binarized image are Gx = [−3 0 3; −10 0 10; −3 0 3] and Gy = [−3 −10 −3; 0 0 0; 3 10 3].
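For illustration only, a minimal Python sketch of the Scharr sharpening in step 3 is given below, assuming NumPy/SciPy; it convolves the binarized image with the standard 3 × 3 Scharr templates in the X and Y directions and combines the two responses. The output scaling is an assumption of the sketch.

```python
import numpy as np
from scipy.ndimage import convolve

SCHARR_X = np.array([[-3, 0, 3],
                     [-10, 0, 10],
                     [-3, 0, 3]], dtype=np.float32)   # X-direction Scharr template
SCHARR_Y = SCHARR_X.T                                  # Y-direction Scharr template

def scharr_sharpen(binary: np.ndarray) -> np.ndarray:
    """Edge-sharpen a binarized image using the Scharr operator in both directions."""
    f = binary.astype(np.float32)
    gx = convolve(f, SCHARR_X, mode="nearest")   # X-direction edge response
    gy = convolve(f, SCHARR_Y, mode="nearest")   # Y-direction edge response
    mag = np.sqrt(gx ** 2 + gy ** 2)             # combined edge strength
    return np.clip(mag, 0, 255).astype(np.uint8)
```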
Alternatively, in step 2, the control unit processes the thermal imaging image of the detected fan blade acquired by the first unmanned aerial vehicle as follows:
A fan blade extraction step: acquiring a thermal imaging image of the fan blade to be detected, in which the fan blade must be present and clear; segmenting the thermal imaging image, extracting the fan blade edge region in the thermal imaging image, and extracting the fan blade image in the thermal imaging image according to the fan blade edge region;
preferably, in the fan blade extracting step, the fan blade in the thermal imaging image is extracted according to the edge area of the fan blade, and the background except the edge area in the thermal imaging image is removed according to the position of the edge area of the fan blade, so that the fan blade image in the image is obtained.
Specifically, in the fan blade extraction step, after the thermal imaging image of the fan blade to be detected is obtained, binarization processing is performed on the thermal imaging image and the binarized thermal imaging image is divided into N × N grids (this step can also be performed first), where N can be adjusted according to design requirements and precision requirements, the principle being that the size of each divided grid is not smaller than one pixel point; then the middle gray value of each grid is compared with a gray judgment threshold, set according to a prior threshold, for grids belonging to the fan blade edge region, so as to determine all grids belonging to the fan blade edge region in the thermal imaging image, where the middle gray value of a grid refers to the gray value that is the median of, or has the largest proportion among, all the gray values contained in the grid.
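For illustration only, a minimal Python sketch of this grid-based blade extraction is given below, assuming NumPy; the grid count N, the use of the median as the middle gray value, and the edge-region gray threshold are example choices standing in for the prior threshold mentioned above.

```python
import numpy as np

def extract_blade(binary: np.ndarray, n: int = 32, edge_thresh: int = 200) -> np.ndarray:
    """Keep grids whose middle gray value exceeds the prior threshold; zero out the background."""
    h, w = binary.shape
    gh, gw = h // n, w // n                                  # cell size of the N x N grid
    mask = np.zeros_like(binary)
    for i in range(n):
        for j in range(n):
            cell = binary[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            if cell.size == 0:
                continue
            mid_gray = np.median(cell)                       # "middle" gray value of the cell
            if mid_gray >= edge_thresh:                      # compare with the prior threshold
                mask[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw] = 1
    return binary * mask                                     # blade region kept, background removed
```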
And determining the edge of the fan blade, namely performing gray value derivation on the edge area of the fan blade image extracted in the fan blade extraction step, determining pixel points of the edge of the fan blade according to a gray value derivation result, and extracting the edge of the fan blade.
Specifically, in the fan blade edge determining step, firstly, binarization processing is performed on a fan blade image, then gray value derivation processing is performed on pixels of the binarized image one by one, and a gray value derivation result is compared with a gray value derivation judgment threshold of fan blade edge pixels set according to a priori threshold, so that all pixels belonging to the fan blade edge in the thermal imaging image are judged, and the fan blade edge in the thermal imaging image is formed.
Furthermore, in the fan blade edge determining step, the derivation of the gray value of the thermal imaging image is performed specifically as follows: after binarization processing is performed on the thermal imaging image, a two-dimensional coordinate system is established for the binarized thermal imaging image, and each pixel point in the thermal imaging image is represented by two-dimensional coordinates (x, y); since an image has an x direction and a y direction (also called (x, y) coordinates) and a derivative can only be taken in one direction, derivatives in the two directions (x, y) are established for the binarized pixel points, and a point can be judged as an edge point when the derivative in either direction exceeds a set value; the gradient formula M(x, y) = |∇f| = √((∂f/∂x)² + (∂f/∂y)²) is then used to calculate the modulus of the gradient of each pixel point, which correspondingly gives the rate of change of the gray value of each pixel point, wherein M(x, y) is the length of the overall gradient ∇f, and ∂f/∂x and ∂f/∂y represent derivation in the x direction and in the y direction respectively; the rate of change of the gray value is the result of the gray value derivation, and correspondingly, the gray value derivation judgment threshold is a set rate-of-change value.
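For illustration only, a minimal Python sketch of this pixel-wise gray value derivation is given below, assuming NumPy; the rate-of-change threshold is an example of the prior judgment threshold and not a value fixed by the disclosure.

```python
import numpy as np

def blade_edge(blade_img: np.ndarray, rate_threshold: float = 50.0) -> np.ndarray:
    """Mark pixels whose gray-value rate of change exceeds the judgment threshold as edge points."""
    f = blade_img.astype(np.float32)
    dfdx = np.gradient(f, axis=1)                     # derivative in the x direction
    dfdy = np.gradient(f, axis=0)                     # derivative in the y direction
    m = np.sqrt(dfdx ** 2 + dfdy ** 2)                # M(x, y) = sqrt((df/dx)^2 + (df/dy)^2)
    return (m > rate_threshold).astype(np.uint8)      # 1 = fan blade edge pixel, 0 = non-edge
```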
Of course, in order to save computing power, reduce implementation cost, or for use scenarios with less demanding precision requirements, in the fan blade edge determining step the binarization processing may instead be performed on the fan blade image and the binarized thermal imaging image divided into N × N grids (this step can also be performed first), where N can be adjusted according to design requirements and precision requirements, the principle being that the size of each divided grid is not smaller than one pixel point; then the middle gray value of each grid is derived, and the gray value derivation result is compared with a gray derivative judgment threshold for fan blade edge grids set according to a prior threshold, so as to determine all grids belonging to the fan blade edge in the thermal imaging image; all the determined grids together enclose the fan blade edge in the thermal imaging image, and the middle gray value of a grid refers to the gray value that is the median of, or has the largest proportion among, all the gray values contained in the grid.
And an image enhancement step, namely performing neighborhood convolution processing on the fan blade edge of the thermal imaging image extracted in the fan blade edge determination step, calculating the gray difference between each position of the fan blade edge and the upper, lower, left and right adjacent positions of the fan blade edge, realizing the extreme value detection of the fan blade edge, and removing the false edge in the fan blade edge of the thermal imaging image extracted in the fan blade extraction step to obtain an image enhancement model.
In the image enhancement step, neighborhood convolution processing is performed by means of the Prewitt operator on the fan blade edge of the thermal imaging image extracted in the fan blade extraction step. The Prewitt operator is a first-order differential edge detection operator: it uses the gray differences between a pixel point and its upper, lower, left and right adjacent points to perform extreme value detection at the edge, removes part of the false edges, and has a smoothing effect on noise. Its principle is to perform neighborhood convolution on the image in image space using two direction templates, one of which detects horizontal edges and the other vertical edges, as follows:
each pixel point at the fan blade edge of the thermal imaging image extracted in the fan blade extraction step is processed according to the gradient value calculation formula P(x_i, y_j) = max{G(x_i), G(y_j)} or P(x_i, y_j) = G(x_i) + G(y_j); the gray difference between each pixel point and its upper, lower, left and right adjacent pixel points is calculated, and the extreme value detection of the edge pixel points is completed, wherein
G(x_i) = |[f(x_{i-1}, y_{j-1}) + f(x_{i-1}, y_j) + f(x_{i-1}, y_{j+1})] − [f(x_{i+1}, y_{j-1}) + f(x_{i+1}, y_j) + f(x_{i+1}, y_{j+1})]|
G(y_j) = |[f(x_{i-1}, y_{j+1}) + f(x_i, y_{j+1}) + f(x_{i+1}, y_{j+1})] − [f(x_{i-1}, y_{j-1}) + f(x_i, y_{j-1}) + f(x_{i+1}, y_{j-1})]|
i denotes the ith row in the two-dimensional coordinate system, and j denotes the jth column in the two-dimensional coordinate system.
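For illustration only, a minimal Python sketch of the Prewitt-based extreme value detection is given below, assuming NumPy/SciPy. The two 3 × 3 templates reproduce |G(x_i)| and |G(y_j)| above (up to the axis convention of the image array), and the retention threshold used to discard weak false edges is an example value.

```python
import numpy as np
from scipy.ndimage import convolve

PREWITT_A = np.array([[ 1, 0, -1],
                      [ 1, 0, -1],
                      [ 1, 0, -1]], dtype=np.float32)   # difference of neighbors along one axis
PREWITT_B = PREWITT_A.T                                  # difference of neighbors along the other axis

def prewitt_clean(edge_img: np.ndarray, keep_thresh: float = 100.0) -> np.ndarray:
    """Keep edge pixels with a strong Prewitt response; drop weak (false) edge responses."""
    f = edge_img.astype(np.float32)
    g1 = np.abs(convolve(f, PREWITT_A, mode="nearest"))  # |G(x_i)| (up to axis convention)
    g2 = np.abs(convolve(f, PREWITT_B, mode="nearest"))  # |G(y_j)| (up to axis convention)
    p = np.maximum(g1, g2)                               # P = max{G(x_i), G(y_j)}; sum form also possible
    return np.where(p >= keep_thresh, edge_img, 0)
```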
Preferably, the high-definition camera is a high-definition camera or a dual-light camera mounted on a gimbal. The gimbal, arranged on the unmanned aerial vehicle carrying the high-definition camera, is used for mounting and fixedly supporting the high-definition camera. The gimbal is a three-axis gimbal for mirrorless cameras driven by 4108 motors, so that a Sony or other mirrorless camera can be hung from it; its maximum load is 800 g, it provides three-axis stabilization during flight, and it avoids blurred pictures and similar problems. The self-stability of the high-definition camera is thus ensured, the image stabilization of shooting is improved, and the rotation of the high-definition camera on the gimbal in the spatial directions is controlled. The first unmanned aerial vehicle also comprises a first battery module used for supplying power to the first unmanned aerial vehicle; the first battery module comprises two 6S 10000 mAh batteries and supports hot swapping. The second camera is a high-definition camera, a thermal imaging camera or a dual-light camera.
In this embodiment, the first unmanned aerial vehicle is mainly responsible for shooting thermal imaging image information of the fan blade with the infrared camera it carries, that is, shooting a thermal imaging image of the fan blade with the thermal imaging camera, so that the thermal imaging information can be displayed intuitively and defects such as skin peeling or large cracks on the surface of the fan blade can be found quickly. High-definition shooting is realized by the gimbal-mounted high-definition camera, which collects high-definition image information of the fan blade and finds fine defects such as tiny cracks through processing and analysis.
In this embodiment, the second unmanned aerial vehicle mainly carries a TX2 control processing module, which acquires the image information and gimbal position information captured by the first unmanned aerial vehicle, acquires the coordinate position information of the first unmanned aerial vehicle, analyzes and processes defect images, and controls the first unmanned aerial vehicle to repeat the shooting. It is in particular responsible for observing the surrounding environment, processing the images shot by the first unmanned aerial vehicle, outputting surface defect information, and controlling the first unmanned aerial vehicle to zoom in on the detail image at the defect position, thereby confirming the defect position and improving the detection precision of the fan blade.
In this scheme, the first unmanned aerial vehicle and the second unmanned aerial vehicle each further include an RTK positioning module, through which the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle can be obtained. When the first unmanned aerial vehicle and the second unmanned aerial vehicle perform thermal imaging image sampling and high-definition image sampling, the position information of the first unmanned aerial vehicle and of the second unmanned aerial vehicle is marked in the thermal imaging image and the high-definition image at the same time. In step 4, marking the fan blade defect position identified from the thermal imaging image in step 2 on the high-definition image of the fan blade further includes uniformly adjusting and registering the scales of the thermal imaging image and the high-definition image according to the position information carried in the thermal imaging image and the high-definition image, so that the position of the defect on the fan blade can be analyzed from the position information of the shooting position of the first unmanned aerial vehicle relative to the fan blade.
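For illustration only, a hypothetical Python sketch of transferring a defect position between the two images is given below. The disclosure only states that the image scales are unified from the recorded position information; the use of a ground-sampling-distance ratio, and the function and parameter names, are assumptions of the sketch.

```python
def map_defect(defect_xy: tuple[int, int],
               thermal_gsd_m: float, hd_gsd_m: float) -> tuple[int, int]:
    """Scale an (x, y) defect pixel from the thermal image to the high-definition image.

    thermal_gsd_m / hd_gsd_m: assumed meters-per-pixel of the two shots, derived
    from the RTK-recorded shooting positions (hypothetical interface).
    """
    scale = thermal_gsd_m / hd_gsd_m          # relative scale between the two images
    return int(defect_xy[0] * scale), int(defect_xy[1] * scale)
```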
Corresponding to the above unmanned aerial vehicle system, the invention further provides a control method for the unmanned aerial vehicle system for detecting fan blades, comprising the following steps (a minimal control-flow sketch is given after step 4):
step 1, the control unit sends a control instruction to the controller of the first unmanned aerial vehicle through the communication unit to control the first unmanned aerial vehicle to fly to a designated height position; a thermal imaging sampling signal is then sent to the controller through the communication unit, the thermal imaging camera of the first unmanned aerial vehicle is aimed at the detected fan blade to perform thermal imaging sampling, and the collected thermal image of the fan blade is sent to the control unit by the controller;
step 2, the control unit processes the thermal image of the detected fan blade collected by the first unmanned aerial vehicle, identifies the defect position of the fan blade, and generates a control instruction for the second unmanned aerial vehicle according to the defect position;
step 3, the control unit sends the control instruction generated in step 2 to the controller of the second unmanned aerial vehicle through the communication unit to control the second unmanned aerial vehicle to fly to the designated position; a high-definition imaging sampling signal is then sent to the controller of the second unmanned aerial vehicle through the communication unit, the high-definition camera of the second unmanned aerial vehicle is aimed at the defect position of the detected fan blade to perform high-definition image sampling, and the collected high-definition image of the fan blade is sent to the control unit by the controller;
step 4, the control unit processes the high-definition image collected in step 3 and marks the fan blade defect position identified from the thermal image in step 2 on the high-definition image of the fan blade.
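The control-flow sketch below summarizes steps 1 to 4 end to end. Every helper named here (fly_to, sample_thermal, find_defects, and so on) is hypothetical and merely stands in for the corresponding control-unit and drone operations described above.

```python
# Hedged end-to-end sketch of steps 1-4 of the control method.
def inspect_blade(drone1, drone2, control_unit):
    # Step 1: the first drone flies to the designated height and samples a thermal image.
    drone1.fly_to(control_unit.thermal_waypoint())
    thermal = drone1.sample_thermal()

    # Step 2: the control unit identifies defect positions on the thermal image.
    defects = control_unit.find_defects(thermal)

    marked = []
    for defect in defects:
        # Step 3: the second drone flies to the defect and samples a high-definition image.
        drone2.fly_to(control_unit.waypoint_for(defect))
        hd = drone2.sample_hd(defect)

        # Step 4: the defect position from the thermal image is marked on the HD image.
        marked.append(control_unit.mark_defect(hd, defect))
    return marked
```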
In step 1, the first unmanned aerial vehicle flies and shoots along a direction parallel to the length of the fan blade, and the distance between the thermal imaging camera and the fan blade is controlled to be 1 to 3 meters. Flying parallel to the blade length and keeping the shooting distance within 1 to 3 meters not only reduces the probability that the unmanned aerial vehicle strikes the fan during flight, but also yields clear and accurate images at the set shooting distance. If the shooting distance is set too far, the captured image is unclear, which lowers the accuracy of the subsequent detection; if the shooting distance is too close, the unmanned aerial vehicle is likely to strike the fan during flight, damaging both the unmanned aerial vehicle and the fan.
Preferably, in step 1, while the first unmanned aerial vehicle samples the thermal image of the detected fan blade, the control unit controls the second unmanned aerial vehicle to simultaneously acquire an overall image of the fan blade, the overall image containing the first unmanned aerial vehicle performing thermal imaging sampling on the fan blade.
Furthermore, the first unmanned aerial vehicle and the second unmanned aerial vehicle each further comprise an RTK positioning module, through which their position information can be obtained. When the first and second unmanned aerial vehicles perform thermal imaging sampling and high-definition image sampling, their position information is marked in the thermal image and the high-definition image. In step 4, marking the fan blade defect position identified from the thermal image in step 2 on the high-definition image of the fan blade further comprises uniformly adjusting the scale of, and registering, the thermal image and the high-definition image according to the position information marked in them.
In this embodiment, the first unmanned aerial vehicle further includes an RTK positioning module configured to obtain its position information. The RTK positioning module positions the first unmanned aerial vehicle in real time, so that the position of the blade shooting point can be further analyzed from the obtained position of the first unmanned aerial vehicle, yielding more accurate image position information; the thermal imaging information acquired by the first unmanned aerial vehicle is transmitted to the control unit through the communication unit.
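As an illustrative sketch of how a thermal sample could be packaged together with its RTK fix before transmission to the control unit, the data fields and JSON serialization below are assumptions and not the actual communication format of the system.

```python
# Hedged sketch: bundle a captured thermal frame with the first drone's RTK
# position and a timestamp into a message for the control unit.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class RtkFix:
    latitude: float
    longitude: float
    altitude_m: float

@dataclass
class ThermalSample:
    image_file: str      # path of the captured thermal frame
    position: RtkFix     # RTK fix at the moment of capture
    timestamp: float

def tag_sample(image_file: str, fix: RtkFix) -> str:
    """Return a JSON message the communication unit could forward onward."""
    sample = ThermalSample(image_file=image_file, position=fix, timestamp=time.time())
    return json.dumps(asdict(sample))
```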
It should be noted that the above-described device embodiments are merely illustrative: the units described as separate parts may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the apparatus embodiments provided by the present invention, the connection relationship between modules indicates a communication connection between them, which may be specifically implemented as one or more communication buses or signal lines. One of ordinary skill in the art can understand and implement this without inventive effort.
The above embodiments can be further combined or substituted; they are only intended to describe preferred embodiments of the present invention and do not limit its concept and scope. Various changes and modifications made to the technical solution of the present invention by those skilled in the art without departing from the design idea of the present invention fall within the protection scope of the present invention.