All-time cloud detection method and device for geostationary satellite
1. An all-time cloud detection method for a geostationary satellite, characterized by comprising the following steps:
acquiring a remote sensing image to be processed, and performing standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image;
judging whether the illumination of the preprocessed image is sufficient or not;
if so, performing normalization processing on the preprocessed image to obtain a normalized image;
performing initial cloud identification on the normalized image to obtain initial cloud identification data;
and clustering the initial cloud identification data to obtain a cloud detection result.
2. The method of claim 1, wherein the determining whether the pre-processed image is sufficiently illuminated comprises:
acquiring position information and time information of the preprocessed image;
determining a geographic position corresponding to an image geometric center pixel of the preprocessed image according to the position information;
calculating sunrise and sunset time corresponding to the geographic position;
and judging whether the illumination of the preprocessed image is sufficient or not according to the time information and the sunrise and sunset time.
3. The method of claim 2, wherein the calculating the sunrise and sunset time corresponding to the geographic location comprises:
calculating the day number of the shooting date of the preprocessed image according to a preset Greenwich Mean Time epoch;
calculating the number of centuries corresponding to the day number according to the pre-stored historical sunrise and sunset time and the day number;
calculating the mean ecliptic longitude of the sun, the mean anomaly of the sun and the obliquity of the ecliptic according to the number of centuries;
calculating the ecliptic longitude of the sun according to the mean ecliptic longitude of the sun and the mean anomaly of the sun;
calculating the declination of the sun according to the obliquity of the ecliptic and the ecliptic longitude of the sun;
calculating the Greenwich Mean Time solar hour angle according to the mean anomaly of the sun, the ecliptic longitude of the sun and the historical sunrise and sunset time;
calculating a correction value according to the longitude and latitude of the geographic position and the declination of the sun;
calculating a preliminary sunrise and sunset time according to the solar hour angle, the longitude and latitude, the historical sunrise and sunset time and the correction value;
and determining the sunrise and sunset time corresponding to the geographic position according to the historical sunrise and sunset time and the preliminary sunrise and sunset time.
4. The method of claim 1, wherein the normalizing the preprocessed image to obtain a normalized image comprises:
acquiring a satellite azimuth angle, a satellite zenith angle and a solar zenith angle according to the preprocessed image;
calculating a normalization operator according to the satellite azimuth angle, the satellite zenith angle and the solar zenith angle;
and carrying out visible light channel normalization processing on the preprocessed image according to the normalization operator to obtain a normalized image.
5. The method of claim 4, wherein the performing initial cloud identification on the normalized image to obtain initial cloud identification data comprises:
calculating a first identification threshold value according to the normalized image;
and performing initial cloud identification on the normalized image according to the first identification threshold value to obtain initial cloud identification data.
6. The method of claim 5, wherein the calculating a first identification threshold from the normalized image comprises:
acquiring an initial identification threshold value set;
determining a first pixel number corresponding to each initial identification threshold in the initial identification threshold set, wherein the first pixel number is the pixel number of the normalized image with the gray value smaller than the initial identification threshold;
calculating the foreground pixel proportion corresponding to each initial identification threshold according to the first pixel number;
calculating the average gray level of a first pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the first pixel is the average gray level of the pixel of which the gray value is smaller than the initial identification threshold;
calculating a second pixel average gray level corresponding to each initial identification threshold according to the normalized image, wherein the second pixel average gray level is the average gray level of the pixels whose gray values are not less than the initial identification threshold;
calculating the inter-class variance corresponding to each initial identification threshold according to the foreground pixel proportion, the first pixel average gray scale and the second pixel average gray scale;
and determining the initial identification threshold corresponding to the maximum inter-class variance as a first identification threshold.
7. The geostationary satellite all-time cloud detection method according to claim 1, characterized in that the method further comprises:
when the illumination of the preprocessed image is judged to be insufficient, performing enhancement processing on the preprocessed image to obtain an enhanced image;
calculating a second identification threshold of the enhanced image;
and performing initial cloud identification on the enhanced image according to the second identification threshold to obtain initial cloud identification data, and performing clustering processing on the initial cloud identification data to obtain a cloud detection result.
8. An all-time cloud detection device for a geostationary satellite, the all-time cloud detection device comprising:
the acquisition unit is used for acquiring a remote sensing image to be processed;
the preprocessing unit is used for carrying out standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image;
the judging unit is used for judging whether the illumination of the preprocessed image is sufficient or not;
the normalization unit is used for performing normalization processing on the preprocessed image to obtain a normalized image when the illumination of the preprocessed image is judged to be sufficient;
the identification unit is used for carrying out initial cloud identification on the normalized image to obtain initial cloud identification data;
and the clustering unit is used for clustering the initial cloud identification data to obtain a cloud detection result.
9. An electronic device, characterized in that the electronic device comprises a memory for storing a computer program and a processor for running the computer program to cause the electronic device to perform the geostationary satellite all-time cloud detection method of any one of claims 1 to 7.
10. A readable storage medium having stored thereon computer program instructions which, when read and executed by a processor, perform the geostationary satellite all-time cloud detection method of any one of claims 1 to 7.
Background
In recent years, with the launch of new-generation geostationary satellites such as FengYun-4 and Himawari-8, remote-sensing-based Earth observation and meteorological observation capabilities have greatly improved. Cloud detection is an important component of the World Climate Research Programme. Existing geostationary satellite cloud detection methods are generally image-feature-based: cloud features are selected and extracted, and a detector is designed to perform cloud detection on the image. In practice, however, such methods need a large number of samples to train the detector, which is time-consuming, and the number and quality of the samples directly affect the accuracy of the cloud detection result; they also suffer from a large amount of calculation, high complexity and the like. Therefore, the existing geostationary satellite cloud detection methods involve a large amount of calculation and high complexity.
Disclosure of Invention
The embodiments of the present application aim to provide an all-time cloud detection method and device for a geostationary satellite, which can realize cloud detection simply and quickly, with high accuracy, a small amount of calculation and low complexity.
A first aspect of the embodiments of the present application provides an all-time cloud detection method for a geostationary satellite, including:
acquiring a remote sensing image to be processed, and performing standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image;
judging whether the illumination of the preprocessed image is sufficient or not;
if so, performing normalization processing on the preprocessed image to obtain a normalized image;
performing initial cloud identification on the normalized image to obtain initial cloud identification data;
and clustering the initial cloud identification data to obtain a cloud detection result.
In the implementation process, a remote sensing image to be processed is first obtained and subjected to standardized preprocessing to obtain a preprocessed image; whether the illumination of the preprocessed image is sufficient is then judged; if so, normalization processing is performed on the preprocessed image to obtain a normalized image; initial cloud identification is then performed on the normalized image to obtain initial cloud identification data; and finally, the initial cloud identification data are clustered to obtain a cloud detection result. Cloud detection can thus be realized simply and quickly, with high accuracy, a small amount of calculation and low complexity.
Further, the determining whether the illumination of the preprocessed image is sufficient includes:
acquiring position information and time information of the preprocessed image;
determining a geographic position corresponding to an image geometric center pixel of the preprocessed image according to the position information;
calculating sunrise and sunset time corresponding to the geographic position;
and judging whether the illumination of the preprocessed image is sufficient or not according to the time information and the sunrise and sunset time.
In the implementation process, whether the illumination of the image is sufficient can be judged from the position information and time information of the preprocessed image: if, at the image acquisition time, the geographic position of the image's geometric-center pixel lies before sunrise or after sunset, the illumination is judged to be insufficient; if the image acquisition time falls between sunrise and sunset at the position of the geometric-center pixel, the illumination is judged to be sufficient.
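As a minimal illustration of this decision rule, the comparison can be sketched as follows; the sunrise and sunset values here are assumed inputs, obtained in practice from the ephemeris calculation described in the method:

```python
from datetime import time

def is_illumination_sufficient(acq_time: time, sunrise: time, sunset: time) -> bool:
    """Return True when the image acquisition time falls between sunrise
    and sunset at the geographic position of the image's geometric-center
    pixel, i.e. the illumination is judged to be sufficient."""
    return sunrise <= acq_time <= sunset

# Hypothetical example: an image acquired at 14:30 local time is daylight,
# while one acquired at 03:00 is before sunrise (insufficient illumination).
daytime = is_illumination_sufficient(time(14, 30), time(6, 12), time(18, 47))
night = is_illumination_sufficient(time(3, 0), time(6, 12), time(18, 47))
```

A boundary acquisition exactly at sunrise or sunset is counted as sufficient here; the text does not specify the tie-breaking convention.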
Further, the calculating the sunrise and sunset time corresponding to the geographic location includes:
calculating the day number of the shooting date of the preprocessed image according to a preset Greenwich Mean Time epoch;
calculating the number of centuries corresponding to the day number according to the pre-stored historical sunrise and sunset time and the day number;
calculating the mean ecliptic longitude of the sun, the mean anomaly of the sun and the obliquity of the ecliptic according to the number of centuries;
calculating the ecliptic longitude of the sun according to the mean ecliptic longitude of the sun and the mean anomaly of the sun;
calculating the declination of the sun according to the obliquity of the ecliptic and the ecliptic longitude of the sun;
calculating the Greenwich Mean Time solar hour angle according to the mean anomaly of the sun, the ecliptic longitude of the sun and the historical sunrise and sunset time;
calculating a correction value according to the longitude and latitude of the geographic position and the declination of the sun;
calculating a preliminary sunrise and sunset time according to the solar hour angle, the longitude and latitude, the historical sunrise and sunset time and the correction value;
and determining the sunrise and sunset time corresponding to the geographic position according to the historical sunrise and sunset time and the preliminary sunrise and sunset time.
In the implementation process, the sunrise and sunset time is calculated, so that the accuracy of judging whether the illumination is sufficient can be improved, and the cloud detection accuracy is improved.
Further, the normalizing the preprocessed image to obtain a normalized image includes:
acquiring a satellite azimuth angle, a satellite zenith angle and a solar zenith angle according to the preprocessed image;
calculating a normalization operator according to the satellite azimuth angle, the satellite zenith angle and the solar zenith angle;
and carrying out visible light channel normalization processing on the preprocessed image according to the normalization operator to obtain a normalized image.
In the implementation process, the normalization processing is performed on the preprocessed image, so that the available time of the visible light channel can be increased, and the fitting speed of the model can be increased.
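The exact normalization operator is not spelled out at this point in the text; as a hedged sketch, a common simplification is to divide the visible-channel albedo by the cosine of the solar zenith angle (the full operator described above would also involve the satellite azimuth and zenith angles):

```python
import numpy as np

def normalize_visible(albedo: np.ndarray, sun_zenith_deg: np.ndarray) -> np.ndarray:
    """Normalize visible-channel albedo for the solar illumination geometry.
    This keeps only the dominant cos(solar zenith) term; it is a sketch,
    not the patent's full operator."""
    mu0 = np.cos(np.deg2rad(sun_zenith_deg))
    # Clamp near the terminator to avoid division blow-up at low sun angles.
    mu0 = np.clip(mu0, 0.05, 1.0)
    return albedo / mu0

albedo = np.array([[0.30, 0.45], [0.10, 0.80]])
sza = np.array([[60.0, 0.0], [75.0, 30.0]])
normalized = normalize_visible(albedo, sza)
```

Dividing by the illumination cosine makes pixels observed under low sun comparable to those observed near local noon, which is what extends the usable time of the visible channel.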
Further, the performing initial cloud identification on the normalized image to obtain initial cloud identification data includes:
calculating a first identification threshold value according to the normalized image;
and performing initial cloud identification on the normalized image according to the first identification threshold value to obtain initial cloud identification data.
In the implementation process, the normalized image is subjected to initial cloud identification through a threshold method, the processing process is simple, the calculated amount is small, and the cloud detection efficiency is favorably improved.
Further, the calculating a first identification threshold from the normalized image includes:
acquiring an initial identification threshold value set;
determining a first pixel number corresponding to each initial identification threshold in the initial identification threshold set, wherein the first pixel number is the pixel number of the normalized image with the gray value smaller than the initial identification threshold;
calculating the foreground pixel proportion corresponding to each initial identification threshold according to the first pixel number;
calculating the average gray level of a first pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the first pixel is the average gray level of the pixel of which the gray value is smaller than the initial identification threshold;
calculating a second pixel average gray level corresponding to each initial identification threshold according to the normalized image, wherein the second pixel average gray level is the average gray level of the pixels whose gray values are not less than the initial identification threshold;
calculating the inter-class variance corresponding to each initial identification threshold according to the foreground pixel proportion, the first pixel average gray scale and the second pixel average gray scale;
and determining the initial identification threshold corresponding to the maximum inter-class variance as a first identification threshold.
In the implementation process, the threshold corresponding to the image can be calculated, the flexibility is high, the applicability is good, and the identification accuracy is favorably improved.
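The steps above are the classic Otsu (maximum between-class variance) threshold selection; a minimal sketch, assuming 8-bit gray values, is:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray, levels: int = 256) -> int:
    """Select the threshold maximizing the between-class variance,
    following the steps in the text: foreground proportion w0, first-class
    mean mu0 (gray < t), second-class mean mu1 (gray >= t), and
    variance w0 * (1 - w0) * (mu0 - mu1)**2."""
    hist, _ = np.histogram(gray.ravel(), bins=levels, range=(0, levels))
    total = gray.size
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        n0 = hist[:t].sum()          # first pixel number: gray value < t
        n1 = total - n0
        if n0 == 0 or n1 == 0:
            continue
        w0 = n0 / total              # foreground pixel proportion
        mu0 = (hist[:t] * np.arange(t)).sum() / n0
        mu1 = (hist[t:] * np.arange(t, levels)).sum() / n1
        var_between = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a clearly bimodal image (e.g. dark surface vs. bright cloud), the selected threshold falls between the two modes.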
Further, the method further comprises:
when the illumination of the preprocessed image is judged to be insufficient, performing enhancement processing on the preprocessed image to obtain an enhanced image;
calculating a second identification threshold of the enhanced image;
and performing initial cloud identification on the enhanced image according to the second identification threshold to obtain initial cloud identification data, and performing clustering processing on the initial cloud identification data to obtain a cloud detection result.
In the implementation process, when the illumination of the preprocessed image is insufficient, image enhancement processing is performed first, followed by threshold calculation and initial cloud identification, realizing all-time cloud detection whose effect is not affected by the image acquisition time.
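The text does not name a specific enhancement method; as one common, lightweight assumption, a gamma stretch can serve as the low-light enhancement step:

```python
import numpy as np

def enhance_low_light(img: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """Brighten an insufficiently illuminated image with a gamma stretch.
    Gamma correction is an assumption of this sketch, not the patent's
    stated method. `img` is expected in [0, 1]; gamma < 1 brightens."""
    return np.power(np.clip(img, 0.0, 1.0), gamma)
```

The enhanced image can then be fed to the same threshold calculation and clustering pipeline described above.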
A second aspect of the embodiments of the present application provides a geostationary satellite all-time cloud detection apparatus, including:
the acquisition unit is used for acquiring a remote sensing image to be processed;
the preprocessing unit is used for carrying out standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image;
the judging unit is used for judging whether the illumination of the preprocessed image is sufficient or not;
the normalization unit is used for performing normalization processing on the preprocessed image to obtain a normalized image when the illumination of the preprocessed image is judged to be sufficient;
the identification unit is used for carrying out initial cloud identification on the normalized image to obtain initial cloud identification data;
and the clustering unit is used for clustering the initial cloud identification data to obtain a cloud detection result.
In the implementation process, the acquisition unit acquires a remote sensing image to be processed, and the preprocessing unit performs standardized preprocessing on it to obtain a preprocessed image; the judging unit judges whether the illumination of the preprocessed image is sufficient; when the illumination is judged to be sufficient, the normalization unit performs normalization processing on the preprocessed image to obtain a normalized image; the identification unit then performs initial cloud identification on the normalized image to obtain initial cloud identification data; and finally, the clustering unit clusters the initial cloud identification data to obtain a cloud detection result. Cloud detection can thus be realized simply and quickly, with high accuracy, a small amount of calculation and low complexity.
A third aspect of the embodiments of the present application provides an electronic device, including a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to make the electronic device execute the geostationary satellite all-time cloud detection method according to any one of the first aspect of the embodiments of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores computer program instructions, where the computer program instructions, when read and executed by a processor, perform the geostationary satellite all-time cloud detection method according to any one of the first aspect of the embodiments of the present application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments of the present application are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can also obtain other related drawings based on these drawings without inventive effort.
Fig. 1 is a schematic flowchart of a geostationary satellite all-time cloud detection method according to a first embodiment of the present application;
fig. 2 is a schematic flowchart of a geostationary satellite all-time cloud detection method according to a second embodiment of the present application;
fig. 3 is a schematic structural diagram of a geostationary satellite all-time cloud detection apparatus according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of a geostationary satellite all-time cloud detection apparatus according to a fourth embodiment of the present application;
fig. 5 is a schematic flowchart of another geostationary satellite all-time cloud detection method according to the second embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Example 1
Referring to fig. 1, fig. 1 is a schematic flowchart of a geostationary satellite all-time cloud detection method according to an embodiment of the present application. The geostationary satellite all-time cloud detection method comprises the following steps:
s101, obtaining a remote sensing image to be processed, and performing standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image.
In the embodiment of the present application, the standardized preprocessing of the remote sensing image to be processed specifically includes radiometric correction, geometric correction and the like; meanwhile, two indexes of the remote sensing image to be processed, reflectance and brightness temperature, can also be extracted, and the embodiment of the present application is not limited thereto.
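As a hedged sketch of the radiometric-correction part of the preprocessing, a linear gain/offset calibration converts raw digital numbers to physical quantities; the coefficients shown are hypothetical, sensor-specific values:

```python
import numpy as np

def radiometric_calibration(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Linear radiometric correction: convert raw digital numbers (DN) to
    a physical quantity such as radiance or reflectance. The gain and
    offset are calibration coefficients supplied with the sensor data
    (the values used below are illustrative only)."""
    return gain * dn + offset

dn = np.array([0.0, 100.0, 255.0])
radiance = radiometric_calibration(dn, gain=0.01, offset=0.0)
```

Geometric correction and the brightness-temperature conversion follow analogous, sensor-documented procedures and are omitted from this sketch.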
S102, judging whether the illumination of the preprocessed image is sufficient, and if so, executing the step S104-step S106; if not, step S103 and step S106 are executed.
In the embodiment of the present application, when judging whether the illumination of the preprocessed image is sufficient, the judgment can be made from the position information and time information of the preprocessed image: if, at the image acquisition time, the geographic position of the image's geometric-center pixel lies before sunrise or after sunset, the illumination is judged to be insufficient; if the image acquisition time falls between sunrise and sunset at the position of the geometric-center pixel, the illumination is judged to be sufficient.
Specifically, the geographical position of the geometric center pixel of the image is determined according to the position information of the preprocessed image, the sunrise and sunset time corresponding to the preprocessed image is calculated according to the geographical position of the geometric center pixel of the image, and whether the illumination is sufficient or not is judged according to the time information of the image and the sunrise and sunset time.
As an optional implementation manner, when the illumination of the preprocessed image is determined to be insufficient, the method may further include the following steps:
s103, performing initial cloud identification processing based on the 11-micron wavelength channel on the preprocessed image to obtain initial cloud identification data, and executing the step S106.
And S104, carrying out normalization processing on the preprocessed image to obtain a normalized image.
In the embodiment of the application, if the image illumination condition is sufficient, a visible light channel normalization algorithm is adopted.
As an alternative embodiment, the visible light albedo may be normalized using a quasi-Bernoulli surface adjustment method. Owing to the influence of the light-source conditions, the albedo of the visible light channel differs greatly at different moments, so it needs to be normalized. On the one hand, the normalization operation can increase the available time of the visible light channel; on the other hand, it can also increase the fitting speed of the model.
And S105, performing initial cloud identification on the normalized image to obtain initial cloud identification data.
In the embodiment of the present application, by performing step S105, initial cloud identification processing based on the 0.64-micron wavelength channel can be performed on the normalized image to obtain initial cloud identification data.
In the embodiment of the present application, initial cloud identification is performed on the normalized image, so that each pixel in the normalized image is initially identified as cloud or non-cloud, giving the initial cloud identification data, which comprise a cloud pixel set (the "cloud" class C0) and a non-cloud pixel set (the "non-cloud" class C1), i.e. Cj = {C0, C1}.
As an alternative embodiment, an Otsu threshold-based method may be used to achieve initial cloud identification of the normalized image.
And S106, clustering the initial cloud identification data to obtain a cloud detection result.
In the embodiment of the application, the clustering method is adopted, so that the identification accuracy and the identification efficiency of cloud detection can be enhanced.
In the embodiment of the present application, the clustering of the initial cloud identification data may adopt a K-Means clustering algorithm or the like; the embodiment of the present application is not limited thereto.
As an optional implementation manner, clustering the initial cloud identification data to obtain a cloud detection result may include the following steps:
the first step is as follows: determining a cloud pixel set and a non-cloud pixel set according to the initial cloud identification data;
the second step is that: calculating a first cluster centroid of the cloud pixel set and a second cluster centroid of the non-cloud pixel set;
the third step: traversing each pixel point in the initial cloud identification data, and calculating a first distance from each pixel point to a first clustering centroid and a second distance from each pixel point to a second clustering centroid;
the fourth step: reclassifying each pixel point in the initial cloud identification data according to the first distance and the second distance to obtain a new divided pixel set;
the fifth step: repeating the second step to the fourth step until a preset termination cycle condition is reached;
and a sixth step: and determining the finally obtained divided pixel set as a cloud detection result, wherein the divided pixel set comprises a final cloud pixel set and a final non-cloud pixel set.
In the above embodiment, for each class, i.e. the "cloud" class C0 and the "non-cloud" class C1, the cluster centroid aj is calculated by the following formula:

aj = ( Σx∈Cj x ) / Nj, j = 0, 1;

where Nj is the number of pixel points in class Cj; when j = 0, Cj is C0 and aj is a0, a0 being the first cluster centroid; when j = 1, Cj is C1 and aj is a1, a1 being the second cluster centroid; aj = {a0, a1}.
Here x is the gray value of a pixel point; when j = 0, Σx∈C0 x is the sum of the gray values of all pixel points in C0; likewise, when j = 1, Σx∈C1 x is the sum of the gray values of all pixel points in C1.
In the above embodiment, each pixel point xi in the initial cloud identification data is traversed, and the first distance d0 from each pixel point xi to the first cluster centroid a0 and the second distance d1 from each pixel point xi to the second cluster centroid a1 are calculated by the following formula:

dj = | xi − aj |;

where i = 1, 2, …, m×n, with m×n being the number of pixel points in the image, and j = 0, 1.
in the above embodiment, when each pixel point in the initial cloud identification data is re-classified according to the first distance and the second distance, the pixel point is classified into the class corresponding to the clustering center with the smallest distance, so that a new class classification C is obtainedj= {C0,C1}. For example, assume a pixel A with a first distance dA0And a second distance dA1If d is calculatedA0≤dA1Then, the pixel point A is divided into a cloud class C0If d is calculatedA0>dA1Then, the pixel point A is divided into a non-cloud class C1In (1).
In the foregoing embodiment, the preset termination loop condition includes one or more of reaching a preset iteration number, no change in the clustering center, a change amplitude of the clustering center not exceeding a preset threshold, and the like, and the embodiment of the present application is not limited thereto.
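The six clustering steps above amount to a two-class K-means on pixel gray values; a minimal sketch, using "no change in the partition" as the termination condition, is:

```python
import numpy as np

def kmeans_two_class(gray: np.ndarray, cloud_mask: np.ndarray,
                     max_iter: int = 100) -> np.ndarray:
    """Refine an initial cloud/non-cloud partition with two-class K-means,
    following the six steps in the text: recompute the centroids a0/a1 of
    the cloud and non-cloud sets, reassign each pixel to the nearer
    centroid (ties go to cloud, matching dA0 <= dA1), and repeat until
    the partition stops changing."""
    x = gray.ravel().astype(float)
    mask = cloud_mask.ravel().astype(bool)
    for _ in range(max_iter):
        if not mask.any() or mask.all():
            break                      # degenerate: one class is empty
        a0 = x[mask].mean()            # "cloud" class centroid
        a1 = x[~mask].mean()           # "non-cloud" class centroid
        new_mask = np.abs(x - a0) <= np.abs(x - a1)
        if np.array_equal(new_mask, mask):
            break                      # termination: partition unchanged
        mask = new_mask
    return mask.reshape(gray.shape)
```

Starting from an initial partition that mislabels one dark pixel as cloud, the iteration moves it to the non-cloud class once the centroids separate.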
In the embodiment of the present application, the execution subject of the method may be a computing device such as a computer and a server, and is not limited in this embodiment.
In this embodiment, an execution subject of the method may also be an intelligent device such as a smart phone and a tablet computer, which is not limited in this embodiment.
It can be seen that implementing the geostationary satellite all-time cloud detection method described in this embodiment can realize cloud detection simply and quickly, with high accuracy, a small amount of calculation and low complexity.
Example 2
Referring to fig. 2, fig. 2 is a schematic flowchart of a geostationary satellite all-time cloud detection method according to an embodiment of the present application. As shown in fig. 2, the geostationary satellite all-time cloud detection method includes:
s201, obtaining a remote sensing image to be processed, and performing standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image.
S202, acquiring position information and time information of the preprocessed image.
S203, determining the geographic position corresponding to the geometric center pixel of the image of the preprocessed image according to the position information.
And S204, calculating sunrise and sunset time corresponding to the geographic position.
As an alternative embodiment, calculating the sunrise and sunset time corresponding to the geographic location may include the following steps:
the first step is as follows: calculating the number of days of shooting date of the preprocessed image according to the preset Greenwich mean time;
the second step is that: calculating the century number corresponding to the shooting date and day according to the pre-stored historical sunrise and sunset time and the shooting date and day;
the third step: calculating the mean yellow diameter of the sun, the mean paraxial point angle of the sun and the inclination angle of the earth according to the century;
the fourth step: calculating the ecliptic longitude of the sun according to the mean yellow diameter of the sun and the mean paraxial point angle of the sun;
the fifth step: calculating the deviation of the sun according to the inclination angle of the earth and the ecliptic longitude of the sun;
and a sixth step: calculating the solar time angle of the Greenwich mean time according to the mean-near point angle of the sun, the yellow-road longitude of the sun and the historical sunrise and sunset time;
the seventh step: calculating a correction value according to the longitude and latitude position of the geographic position and the deviation of the sun;
eighth step: calculating the preliminary sunrise and sunset time according to the solar time angle, the longitude and latitude positions, the historical sunrise and sunset time and the corrected value;
the ninth step: and determining the sunrise and sunset time corresponding to the geographic position according to the historical sunrise and sunset time and the preliminary sunrise and sunset time.
In the above embodiment, the preset Greenwich mean time epoch is January 1, 2000.
In the above embodiment, the day number of the shooting date of the preprocessed image is the number of days from January 1, 2000 (Greenwich mean time) to the calculation day, i.e., the date on which the preprocessed image was captured.
In the above embodiment, the century number corresponding to the day number, i.e., the number of centuries t from January 1, 2000 (Greenwich mean time) to the calculation date, is calculated by the formula: t = (days + UTo/360)/36525, where days is the day number of the shooting date of the preprocessed image, UTo is the pre-stored historical sunrise and sunset time (expressed in degrees), and UTo = 180° on the first calculation.
In the above embodiment, the formula for calculating the mean ecliptic longitude L of the sun is: L = 280.460 + 36000.770 × t.
In the above embodiment, the formula for calculating the mean anomaly G of the sun is: G = 357.528 + 35999.050 × t.
In the above embodiment, the formula for calculating the ecliptic longitude λ of the sun is: λ = L + 1.915 × sin G + 0.020 × sin(2G).
In the above embodiment, the formula for calculating the obliquity ε of the earth is: ε = 23.4393 − 0.0130 × t.
In the above embodiment, the formula for calculating the declination δ of the sun is: δ = arcsin(sin ε × sin λ).
In the above embodiment, the formula for calculating the Greenwich hour angle GHA of the sun is: GHA = UTo − 180 − 1.915 × sin G − 0.020 × sin(2G) + 2.466 × sin(2λ) − 0.053 × sin(4λ).
In the above embodiment, the formula for calculating the correction value e is: e = arccos{[sin(h) − sin(Glat) × sin(δ)]/[cos(Glat) × cos(δ)]}, where h is the altitude of the sun at sunrise and sunset, a preset value, specifically h = −0.833°, and Glat is the latitude of the geographic location.
In the above embodiment, the longitude and latitude of the geographic position include the longitude Long of the geographic position and the latitude Glat of the geographic position.
In the above embodiment, the calculation formula for the preliminary sunrise and sunset time is: UT′ = UTo − (GHA + Long ± e), where "+" is taken when calculating the sunrise time and "−" when calculating the sunset time.
As a further optional implementation, determining the sunrise and sunset time corresponding to the geographic location according to the historical sunrise and sunset time and the preliminary sunrise and sunset time may further include:
determining the Greenwich mean sunrise and sunset time according to the historical sunrise and sunset time and the preliminary sunrise and sunset time;
and converting the Greenwich mean sunrise and sunset time into time expressed in hours to obtain the sunrise and sunset time corresponding to the geographic position.
In the above embodiment, when determining the sunrise and sunset time corresponding to the geographic location according to the historical sunrise and sunset time and the preliminary sunrise and sunset time, the absolute value of the difference between UTo and UT′ is first compared with a preset threshold; if it is smaller than the preset threshold, UT′ is taken as the calculated Greenwich mean sunrise and sunset time UT. Specifically, the preset threshold may be set to 0.1°, i.e., about 0.007 hours.
In the above embodiment, if the absolute value of the difference between UTo and UT′ is not less than the preset threshold, UT′ is used as the new historical sunrise and sunset time value and the above second to ninth steps are repeated until the absolute value of the difference is less than the preset threshold; the finally calculated UT′ is taken as the Greenwich mean sunrise and sunset time UT.
In the above embodiment, the calculated Greenwich mean sunrise and sunset time is expressed in degrees (180° = 12 hours); it is therefore converted into time expressed in hours, and the time zone number Zone of the geographic location is added, giving the sunrise and sunset time corresponding to the geographic position. The calculation formula is: T = UT/15 + Zone, where T is the sunrise and sunset time corresponding to the geographic position and UT is the Greenwich mean sunrise and sunset time.
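The iterative sunrise and sunset calculation described above (steps 1 to 9, with the formulas just given) can be sketched in Python as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the function name is hypothetical, all angles are handled in degrees as in the text, and h = −0.833° is the assumed solar altitude at sunrise and sunset.

```python
import math

def sunrise_sunset(days, longitude, latitude, zone, sunrise=True,
                   h=-0.833, threshold=0.1, ut0=180.0):
    """Iteratively compute local sunrise (sunrise=True) or sunset time
    in hours, following the formulas in the text.

    days      -- day number since January 1, 2000 (Greenwich mean time)
    longitude -- geographic longitude Long in degrees
    latitude  -- geographic latitude Glat in degrees
    zone      -- time zone number of the location, in hours
    h         -- assumed solar altitude at sunrise/sunset (degrees)
    """
    sin = lambda d: math.sin(math.radians(d))
    cos = lambda d: math.cos(math.radians(d))
    ut = ut0                                   # UTo = 180 deg on first pass
    while True:
        t = (days + ut / 360.0) / 36525.0      # century number
        L = 280.460 + 36000.770 * t            # mean ecliptic longitude
        G = 357.528 + 35999.050 * t            # mean anomaly
        lam = L + 1.915 * sin(G) + 0.020 * sin(2 * G)  # ecliptic longitude
        eps = 23.4393 - 0.0130 * t             # obliquity of the earth
        delta = math.degrees(math.asin(sin(eps) * sin(lam)))  # declination
        gha = (ut - 180.0 - 1.915 * sin(G) - 0.020 * sin(2 * G)
               + 2.466 * sin(2 * lam) - 0.053 * sin(4 * lam))
        e = math.degrees(math.acos(
            (sin(h) - sin(latitude) * sin(delta))
            / (cos(latitude) * cos(delta))))   # correction value
        # UT' = UTo - (GHA + Long +/- e): "+" for sunrise, "-" for sunset
        ut_new = ut - (gha + longitude + (e if sunrise else -e))
        if abs(ut_new - ut) < threshold:       # 0.1 deg ~ 0.007 hours
            ut = ut_new
            break
        ut = ut_new
    return ut / 15.0 + zone                    # degrees -> hours, add zone
```

On convergence (|UT′ − UTo| below the 0.1° threshold), the result is converted from degrees to hours via T = UT/15 + Zone as described above.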
After step S204, the method further includes the following steps:
s205, judging whether the illumination of the preprocessed image is sufficient or not according to the time information and the sunrise and sunset time, and if so, executing the steps S206-S210 and S214; if not, executing steps S211 to S214.
In the embodiment of the present application, the above steps S202 to S205 are implemented to determine whether the illumination of the preprocessed image is sufficient.
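The daylight judgment itself reduces to comparing the capture time with the computed sunrise and sunset times. A minimal sketch follows, under the assumption that "sufficient illumination" means the capture time lies strictly between local sunrise and sunset; the text specifies only the comparison inputs, so this criterion and the function name are assumptions.

```python
def illumination_sufficient(capture_hour, sunrise_hour, sunset_hour):
    """Return True if the image capture time (local hours) falls between
    local sunrise and sunset, i.e., the image is daytime imagery."""
    return sunrise_hour < capture_hour < sunset_hour
```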
S206, acquiring a satellite azimuth angle, a satellite zenith angle and a solar zenith angle according to the preprocessed image.
In the embodiment of the application, when the remote sensing image is acquired, the satellite azimuth angle, the satellite zenith angle and the solar zenith angle corresponding to the remote sensing image can be acquired together.
S207, calculating a normalization operator according to the satellite azimuth angle, the satellite zenith angle and the solar zenith angle.
In the embodiment of the present application, the formula for calculating the normalization operator is:
F = cos θ1 − 0.7 cos Ω + 1.3;
cos Ω = cos θ1 · cos θ2 − sin θ1 · sin θ2 · cos Δφ;
wherein F represents the normalization operator, θ1 represents the solar zenith angle, θ2 represents the satellite zenith angle, and Δφ represents the relative azimuth angle between the satellite and the sun (i.e., the satellite azimuth angle).
S208, carrying out visible light channel normalization processing on the preprocessed image according to the normalization operator to obtain a normalized image.
In the embodiment of the application, the visible light channel is normalized using the normalization operator:
A0 = As/F;
wherein As represents the albedo before normalization and A0 represents the albedo after normalization.
In the embodiment of the present application, the preprocessed image can be normalized by performing the above steps S206 to S208 to obtain a normalized image.
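Under the formulas of steps S206 to S208, the visible-channel normalization can be sketched as follows (a hypothetical helper; angles are assumed to be supplied in degrees, and the albedo may be a scalar or a NumPy array):

```python
import numpy as np

def normalize_visible(albedo, sun_zenith, sat_zenith, rel_azimuth):
    """Apply the visible-channel normalization A0 = As / F,
    where F = cos(theta1) - 0.7*cos(Omega) + 1.3."""
    t1 = np.radians(sun_zenith)      # solar zenith angle theta1
    t2 = np.radians(sat_zenith)      # satellite zenith angle theta2
    dphi = np.radians(rel_azimuth)   # relative azimuth delta-phi
    cos_omega = np.cos(t1) * np.cos(t2) - np.sin(t1) * np.sin(t2) * np.cos(dphi)
    f = np.cos(t1) - 0.7 * cos_omega + 1.3   # normalization operator F
    return albedo / f
```

For a nadir-viewing, overhead-sun geometry (all angles zero), cos Ω = 1 and F = 1 − 0.7 + 1.3 = 1.6, so an albedo of 0.8 normalizes to 0.5.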
S209, calculating a first identification threshold value according to the normalized image.
As an alternative embodiment, calculating the first recognition threshold value according to the normalized image may include the following steps:
acquiring an initial identification threshold value set;
determining a first pixel number corresponding to each initial identification threshold value in the initial identification threshold value set, wherein the first pixel number is the pixel number of which the gray value in the normalized image is smaller than the initial identification threshold value;
calculating the foreground pixel proportion corresponding to each initial identification threshold according to the first pixel number;
calculating the average gray level of a first pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the first pixel is the average gray level of the pixel of which the gray value is smaller than the initial identification threshold;
calculating the average gray level of a second pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the second pixel is the average gray level of the pixel of which the gray value is not less than the initial identification threshold;
calculating the inter-class variance corresponding to each initial identification threshold according to the foreground pixel proportion, the first pixel average gray level and the second pixel average gray level;
and determining the initial identification threshold corresponding to the maximum inter-class variance as a first identification threshold.
In the above-described embodiment, the initial identification threshold set S includes a plurality of initial identification thresholds; specifically, S = {0, 1, 2, …, 255}.
In the above embodiment, when determining the first pixel number and the second pixel number corresponding to each initial recognition threshold, for one of the initial recognition thresholds T in S, the number of pixels in the normalized image whose gray value is smaller than T is recorded as the first pixel number N0; the number of pixels whose gray value is not less than T is then the second pixel number N1 = M × N − N0.
In the above embodiment, when calculating the first pixel average gray scale and the second pixel average gray scale corresponding to each initial recognition threshold, for one of the initial recognition thresholds T in S, the first pixel average gray scale is U0 = (1/N0) × Σ ui over the pixels whose gray value is smaller than T, and the second pixel average gray scale is U1 = (1/N1) × Σ ui over the pixels whose gray value is not less than T, where ui represents the gray value of the i-th pixel in the normalized image.
In the above embodiment, the calculation formula for the inter-class variance corresponding to each initial recognition threshold is: gj = W0 × (1 − W0) × (U0 − U1)², where j = T and T ∈ S; finally, g = {g0, g1, …, g255} is obtained.
In the above embodiment, the maximum inter-class variance is first determined from g = {g0, g1, …, g255}, and the initial identification threshold corresponding to the maximum inter-class variance is then determined as the first identification threshold.
As another alternative, calculating the first recognition threshold value according to the normalized image may include the following steps:
(1) initialize the threshold T to 0;
(2) traverse the pixels of the gray image (i.e., each pixel in the normalized image); if the gray value of a pixel is less than T, increase N0 by 1, obtaining the number N0 of pixels whose gray value is less than T; the number of pixels whose gray value is greater than or equal to T is then N1 = M × N − N0;
(3) calculate the proportion of foreground pixels in the whole image, W0 = N0/(M × N);
(4) calculate the average gray level of the pixels whose gray value is less than T, U0 = (1/N0) × Σ ui, where ui represents the gray value of the i-th pixel;
(5) calculate the average gray level of the pixels whose gray value is greater than or equal to T, U1 = (1/N1) × Σ ui;
(6) calculate the between-class variance gj = W0 × (1 − W0) × (U0 − U1)², where j = T;
(7) increase T by 1;
(8) repeat steps (2) to (7) until T > 255, yielding g = {g0, g1, …, g255}; the T value corresponding to the maximum gj is the first identification threshold.
In the above embodiment, for the normalized image, the segmentation threshold between foreground and background is denoted T; the proportion of foreground pixels in the whole image is W0, with average gray level U0; the proportion of background pixels is W1, with average gray level U1; the total average gray level of the image is denoted U, and the inter-class variance is denoted g.
In the above embodiment, the normalized image has a size of M × N, and the number of pixels whose gray value is smaller than the initial recognition threshold T is denoted N0. Following the above steps, the threshold T that maximizes the inter-class variance g is obtained by traversal, giving the first identification threshold.
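The threshold search described above is the classical maximum between-class variance (Otsu) traversal; it can be sketched as follows (an illustrative re-implementation, with variable names following the text):

```python
import numpy as np

def otsu_threshold(image):
    """Return the gray level T in [0, 255] that maximizes the
    between-class variance g = W0 * (1 - W0) * (U0 - U1)^2."""
    pixels = image.ravel().astype(np.float64)
    total = pixels.size                      # M x N
    best_t, best_g = 0, -1.0
    for t in range(256):
        fg = pixels[pixels < t]              # foreground: gray value < T
        n0 = fg.size                         # first pixel number N0
        if n0 == 0 or n0 == total:           # need both classes non-empty
            continue
        w0 = n0 / total                      # foreground proportion W0
        u0 = fg.mean()                       # first pixel average gray U0
        u1 = pixels[pixels >= t].mean()      # second pixel average gray U1
        g = w0 * (1.0 - w0) * (u0 - u1) ** 2
        if g > best_g:
            best_t, best_g = t, g
    return best_t
```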
After step S209, the following steps are also included:
s210, performing initial cloud identification on the normalized image according to the first identification threshold value to obtain initial cloud identification data, and executing the step S214.
In the embodiment of the application, the initial cloud recognition can be performed on the normalized image by implementing the steps S209 to S210 to obtain initial cloud recognition data, so that the initial cloud recognition processing based on the 0.64-micron wavelength channel is realized.
And S211, performing enhancement processing on the preprocessed image to obtain an enhanced image.
In the embodiment of the present application, the preprocessed image may be enhanced by using an infrared enhancement algorithm, which is not limited in this embodiment of the present application.
S212, calculating a second identification threshold of the enhanced image.
In the embodiment of the present application, the method for calculating the second recognition threshold of the enhanced image is the same as the method for calculating the first recognition threshold, and when calculating the first recognition threshold, the normalized image is used for performing the threshold calculation, and similarly, when calculating the second recognition threshold, the enhanced image is used for performing the threshold calculation.
S213, performing initial cloud identification on the enhanced image according to the second identification threshold to obtain initial cloud identification data, and executing the step S214.
In the embodiment of the present application, by implementing the steps S211 to S213, initial cloud identification processing based on an 11-micron wavelength channel can be performed on the preprocessed image, so as to obtain initial cloud identification data.
S214, clustering the initial cloud identification data to obtain a cloud detection result.
In the embodiment of the present application, by implementing the geostationary satellite all-time cloud detection method provided by the embodiment of the present application, a feature combination of a plurality of spectral channels can be constructed, giving all-time cloud detection capability: it is first judged whether the illumination of the remote sensing image is sufficient; if sufficient, the remote sensing image was captured in the daytime, and steps S206 to S210 are executed to realize the initial cloud identification processing on the 0.64 μm wavelength channel; if insufficient, the remote sensing image was captured at night, and steps S211 to S213 are executed to realize the initial cloud identification processing on the 11 μm wavelength channel.
In the embodiment of the application, taking the Himawari-8 satellite as an example, the visible and infrared scanning radiometer AHI carried by Himawari-8 has 16 channels covering the visible to long-wave infrared bands; it can be used to observe the characteristics of different target objects, and multi-channel threshold combination tests can effectively reflect the physical characteristics of clouds.
① The 0.64 μm band reflectance is effective for distinguishing cloud from land: cloud has higher reflectance and lower absorption in this band.
② Over land, the 11 μm brightness temperature (BT11) can detect cold clouds over inland water and certain surfaces, and this test can serve as a clear-sky restoral test over land: when BT11 is greater than a predefined threshold, the pixel is identified as clear sky.
③ The brightness temperature difference between the BT11 and BT3.9 channels can be used to detect water and ice clouds; because of reflected sunlight at 3.9 μm, the BT11 − BT3.9 difference is large and negative. This test is very useful for detecting low-level water clouds in most cases.
④ The 0.86 μm to 0.64 μm reflectance ratio (R0.86/R0.64) is sensitive to thick clouds over water but cannot be used to detect clouds over land; however, when used together with DEM data, the R0.86/R0.64 test is effective for detecting inland clear-sky regions.
⑤ The 0.86 μm to 1.6 μm reflectance ratio (R0.86/R1.6) is effective for detecting clouds over bright desert surfaces; the test is used to detect clear sky in highland or mountainous areas.
⑥ The Normalized Difference Vegetation Index (NDVI) is a simple index of vegetation distribution, based on the high reflectance of leaves in the near-infrared band and their low reflectance in the visible band; for cloud detection, NDVI is used to constrain the thresholds in arid and semi-arid regions.
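The channel tests above can be expressed as simple per-pixel comparisons. The sketch below is illustrative only: the function name and every numeric threshold are hypothetical placeholders (the text does not give threshold values), and inputs may be scalars or NumPy arrays.

```python
def spectral_cloud_tests(r064, r086, r16, bt11, bt39,
                         bt11_clear=285.0, btd_thresh=-2.0,
                         ratio_thresh=0.9, ndvi_thresh=0.3):
    """Per-pixel versions of tests (2)-(6) above; all numeric
    thresholds are assumed placeholder values, not from the text.

    r064/r086/r16 -- reflectances at 0.64, 0.86 and 1.6 um
    bt11/bt39     -- brightness temperatures (K) at 11 and 3.9 um
    """
    ndvi = (r086 - r064) / (r086 + r064)                     # test (6)
    return {
        "clear_sky_reset": bt11 > bt11_clear,                # test (2)
        "low_water_cloud": (bt11 - bt39) < btd_thresh,       # test (3)
        "thick_cloud_over_water": (r086 / r064) > ratio_thresh,  # test (4)
        "bright_surface_clear": (r086 / r16) > ratio_thresh,     # test (5)
        "vegetated_clear": ndvi > ndvi_thresh,               # test (6)
    }
```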
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating another geostationary satellite all-time cloud detection method according to an embodiment of the present disclosure. As shown in fig. 5, after the remote sensing image to be processed is obtained, standardized preprocessing is first performed on it to extract reflectance and brightness temperature, and it is then judged whether the illumination of the image is sufficient. If the illumination is sufficient, the image is normalized using the visible-band normalization algorithm, and initial cloud identification based on the 0.64 μm wavelength channel is performed on the normalized image to obtain initial cloud identification data. If the illumination is judged insufficient, the image is processed with the general infrared-band enhancement algorithm, and initial cloud identification based on the 11 μm wavelength channel is performed on the processed image to obtain initial cloud identification data. Finally, Kmeans-based multichannel clustering is performed on the initial cloud identification data to identify cloud pixels and non-cloud pixels in the image and obtain the cloud detection result. All-day cloud detection is thus realized, and the detection effect is not affected by the image acquisition time.
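The final Kmeans-based multichannel clustering can be sketched with a minimal two-class k-means over per-pixel channel features. This is an illustration, not the patented implementation: the deterministic darkest/brightest initialization and the rule that the cluster with the higher mean feature value is taken as "cloud" are assumptions.

```python
import numpy as np

def kmeans_cloud_cluster(features, n_iter=20):
    """Two-class k-means on an (n_pixels, n_channels) feature array
    stacking the initial cloud-identification channels; returns a
    boolean cloud mask."""
    features = np.asarray(features, dtype=float)
    # deterministic initialization: darkest and brightest pixels
    order = features.mean(axis=1).argsort()
    centroids = features[[order[0], order[-1]]]
    labels = np.zeros(len(features), dtype=int)
    for _ in range(n_iter):
        # assign each pixel to its nearest centroid
        d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in (0, 1):                     # recompute centroids
            if np.any(labels == k):
                centroids[k] = features[labels == k].mean(axis=0)
    # assumption: the cluster with the higher mean feature value is cloud
    cloud_cluster = centroids.mean(axis=1).argmax()
    return labels == cloud_cluster
```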
It can be seen that implementing the geostationary satellite all-time cloud detection method described in this embodiment realizes cloud detection simply and quickly, with high accuracy, low computational cost and low complexity.
Example 3
Referring to fig. 3, fig. 3 is a schematic structural diagram of a geostationary satellite all-time cloud detection apparatus according to an embodiment of the present disclosure. As shown in fig. 3, the geostationary satellite all-time cloud detection apparatus includes:
an obtaining unit 310, configured to obtain a remote sensing image to be processed;
the preprocessing unit 320 is configured to perform standardized preprocessing on the remote sensing image to be processed to obtain a preprocessed image;
a determining unit 330, configured to determine whether the illumination of the preprocessed image is sufficient;
the normalization unit 340 is configured to perform normalization processing on the preprocessed image to obtain a normalized image when it is determined that the illumination of the preprocessed image is sufficient;
an identifying unit 350, configured to perform initial cloud identification on the normalized image to obtain initial cloud identification data;
and the clustering unit 360 is configured to perform clustering processing on the initial cloud identification data to obtain a cloud detection result.
In the embodiment of the present application, for explanation of the geostationary satellite all-time cloud detection apparatus, reference may be made to the description in embodiment 1 or embodiment 2, and details are not repeated in this embodiment.
It can be seen that implementing the geostationary satellite all-time cloud detection apparatus described in this embodiment realizes cloud detection simply and quickly, with high accuracy, low computational cost and low complexity.
Example 4
Referring to fig. 4, fig. 4 is a schematic structural diagram of a geostationary satellite all-time cloud detection apparatus according to an embodiment of the present disclosure. The geostationary satellite all-time cloud detection apparatus shown in fig. 4 is obtained by optimizing the geostationary satellite all-time cloud detection apparatus shown in fig. 3. As shown in fig. 4, the judging unit 330 includes:
a first obtaining subunit 331, configured to obtain position information and time information of the preprocessed image;
a first determining subunit 332, configured to determine, according to the location information, a geographic location corresponding to a geometric center pixel of the image of the preprocessed image;
a first calculating subunit 333, configured to calculate the sunrise and sunset time corresponding to the geographic location;
the determining subunit 334 is configured to determine whether the illumination of the preprocessed image is sufficient according to the time information and the sunrise and sunset time.
As an alternative embodiment, the first calculating subunit 333 includes:
the first calculation module is used for calculating the day number of the shooting date of the preprocessed image according to the preset Greenwich mean time; calculating the century number corresponding to the day number according to the pre-stored historical sunrise and sunset time and the day number; calculating the mean ecliptic longitude of the sun, the mean anomaly of the sun and the obliquity of the earth according to the century number; calculating the ecliptic longitude of the sun according to the mean ecliptic longitude of the sun and the mean anomaly of the sun; calculating the declination of the sun according to the obliquity of the earth and the ecliptic longitude of the sun; calculating the Greenwich hour angle of the sun according to the mean anomaly of the sun, the ecliptic longitude of the sun and the historical sunrise and sunset time; and calculating a correction value according to the longitude and latitude of the geographic position and the declination of the sun;
the second calculation module is used for calculating the preliminary sunrise and sunset time according to the solar time angle, the longitude and latitude positions, the historical sunrise and sunset time and the correction value;
and the third calculation module is used for determining the sunrise and sunset time corresponding to the geographic position according to the historical sunrise and sunset time and the preliminary sunrise and sunset time.
As an optional implementation, the normalization unit 340 includes:
a second obtaining subunit 341, configured to obtain a satellite azimuth angle, a satellite zenith angle, and a solar zenith angle according to the preprocessed image;
the second calculating subunit 342 is configured to calculate a normalization operator according to the satellite azimuth angle, the satellite zenith angle, and the solar zenith angle;
and a normalizing subunit 343, configured to perform visible light channel normalization on the preprocessed image according to the normalization operator, so as to obtain a normalized image.
As an optional implementation, the identifying unit 350 includes:
a third calculating subunit 351, configured to calculate a first recognition threshold value according to the normalized image;
and the cloud identification subunit 352 is configured to perform initial cloud identification on the normalized image according to the first identification threshold value to obtain initial cloud identification data.
As an alternative embodiment, the third calculation subunit 351 includes:
an obtaining module, configured to obtain an initial set of identification threshold values;
the first determining module is used for determining the number of first pixels corresponding to each initial identification threshold in the initial identification threshold set, wherein the number of the first pixels is the number of pixels of which the gray value in the normalized image is smaller than the initial identification threshold;
the fourth calculation module is used for calculating the foreground pixel proportion corresponding to each initial identification threshold according to the first pixel number; calculating the average gray level of a first pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the first pixel is the average gray level of the pixel of which the gray value is smaller than the initial identification threshold; calculating the average gray level of a second pixel corresponding to each initial identification threshold according to the normalized image, wherein the average gray level of the second pixel is the average gray level of the pixel of which the gray value is not less than the initial identification threshold; calculating the inter-class variance corresponding to each initial identification threshold according to the foreground pixel proportion, the first pixel average gray level and the second pixel average gray level;
and the second determining module is used for determining the initial identification threshold corresponding to the maximum inter-class variance as the first identification threshold.
As an optional implementation manner, the geostationary satellite all-time cloud detection apparatus further includes:
an enhancement unit 370, configured to perform enhancement processing on the preprocessed image to obtain an enhanced image when it is determined that the illumination of the preprocessed image is insufficient;
a calculating unit 380 for calculating a second recognition threshold of the enhanced image;
and the initial identification unit 390 is configured to perform initial cloud identification on the enhanced image according to the second identification threshold to obtain initial cloud identification data, and trigger the clustering unit 360 to perform clustering processing on the initial cloud identification data to obtain a cloud detection result.
In the embodiment of the present application, for explanation of the geostationary satellite all-time cloud detection apparatus, reference may be made to the description in embodiment 1 or embodiment 2, and details are not repeated in this embodiment.
It can be seen that implementing the geostationary satellite all-time cloud detection apparatus described in this embodiment realizes cloud detection simply and quickly, with high accuracy, low computational cost and low complexity.
The embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to enable the electronic device to execute the method for detecting the full-time cloud of the geostationary satellite in embodiment 1 or embodiment 2 of the present application.
An embodiment of the present application provides a computer-readable storage medium, which stores computer program instructions, and when the computer program instructions are read and executed by a processor, the method for detecting a full-time cloud of a geostationary satellite according to any one of embodiment 1 and embodiment 2 of the present application is executed.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.