Flame detection method and device, computer equipment and storage medium

Document No.: 8481 · Published: 2021-09-17

1. A method of flame detection, the method comprising:

acquiring a first detection image;

detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises flame confidence;

calculating a flame color score for the flame frame;

calculating a flame motion score for the flame frame;

calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;

comparing the flame metric index with a flame threshold, determining that a flame is present in the first detection image when the flame metric index is greater than or equal to the flame threshold, and/or determining that no flame is present in the first detection image when the flame metric index is less than the flame threshold.

2. The flame detection method of claim 1, wherein the flame metric is calculated by:

Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)

wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.

3. The flame detection method according to claim 1, wherein the step of calculating the flame color score of the flame frame specifically comprises:

carrying out YUV color space transformation on the flame frame;

counting the total number of pixels in the flame frame that satisfy the flame pixel constraint rules, and recording the total as N_r;

calculating the flame color score, wherein the calculation formula of the flame color score is:

YUV_Score = N_r / N_All

wherein N_All is the total number of pixels in the flame frame;

the flame pixel constraint rules include:

Rule r1: Y(x, y) > U(x, y)

Rule r2: V(x, y) > U(x, y)

Rule r3:

Rule r4: |V(x, y) - U(x, y)| > τ, τ = 40

Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel at coordinates (x, y) in the YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the y direction of the flame frame.

4. The flame detection method according to claim 1, wherein the step of calculating the flame motion score of the flame frame specifically comprises:

acquiring a frame image adjacent to the first detection image and recording it as a second detection image;

respectively extracting flame key points in the first detection image and the second detection image;

matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;

obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;

and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.

5. The flame detection method of claim 1, further comprising: determining a fire level according to the area ratio of the flame frame in the first detection image when it is determined that a flame is present in the first detection image.

6. The flame detection method of claim 5, wherein the information of the flame frame further includes a flame position;

the method further comprises the following steps: outputting to a user at least one of whether a fire is occurring, the fire level, and the flame location.

7. A flame detection device, the device comprising:

an image acquisition module configured to acquire a first detection image and a second detection image;

a flame detection module configured to detect the first detection image to obtain a flame frame in the first detection image, the information of the flame frame including a flame confidence;

A flame color metric module configured to calculate a flame color score for the flame frame;

a flame motion metric module configured to calculate a flame motion score for the flame frame;

a flame metrology module configured to:

calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;

comparing the flame metric index with a flame threshold, determining that a flame is present in the first detection image when the flame metric index is greater than or equal to the flame threshold, and/or determining that no flame is present in the first detection image when the flame metric index is less than the flame threshold.

8. The flame detection device of claim 7, wherein the flame metric is calculated by:

Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)

wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.

9. The flame detection device of claim 7, wherein the flame color metric module specifically performs the following operations:

carrying out YUV color space transformation on the flame frame;

counting the total number of pixels in the flame frame that satisfy the flame pixel constraint rules, and recording the total as N_r;

calculating the flame color score, wherein the calculation formula of the flame color score is:

YUV_Score = N_r / N_All

wherein N_All is the total number of pixels in the flame frame;

the flame pixel constraint rules include:

Rule r1: Y(x, y) > U(x, y)

Rule r2: V(x, y) > U(x, y)

Rule r3:

Rule r4: |V(x, y) - U(x, y)| > τ, τ = 40

Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel at coordinates (x, y) in the YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the y direction of the flame frame.

10. The flame detection device of claim 7, wherein the flame motion metric module performs in particular the following:

acquiring a frame image adjacent to the first detection image and recording it as a second detection image;

respectively extracting flame key points in the first detection image and the second detection image;

matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;

obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;

and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.

11. The flame detection device according to claim 7, further comprising a fire level discrimination module configured to determine a fire level based on an area ratio of the flame frame in the first detection image when it is determined that there is a flame in the first detection image.

12. A computer device comprising a processor and a storage means adapted to store a plurality of program codes, characterized in that said program codes are adapted to be loaded and run by said processor to perform the flame detection method according to any of claims 1 to 6.

13. A storage medium adapted to store a plurality of program codes, wherein the program codes are adapted to be loaded and executed by a processor to perform the flame detection method of any of claims 1 to 6.

Background

A fire is a disaster that can cause significant casualties and property loss. According to statistics from the Fire and Rescue Bureau of the Ministry of Emergency Management, 176,000 fires were reported nationwide in the first quarter of 2021, causing 433 deaths, 249 injuries and direct economic losses of 1.39 billion yuan. Although fires can bring great loss of life and property, they are, compared with other disasters, controllable to a degree; achieving this control requires discovering the fire and taking remedial measures at its early stage, which in turn requires that flames be detected in time.

Traditional flame detection methods are mainly based on smoke sensors, optical sensors, infrared sensors, heat-sensitive sensors and the like. However, such sensor-based methods have significant limitations in detection latency, detection range, etc., and generally cannot provide detailed information about a fire, such as flame size or degree of combustion. Thermal imaging cameras can also be used for fire detection, but they are far more expensive than ordinary cameras and difficult to deploy widely.

With the spread of surveillance cameras in public areas, automatically detecting flames from captured surveillance video images has become a trend in the field. However, computer flame detection is generally based on rule learning over a color space, and such rule-based methods generalize poorly, are easily affected by factors such as illumination and brightness, and struggle to distinguish objects whose colors resemble those of flames. How to detect flames rapidly and accurately from surveillance video images therefore remains a problem to be solved.

Accordingly, there is a need in the art for a new solution to the above-mentioned problems.

Disclosure of Invention

The present invention aims to solve the above technical problem, namely the poor generalization and susceptibility to interference of flame detection methods based on color-space rule learning when applied to flame detection in surveillance video images.

In a first aspect, the present invention provides a method of flame detection, the method comprising:

acquiring a first detection image;

detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises flame confidence;

calculating a flame color score for the flame frame;

calculating a flame motion score for the flame frame;

calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;

comparing the flame metric index with a flame threshold, determining that a flame is present in the first detection image when the flame metric index is greater than or equal to the flame threshold, and/or determining that no flame is present in the first detection image when the flame metric index is less than the flame threshold.

In one embodiment of the flame detection method, the flame metric index is calculated by:

Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)

wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.

In an embodiment of the above flame detection method, the step of "calculating the flame color score of the flame frame" specifically includes:

carrying out YUV color space transformation on the flame frame;

counting the total number of pixels in the flame frame that satisfy the flame pixel constraint rules, and recording the total as N_r;

calculating the flame color score, wherein the calculation formula of the flame color score is:

YUV_Score = N_r / N_All

wherein N_All is the total number of pixels in the flame frame;

the flame pixel constraint rules include:

Rule r1: Y(x, y) > U(x, y)

Rule r2: V(x, y) > U(x, y)

Rule r3:

Rule r4: |V(x, y) - U(x, y)| > τ, τ = 40

Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel at coordinates (x, y) in the YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the y direction of the flame frame.

In an embodiment of the above flame detection method, the step of "calculating a flame movement score of the flame frame" specifically includes:

acquiring a frame image adjacent to the first detection image and recording it as a second detection image;

respectively extracting flame key points in the first detection image and the second detection image;

matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;

obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;

and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.

In one embodiment of the above flame detection method, the method further comprises: determining a fire level according to the area ratio of the flame frame in the first detection image when it is determined that a flame is present in the first detection image.

In one embodiment of the above flame detection method, the information of the flame frame further includes a flame position;

the method further comprises the following steps: outputting to a user at least one of whether a fire is occurring, the fire level, and the flame location.

In a second aspect, the present invention provides a flame detection apparatus, the apparatus comprising:

an image acquisition module configured to acquire a first detection image and a second detection image;

a flame detection module configured to detect the first detection image to obtain a flame frame in the first detection image, the information of the flame frame including a flame confidence;

A flame color metric module configured to calculate a flame color score for the flame frame;

a flame motion metric module configured to calculate a flame motion score for the flame frame;

a flame metrology module configured to:

calculating a flame metric index of the flame frame according to the flame confidence, the flame color score and the flame motion score;

comparing the flame metric index with a flame threshold, determining that a flame is present in the first detection image when the flame metric index is greater than or equal to the flame threshold, and/or determining that no flame is present in the first detection image when the flame metric index is less than the flame threshold.

In an embodiment of the flame detection apparatus, the flame metric index is calculated by:

Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)

wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.

In one embodiment of the above flame detection device, the flame color metric module performs the following operations:

carrying out YUV color space transformation on the flame frame;

counting the total number of pixels in the flame frame that satisfy the flame pixel constraint rules, and recording the total as N_r;

calculating the flame color score, wherein the calculation formula of the flame color score is:

YUV_Score = N_r / N_All

wherein N_All is the total number of pixels in the flame frame;

the flame pixel constraint rules include:

Rule r1: Y(x, y) > U(x, y)

Rule r2: V(x, y) > U(x, y)

Rule r3:

Rule r4: |V(x, y) - U(x, y)| > τ, τ = 40

Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel at coordinates (x, y) in the YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the y direction of the flame frame.

In one embodiment of the above flame detection apparatus, the flame motion metric module performs the following operations:

acquiring a frame image adjacent to the first detection image and recording it as a second detection image;

respectively extracting flame key points in the first detection image and the second detection image;

matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;

obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;

and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.

In one embodiment of the flame detection apparatus, the apparatus further includes a fire level determination module configured to determine a fire level based on an area ratio of the flame frame in the first detection image when it is determined that the flame exists in the first detection image.

In a third aspect, the invention proposes a computer device comprising a processor and a storage means adapted to store a plurality of program codes adapted to be loaded and run by the processor to perform a flame detection method according to any of the above aspects.

In a fourth aspect, the present invention proposes a storage medium adapted to store a plurality of program codes adapted to be loaded and run by a processor to perform a flame detection method according to any of the above aspects.

With the above technical solution, the present invention can detect flame frames in a video image and combine the flame confidence, flame color score and flame motion score of each flame frame into a flame metric index, thereby judging whether a flame is present in the image. The invention can effectively improve the generalization performance and anti-interference capability of purely visual flame detection, achieve rapid detection and localization at the early stage of a fire, and provide data support for responding to the fire quickly, thereby reducing the loss of life and property. Moreover, the method can use existing video surveillance equipment, so its implementation cost is low and its popularization value is high.

Drawings

Preferred embodiments of the present invention are described below with reference to the accompanying drawings, in which:

FIG. 1 is a flow chart of the main steps of a flame detection method of an embodiment of the invention.

Fig. 2 is a flowchart of a specific implementation of step S104 in fig. 1.

FIG. 3 is a schematic illustration of the varying angles of the flame keypoint matching pairs of the present invention.

FIG. 4 is a schematic view of the constitution of the flame detection device of the present invention.

Detailed Description

In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.

Turning first to FIG. 1, FIG. 1 is a flow chart of the main steps of a flame detection method of an embodiment of the invention. As shown in fig. 1, the flame detection method of the present invention includes:

step S101: acquiring a first detection image;

step S102: detecting the first detection image to obtain a flame frame in the first detection image, wherein the information of the flame frame comprises a flame confidence coefficient;

step S103: calculating the flame color score of the flame frame;

step S104: calculating the flame motion score of the flame frame;

step S105: calculating a flame measurement index of the flame frame according to the flame confidence, the flame color score and the flame motion score;

step S106: and comparing the flame metric index with the flame threshold value, and judging whether flame exists in the first detection image.

In step S101, the first detection image is usually an image captured from a surveillance video stream.

Moreover, different flame-frame detection methods impose different requirements on the format, size, etc. of the first detection image. For an image captured from video, one or more operations such as scaling, padding, storage-format conversion and normalization are therefore generally required, according to the requirements of the target detection algorithm, to obtain a first detection image that meets the input format of the flame-frame detection method.
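As a hedged illustration of this preprocessing, the sketch below scales an image into a square canvas with padding and normalizes it to [0, 1]; the 640×640 target size and gray padding value (114) are assumptions borrowed from common YOLO-style detectors, not requirements stated in this document.

```python
import numpy as np

def letterbox(img, target=640):
    """Scale an HxWx3 uint8 image to fit a target x target square,
    pad the remainder with gray, and normalize to float32 in [0, 1]."""
    h, w = img.shape[:2]
    scale = target / max(h, w)
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # Nearest-neighbor resize via index maps (avoids an OpenCV dependency here).
    ys = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[ys][:, xs]
    canvas = np.full((target, target, 3), 114, dtype=np.uint8)  # gray padding
    top, left = (target - nh) // 2, (target - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas.astype(np.float32) / 255.0
```

A real pipeline would normally delegate the resize to the detector's own preprocessing (e.g. OpenCV), but the index-map version keeps the sketch self-contained.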

In the present invention, the method for acquiring the second detection image is the same as the method for acquiring the first detection image, and the image formats of the first detection image and the second detection image are also the same.

In step S102, the method for detecting the flame frame in the first detection image is not limited by the present invention; by way of example, a target detection algorithm based on deep learning, such as YOLOv5 or SSD, may be selected, and those skilled in the art may choose a suitable method according to the actual situation.

The flame frame detection result may contain one or more pieces of flame frame information, which typically include the flame confidence of the flame frame and its position in the first detection image. As an example, a pixel coordinate system may be established in the first detection image, with the origin usually placed at the upper-left corner of the image. The flame frame is usually rectangular and can be represented as Q_n = (x, y, w, h, Conf), where x and y are the coordinates of the lower-left corner of the rectangular box in the pixel coordinate system, w is the width of the box, h is its height, and Conf is the flame confidence of the region. The flame confidence indicates the likelihood that a flame is present in the region; typically, the greater the confidence value, the greater that likelihood.

When several flame frames in the first detection image contain one another or largely overlap, they can be screened by non-maximum suppression: redundant flame frames are removed and the frame with the highest confidence is kept. If needed, the flame frames can be further filtered by setting a flame-confidence threshold, yielding the one or more flame frames required by the invention.
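The non-maximum suppression step above can be sketched as follows; this is generic greedy NMS over (x1, y1, x2, y2) boxes, with the 0.5 IoU threshold chosen purely for illustration.

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression. boxes: (N, 4) [x1, y1, x2, y2];
    scores: (N,) flame confidences. Returns indices of kept boxes."""
    boxes = np.asarray(boxes, dtype=float)
    scores = np.asarray(scores, dtype=float)
    order = scores.argsort()[::-1]          # highest confidence first
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection-over-union of box i with the remaining boxes.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou < iou_thresh]      # drop boxes that overlap too much
    return keep
```

In practice OpenCV's `cv2.dnn.NMSBoxes` performs the same screening.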

In step S103, the flame frame is first transformed into the YUV color space and the number N_r of pixels satisfying the flame pixel constraint rules is counted; the flame color score of the flame frame is then calculated by the formula

YUV_Score = N_r / N_All

wherein N_All is the total number of pixels in the flame frame.

In step S103, the flame pixel constraint rule includes:

Rule r1: Y(x, y) > U(x, y)

Rule r2: V(x, y) > U(x, y)

Rule r3:

Rule r4: |V(x, y) - U(x, y)| > τ, τ = 40

Y(x, y), U(x, y) and V(x, y) respectively denote the values of the Y, U and V components of the pixel at coordinates (x, y) in the YUV color space, N is the number of pixels in the x direction of the flame frame, and M is the number of pixels in the y direction of the flame frame.

As an example, suppose the resolution of a flame frame image is 200 × 300; that is, in the pixel coordinate system the number of pixels in the x direction is N = 200 and the number in the y direction is M = 300, so the total number of pixels is N_All = 200 × 300 = 60000. If the number of pixels satisfying the flame pixel constraint rules is N_r = 48000, then YUV_Score = 48000 / 60000 = 0.8.
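A minimal sketch of this color-score computation, assuming the Y, U and V planes of the flame frame are already available as equally-shaped arrays; rule r3 is omitted because its formula is missing from the source text.

```python
import numpy as np

def yuv_color_score(y, u, v, tau=40):
    """Fraction of pixels in a flame frame satisfying rules r1, r2 and r4:
    Y > U, V > U, and |V - U| > tau. Returns N_r / N_All.
    (Rule r3 is not reproduced: its formula is absent from the source.)"""
    mask = (y > u) & (v > u) & (np.abs(v.astype(int) - u.astype(int)) > tau)
    return mask.sum() / mask.size   # N_r / N_All
```

A real implementation would first convert the cropped flame frame from BGR to YUV (e.g. with `cv2.cvtColor`) and split it into the three planes passed here.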

Continuing with fig. 2, fig. 2 is a specific implementation method of step S104, which includes:

step S1041: acquiring adjacent frame images of the first detection image and recording the adjacent frame images as a second detection image;

step S1042: respectively extracting flame key points in the first detection image and the second detection image;

step S1043: matching the flame key points in the first detection image and the second detection image to obtain a flame key point matching pair;

step S1044: obtaining the change angle of the flame key point matching pair according to the flame key point matching pair;

step S1045: and obtaining the flame motion score according to the distribution condition of the change angles of the flame key point matching pairs.

In step S1041, it is preferable to capture the frame immediately following the first detection image in the surveillance video as the second detection image, so that whether a flame exists in the first detection image can be further judged from the motion properties of flame.

In step S1042, the method for determining the flame key points of the first and second detection images is not limited by the present invention; for example, they may be extracted with the ORB algorithm in OpenCV, and those skilled in the art may choose a suitable method according to the actual situation.

In step S1043, the method for matching the flame key points of the first and second detection images is likewise not limited; for example, the Euclidean distances between SIFT descriptors of the two images' key points may be computed with OpenCV to perform matching, and those skilled in the art may choose a suitable method according to the actual situation.
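The source leaves steps S1042 and S1043 to OpenCV's ORB/SIFT machinery; as a self-contained stand-in, the sketch below shows brute-force nearest-neighbour matching of float descriptor vectors with Lowe's ratio test, which is the idea behind the Euclidean-distance matching described above.

```python
import numpy as np

def match_keypoints(desc1, desc2, ratio=0.75):
    """Brute-force nearest-neighbour descriptor matching with Lowe's
    ratio test. desc1, desc2: (N, D) float descriptor arrays.
    Returns (i, j) pairs: descriptor i in image 1 matched to j in image 2."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)     # Euclidean distances
        order = np.argsort(dists)
        best = order[0]
        second = order[1] if len(order) > 1 else order[0]
        # Accept only if the best match is clearly better than the runner-up.
        if len(order) == 1 or dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

With OpenCV one would instead use `cv2.BFMatcher` on ORB or SIFT descriptors; the ratio-test logic is the same.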

The variation angle of a flame key point matching pair in step S1044 is explained with reference to fig. 3. As shown in fig. 3, points A1 and B1 are flame key points in the first detection image, and points A2 and B2 are flame key points in the second detection image; matching in step S1043 yields the flame key point matching pairs (A1, A2) and (B1, B2). Connecting each matching pair with a straight line gives the vectors A1A2 and B1B2. As an example, if the variation angle of a matching pair is defined as the angle between its vector and the positive x-axis direction, then the variation angle of the pair (A1, A2) is the angle α between vector A1A2 and the positive x axis, and the variation angle of the pair (B1, B2) is the angle β between vector B1B2 and the positive x axis.

In step S1045, the flame motion score of the flame frame is determined by analyzing the distribution of the variation angles of all flame key point matching pairs in the flame frame. Because flame variation is disorderly, the variation angles of the matching pairs tend to be distributed relatively uniformly. That is, the more uniform the distribution of variation angles, the higher the flame motion score of the flame frame; conversely, the more concentrated the distribution, the lower the score.

In step S1045, the method for calculating the flame motion score is not limited in the present invention, and for example, the flame motion score of the flame frame may be determined by counting the mean and variance of the variation angles of the flame key point matching pairs in the flame frame, and those skilled in the art may select an appropriate method according to the actual situation.
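The source leaves the exact scoring formula open (mean and variance of the angles is one suggestion). The sketch below is one possible realization, labelled as an assumption: it uses the mean resultant length from circular statistics, so a uniform spread of variation angles scores near 1 and a concentrated spread scores near 0, matching the behavior described above.

```python
import numpy as np

def motion_score(pairs1, pairs2):
    """One possible flame motion score. pairs1/pairs2: (K, 2) arrays of
    matched keypoint coordinates in the first / second detection image.
    The variation angle of each pair is the angle of its displacement
    vector; the score is 1 minus the mean resultant length of those
    angles, so uniformly spread angles score near 1 and angles all
    pointing the same way score near 0."""
    d = np.asarray(pairs2, float) - np.asarray(pairs1, float)
    angles = np.arctan2(d[:, 1], d[:, 0])            # variation angles
    resultant = np.hypot(np.cos(angles).mean(), np.sin(angles).mean())
    return 1.0 - resultant
```

Rigid motion (e.g. a car's headlights) gives nearly parallel displacement vectors and a score near 0, while flickering flame keypoints move in many directions and score near 1.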

After obtaining the flame confidence, the flame color score and the flame motion score of the flame frame, the flame metric index of the flame frame may be obtained through step S105. Preferably, the flame metric is calculated as follows:

Fire_Score = Conf * (w1 * YUV_Score + w2 * Motion_Score)

wherein Fire_Score is the flame metric index, Conf is the flame confidence, YUV_Score is the flame color score, Motion_Score is the flame motion score, w1 is the weight of the flame color score, w2 is the weight of the flame motion score, and w1 + w2 = 1.

In practical applications, w1 and w2 can be set according to the conditions of the monitoring site: to find flames promptly, the weight w1 of the flame color score can be increased; to better capture flame variation, the weight w2 of the flame motion score can be increased. Those skilled in the art can determine suitable values of w1 and w2 for different scenes by training.

In step S106, a flame threshold of, for example, 0.5 may be set. The flame metric indexes of all flame frames in the first detection image are compared with the flame threshold one by one; when the metric index of any flame frame is greater than or equal to the threshold, it is determined that a flame is present in the first detection image, and/or when all metric indexes are less than the threshold, it is determined that no flame is present in the first detection image.
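Steps S105 and S106 combine into a few lines; in this sketch the default weights w1 = w2 = 0.5 and the 0.5 threshold are illustrative values, not values fixed by the method.

```python
def fire_score(conf, yuv_score, motion_score, w1=0.5, w2=0.5):
    """Flame metric index: Fire_Score = Conf * (w1*YUV_Score + w2*Motion_Score),
    with w1 + w2 = 1. The default weights are illustrative."""
    assert abs(w1 + w2 - 1.0) < 1e-9
    return conf * (w1 * yuv_score + w2 * motion_score)

def flame_present(scores, threshold=0.5):
    """A flame is declared present if any flame frame's metric index
    reaches the threshold; absent if all fall below it."""
    return any(s >= threshold for s in scores)
```

For instance, a frame with Conf = 0.9, YUV_Score = 0.8 and Motion_Score = 0.6 yields 0.9 × 0.7 = 0.63, above the illustrative 0.5 threshold.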

It should be noted that those skilled in the art can also design other forms of the flame metric index from the flame confidence, flame color score and flame motion score of the flame frame, such as

Fire_Score = w1 * YUV_Score + w2 * Motion_Score + w3 * Conf

wherein w3 is the weight of the flame confidence; flame detection can equally be realized in this way. Such modifications and substitutions fall within the scope of the present invention without departing from its spirit.

When it is determined that a flame is present in the first detection image, the total area ratio of all flame frames containing flames in the first detection image may be further calculated. For example, if that area ratio reaches 70% or more, the fire can be judged to be serious.
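A possible grading of the fire level by flame-frame area ratio is sketched below; the level thresholds are invented for illustration (the source gives only the 70% "serious" example).

```python
def fire_level(boxes, img_w, img_h, thresholds=(0.1, 0.4, 0.7)):
    """Illustrative fire-level grading. boxes: (x, y, w, h) flame frames;
    returns (level, ratio) where level counts how many of the assumed
    thresholds the total flame-area ratio reaches (0 = minor .. 3 = serious)."""
    area = sum(w * h for _, _, w, h in boxes)
    ratio = area / (img_w * img_h)
    level = sum(ratio >= t for t in thresholds)
    return level, ratio
```

With the assumed thresholds, a single 100×100 flame frame in a 200×200 image covers 25% of the image and grades as level 1.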

The actual geographical position of the flame can also be determined from the position of the flame frame in the first detection image (part of the flame frame information) together with the geographical location where the camera is installed.

After the flame detection result is obtained, information such as whether a fire has occurred, the fire level and the flame position can be output to the user, as configured, via screen display, voice broadcast and the like.

Further, the invention also provides a flame detection device. As shown in fig. 4, the flame detection device 4 of the present invention mainly includes: an image acquisition module 41, a flame detection module 42, a flame color metric module 43, a flame motion metric module 44, a flame metric module 45, and a fire class discrimination module 46.

The image acquisition module 41 is configured to acquire a first detection image and a second detection image. The flame detection module 42 is configured to perform the operation in step S102. The flame colorimetry module 43 is configured to perform the operation in step S103. The flame motion metric module 44 is configured to perform the operations in step S104 and steps S1041 to S1045 in fig. 2. The flame metrology module 45 is configured to perform the operations in step S105 and step S106. The fire level discrimination module 46 is configured to determine a fire level based on an area ratio of a flame frame in the first detection image in a case where it is determined that there is a flame in the first detection image; and outputs to the user: whether a fire occurs, the fire level, the flame position, etc.

Further, the present invention also provides a computer device comprising a processor and a storage means. The storage means may be configured to store programs, including but not limited to a program of the flame detection method of the above method embodiment, and the processor may be configured to execute the programs in the storage means. For convenience of explanation, only the parts related to the embodiments of the present invention are shown, and specific technical details are not disclosed. The computer device may be a control device formed of various electronic apparatuses.

Further, the present invention also provides a storage medium, which may be configured to store a program for executing the flame detection method of the above method embodiment; the program may be loaded and executed by a processor to implement the flame detection method. For convenience of explanation, only the parts related to the embodiments of the present invention are shown, and specific technical details are not disclosed. The storage medium may be a storage device formed of various electronic apparatuses; optionally, in an embodiment of the present invention, the storage medium is a non-transitory computer-readable storage medium.

Those skilled in the art will appreciate that the method steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of electronic hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as electronic hardware or software depends on the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.

It should be noted that the terms "first," "second," and the like in the description, the claims, and the drawings are used to distinguish between similar elements and do not necessarily describe or imply any particular order or sequence. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the invention described herein can be practiced in sequences other than those illustrated or described herein.

It should be noted that in the description of the present application, the term "A and/or B" indicates all possible combinations of A and B, namely A alone, B alone, or both A and B.

The technical solutions of the present invention have thus been described with reference to the preferred embodiments shown in the drawings. However, those skilled in the art will readily understand that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of the related technical features may be made without departing from the principle of the present invention, and the technical solutions after such changes or substitutions fall within the protection scope of the present invention.
