Image registration method and related device and equipment
1. An image registration method, comprising:
acquiring an image to be registered and a reference image containing a target image;
determining a first deflection angle of a first image area and a second deflection angle of a second image area, wherein at least part of pixel points of the first image area are in the reference image, and at least part of pixel points of the second image area are in the image to be registered;
obtaining candidate registration parameters based on the first deflection angle and the second deflection angle;
and obtaining a final registration parameter between the target image and the image to be registered based on the candidate registration parameter.
2. The method according to claim 1, wherein there are a preset number of the second image regions and the candidate registration parameters, each candidate registration parameter being derived based on the first deflection angle and a second deflection angle corresponding to one of the second image regions;
the obtaining of the final registration parameter between the target image and the image to be registered based on the candidate registration parameter includes:
for each of the candidate registration parameters: determining the similarity between the image to be registered and the target image by using the candidate registration parameter to obtain a first scoring result of the candidate registration parameter;
and obtaining the final registration parameter based on the first scoring results of the preset number of candidate registration parameters.
3. The method according to claim 2, wherein the deriving the final registration parameter based on the first scoring result of a preset number of the candidate registration parameters comprises:
selecting candidate registration parameters whose first scoring results meet a preset condition as rough registration parameters;
optimizing the rough registration parameters by using a preset optimization mode;
determining the similarity between the image to be registered and the target image by using the optimized rough registration parameters so as to obtain second scoring results of the optimized rough registration parameters;
and selecting an optimized rough registration parameter as the final registration parameter based on the second scoring results of the optimized rough registration parameters.
4. The method according to claim 2 or 3, wherein the determining the similarity between the image to be registered and the target image by using the candidate registration parameters comprises:
determining at least one set of matching point pairs using the candidate registration parameters; wherein the two pixel points in each matching point pair are respectively from the image to be registered and from the target image in the reference image;
and obtaining the similarity between the image to be registered and the target image based on the pixel values of the at least one group of matching point pairs.
5. The method of any of claims 1 to 4, wherein prior to said deriving a candidate registration parameter based on the first and second deflection angles, the method further comprises:
acquiring an estimated region of the reference image in the image to be registered, and acquiring an estimated scaling based on the size of the estimated region and the size of the reference image;
obtaining the candidate registration parameters based on the first deflection angle and the second deflection angle, including:
and obtaining the candidate registration parameters based on the estimated scaling, the first deflection angle and the second deflection angle.
6. The method of claim 5, wherein obtaining the candidate registration parameters based on the estimated scaling, the first deflection angle, and the second deflection angle comprises:
acquiring an angle difference between the first deflection angle and the second deflection angle;
and obtaining the candidate registration parameters based on the angle difference and the estimated scaling.
7. The method of claim 5, further comprising:
and in response to the estimated region failing to be acquired, taking a candidate scaling as the estimated scaling.
8. The method according to any one of claims 1 to 7, wherein the obtaining a reference image containing a target image comprises:
taking the target image as a reference image; or,
and processing the target image into a reference image with the same shape as the image to be registered, and recording attribute information of pixel points in the reference image, wherein the attribute information of the pixel points is used for determining whether the pixel points belong to the target image.
9. The method according to any one of claims 1 to 8, wherein the pixels of the first image region are all in the reference image; and/or the pixel points of the second image area are all in the image to be registered.
10. The method according to any one of claims 1 to 9, wherein the center of the first image area is the center of the reference image;
and/or the first deflection angle is a directed angle between a preset direction and a line connecting the centroid of the first image area to the center of the first image area; the second deflection angle is a directed angle between the preset direction and a line connecting the centroid of the second image area to the center of the second image area.
11. An image registration apparatus, comprising:
the image acquisition module is used for acquiring an image to be registered and a reference image containing a target image;
an angle determining module, configured to determine a first deflection angle of a first image region and determine a second deflection angle of a second image region, where at least part of pixel points of the first image region are in the reference image and at least part of pixel points of the second image region are in the image to be registered;
a candidate parameter obtaining module, configured to obtain a candidate registration parameter based on the first deflection angle and the second deflection angle;
and the final parameter acquisition module is used for obtaining a final registration parameter between the target image and the image to be registered based on the candidate registration parameter.
12. An electronic device comprising a memory and a processor coupled to each other, the processor being configured to execute program instructions stored in the memory to implement the image registration method of any one of claims 1 to 10.
13. A computer readable storage medium having stored thereon program instructions which, when executed by a processor, implement the image registration method of any of claims 1 to 10.
Background
With the development of electronic information technology, Augmented Reality (AR), Virtual Reality (VR), and similar applications have become hotspots in the field of computer vision. By using a camera as an input device and processing its output with image algorithms, the surrounding environment can be digitized, providing an interactive experience with the real environment.
Image registration is a research focus in computer vision fields such as AR and VR. Through image registration, the registration parameters between an image to be registered captured by the camera and a target image can be obtained, so that the position of the target image in the image to be registered can subsequently be derived from these parameters. However, existing image registration techniques are mainly directed at situations where the target image occupies a relatively large proportion of the image to be registered; when the proportion is relatively small, accurate registration cannot be performed. In view of this, how to improve the accuracy of image registration has become an urgent problem.
Disclosure of Invention
The application provides an image registration method and a related device and equipment.
A first aspect of the present application provides an image registration method, including: acquiring an image to be registered and a reference image containing a target image; determining a first deflection angle of a first image area and a second deflection angle of a second image area, wherein at least part of pixel points of the first image area are in a reference image, and at least part of pixel points of the second image area are in an image to be registered; obtaining candidate registration parameters based on the first deflection angle and the second deflection angle; and obtaining a final registration parameter between the target image and the image to be registered based on the candidate registration parameter.
Therefore, an image to be registered and a reference image containing a target image are acquired, and a first deflection angle of a first image area and a second deflection angle of a second image area are determined, where at least part of the pixel points of the first image area are in the reference image and at least part of the pixel points of the second image area are in the image to be registered. On this basis, candidate registration parameters are obtained based on the first deflection angle and the second deflection angle, and the final registration parameter between the target image and the image to be registered is obtained based on the candidate registration parameters. Since registration does not require extracting feature points and feature representations, it is not affected by the proportion of the target image in the image to be registered, and the accuracy of image registration can be improved.
Wherein there is a preset number of second image areas and of candidate registration parameters, each candidate registration parameter being obtained based on the first deflection angle and the second deflection angle corresponding to one of the second image areas; obtaining the final registration parameter between the target image and the image to be registered based on the candidate registration parameters includes: for each candidate registration parameter, determining the similarity between the image to be registered and the target image by using the candidate registration parameter to obtain a first scoring result of that candidate registration parameter; and obtaining the final registration parameter based on the first scoring results of the preset number of candidate registration parameters.
Therefore, there is a preset number of second image areas and of candidate registration parameters, each candidate registration parameter being obtained from the first deflection angle and the second deflection angle corresponding to one second image area. For each candidate registration parameter, the similarity between the image to be registered and the target image is determined using that parameter to obtain its first scoring result, and the final registration parameter is obtained from the first scoring results of the preset number of candidate registration parameters. The final registration parameter can thus be selected from the preset number of candidates through similarity scoring, which can further improve the accuracy of image registration.
Obtaining the final registration parameter based on the first scoring results of the preset number of candidate registration parameters includes: selecting candidate registration parameters whose first scoring results meet a preset condition as rough registration parameters; optimizing the rough registration parameters by using a preset optimization mode; determining the similarity between the image to be registered and the target image using the optimized rough registration parameters to obtain second scoring results of the optimized rough registration parameters; and selecting an optimized rough registration parameter as the final registration parameter based on its second scoring result.
Therefore, candidate registration parameters whose first scoring results meet the preset condition are selected as rough registration parameters and optimized in a preset optimization mode; the optimized rough registration parameters are then used to determine the similarity between the image to be registered and the target image, yielding second scoring results, on the basis of which an optimized rough registration parameter is selected as the final registration parameter. The candidate registration parameters are thus screened in two stages, which can further improve the accuracy of image registration.
Determining the similarity between the image to be registered and the target image by using the candidate registration parameters includes: determining at least one set of matching point pairs using the candidate registration parameters, wherein the two pixel points in each matching point pair are respectively from the image to be registered and from the target image in the reference image; and obtaining the similarity between the image to be registered and the target image based on the pixel values of the at least one set of matching point pairs.
Therefore, at least one set of matching point pairs is determined using the candidate registration parameters, the two pixel points in each matching point pair being respectively from the image to be registered and from the target image in the reference image, and the similarity between the image to be registered and the target image is obtained based on the pixel values of those matching point pairs. Interference from pixel points that do not belong to the target image can thus be excluded during similarity calculation, improving the accuracy of the similarity.
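The claims do not prescribe a particular similarity measure over the matching point pairs. One plausible realization (the function name and the choice of zero-mean normalized cross-correlation are illustrative assumptions, not taken from the source) restricts the comparison to pixels whose attribute information marks them as belonging to the target image:

```python
import numpy as np

def similarity_score(ref, warped, mask):
    """Score a candidate registration by comparing pixel values of
    matching point pairs, ignoring pixels outside the target image.

    ref    : reference image (H x W), grayscale
    warped : image to be registered, warped into the reference frame
             with the candidate registration parameters (H x W)
    mask   : boolean array, True where the reference pixel belongs
             to the target image
    """
    a = ref[mask].astype(np.float64)
    b = warped[mask].astype(np.float64)
    # Zero-mean normalized cross-correlation: robust to global
    # brightness and contrast changes between the two images.
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0
    return float(np.dot(a, b) / denom)
```

A score near 1 indicates high similarity; the masking step is what excludes pixel points that do not belong to the target image, as described above.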
Wherein, before obtaining the candidate registration parameters based on the first deflection angle and the second deflection angle, the method further includes: acquiring an estimated region of the reference image in the image to be registered, and acquiring an estimated scaling based on the sizes of the estimated region and the reference image; and obtaining the candidate registration parameters based on the first deflection angle and the second deflection angle includes: obtaining the candidate registration parameters based on the estimated scaling, the first deflection angle and the second deflection angle.
Therefore, the estimated scaling is obtained by acquiring the estimated region of the reference image in the image to be registered and using the sizes of the estimated region and the reference image, so that the candidate registration parameters can be obtained based on the estimated scaling, the first deflection angle and the second deflection angle, which can improve the efficiency of image registration.
Obtaining the candidate registration parameters based on the estimated scaling, the first deflection angle and the second deflection angle includes: acquiring the angle difference between the first deflection angle and the second deflection angle; and obtaining the candidate registration parameters based on the angle difference and the estimated scaling.
Therefore, the candidate registration parameters are obtained by acquiring the angle difference between the first deflection angle and the second deflection angle and combining it with the estimated scaling, which helps reduce the complexity of image registration.
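The source gives no formula for turning the angle difference and the estimated scaling into a registration parameter. One hedged sketch — the similarity-transform parameterization, the function name, and the use of the estimated region's center are all assumptions — builds a rotation-plus-scale matrix about the reference center and translates it onto the estimated region:

```python
import numpy as np

def candidate_transform(angle_diff_deg, scale, ref_center, est_center):
    """Build a candidate registration matrix (3x3 homogeneous
    similarity transform) from the angle difference between the two
    deflection angles and the estimated scaling.

    Maps reference-image coordinates into the image to be registered:
    rotate by angle_diff about the reference center, scale, then
    translate the reference center onto the estimated region center.
    """
    theta = np.deg2rad(angle_diff_deg)
    c, s = np.cos(theta), np.sin(theta)
    cx, cy = ref_center
    tx, ty = est_center
    # Rotation and scaling about the origin.
    R = np.array([[scale * c, -scale * s, 0.0],
                  [scale * s,  scale * c, 0.0],
                  [0.0,        0.0,       1.0]])
    # Conjugate by translations so the rotation pivots on ref_center,
    # then land on est_center.
    to_origin = np.array([[1.0, 0.0, -cx], [0.0, 1.0, -cy], [0.0, 0.0, 1.0]])
    to_target = np.array([[1.0, 0.0, tx], [0.0, 1.0, ty], [0.0, 0.0, 1.0]])
    return to_target @ R @ to_origin
```

With a zero angle difference, unit scaling, and coincident centers, the result is the identity transform, as expected of a registration parameter that leaves the image unchanged.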
Wherein the image registration method further includes: in response to the estimated region failing to be acquired, taking a candidate scaling as the estimated scaling.
Therefore, when the estimated region cannot be acquired, a candidate scaling is used as the estimated scaling, which helps improve the robustness of image registration.
Wherein obtaining a reference image containing a target image comprises: taking the target image as a reference image; or processing the target image into a reference image with the same shape as the image to be registered, and recording attribute information of pixel points in the reference image, wherein the attribute information of the pixel points is used for determining whether the pixel points belong to the target image.
Therefore, the target image is used directly as the reference image, or is processed into a reference image with the same shape as the image to be registered while the attribute information of the pixel points in the reference image is recorded, the attribute information being used to determine whether a pixel point belongs to the target image. Image registration can thus be completed for a target image of any shape, which improves the robustness of image registration.
The pixel points of the first image area are all in the reference image; and/or the pixel points of the second image area are all in the image to be registered.
Therefore, arranging all pixel points of the first image area within the reference image, and/or all pixel points of the second image area within the image to be registered, can improve the accuracy of image registration.
Wherein, the center of the first image area is the center of the reference image; and/or the first deflection angle is a directed angle between the preset direction and a line connecting the centroid of the first image area to the center of the first image area; the second deflection angle is a directed angle between the preset direction and a line connecting the centroid of the second image area to the center of the second image area.
Therefore, the center of the first image area is set as the center of the reference image, the first deflection angle is the directed angle between the preset direction and the line connecting the centroid of the first image area to its center, and the second deflection angle is the directed angle between the preset direction and the line connecting the centroid of the second image area to its center, which can improve the accuracy of the first deflection angle and the second deflection angle.
A second aspect of the present application provides an image registration apparatus, comprising: the device comprises an image acquisition module, an angle determination module, a candidate parameter acquisition module and a final parameter acquisition module, wherein the image acquisition module is used for acquiring an image to be registered and a reference image containing a target image; the angle determining module is used for determining a first deflection angle of a first image area and determining a second deflection angle of a second image area, wherein at least part of pixel points of the first image area are in a reference image, and at least part of pixel points of the second image area are in an image to be registered; the candidate parameter acquisition module is used for acquiring candidate registration parameters based on the first deflection angle and the second deflection angle; and the final parameter acquisition module is used for obtaining a final registration parameter between the target image and the image to be registered based on the candidate registration parameter.
A third aspect of the present application provides an electronic device, comprising a memory and a processor coupled to each other, wherein the processor is configured to execute program instructions stored in the memory to implement the image registration method in the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon program instructions that, when executed by a processor, implement the image registration method of the first aspect described above.
According to the above scheme, an image to be registered and a reference image containing a target image are acquired, and a first deflection angle of a first image area and a second deflection angle of a second image area are determined, where at least part of the pixel points of the first image area are in the reference image and at least part of the pixel points of the second image area are in the image to be registered. On this basis, candidate registration parameters are obtained based on the first deflection angle and the second deflection angle, and the final registration parameter between the target image and the image to be registered is obtained based on the candidate registration parameters. Since registration does not require extracting feature points and feature representations, it is unaffected by the proportion of the target image in the image to be registered, and the accuracy of image registration can be improved.
Drawings
FIG. 1 is a schematic flow chart diagram of an embodiment of an image registration method of the present application;
FIG. 2 is a schematic diagram of an embodiment of a reference image;
FIG. 3 is a schematic view of an embodiment of a deflection angle acquisition mode;
FIG. 4 is a schematic diagram of an embodiment of obtaining a first feature representation;
FIG. 5 is a flowchart illustrating an embodiment of step S14 in FIG. 1;
FIG. 6 is a block diagram of an embodiment of an image registration apparatus according to the present application;
FIG. 7 is a block diagram of an embodiment of an electronic device of the present application;
FIG. 8 is a block diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The following describes in detail the embodiments of the present application with reference to the drawings attached hereto.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association between objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship. Further, the term "plurality" herein means two or more.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of an image registration method according to the present application.
Specifically, the method may include the steps of:
step S11: and acquiring an image to be registered and a reference image containing a target image.
In one implementation scenario, the image to be registered may be an image captured by a camera. For example, in application scenarios such as AR and VR, the image to be registered may be an image captured by an electronic device such as a mobile phone, a tablet computer, and smart glasses; alternatively, in a video monitoring scene, the image to be registered may be an image captured by a monitoring camera, which is not limited herein. Other scenarios may be analogized, and are not exemplified here.
In another implementation scenario, the target image may be a pre-acquired image, set according to the actual application. For example, if the position of a building A in the image to be registered needs to be determined, an image of building A may be acquired in advance; if the position of a person B needs to be determined, an image of person B may be acquired in advance. Other cases can be handled by analogy and are not enumerated here.
In one implementation scenario, the target image may be rectangular in shape, in which case the target image may be used directly as a reference image for subsequent registration.
In another implementation scenario, the shape of the target image may also be any shape other than a rectangle, which may specifically include but is not limited to: triangular, circular, trapezoidal, etc., without limitation. In addition, the target image may also be an irregular shape, for example, the target image may be a human face image, and the edge of the target image is a human face contour; or, the target image may also be an animal image, and the edge of the target image is an animal contour, and so on in other cases, which is not illustrated here. As described above, the embodiments of the present disclosure and the following embodiments of the present disclosure do not limit the specific shape of the target image, and the target image may have any shape.
In a specific implementation scenario, when the shape of the target image is an arbitrary shape other than a rectangle, the target image may be processed into a reference image having the same shape as the image to be registered. For example, the target image may be expanded into a reference image with the same shape as the image to be registered. Referring to fig. 2, fig. 2 is a schematic diagram of an embodiment of a reference image. As shown in fig. 2, taking the target image as a circle and the image to be registered as a rectangle as an example, the circumscribed rectangle of the circle may be obtained; the circle within the circumscribed rectangle is the target image, and the pixel points between the circle and the circumscribed rectangle may take any pixel value, thereby yielding the reference image. For example, the area between the circle and the circumscribed rectangle may be uniformly filled with black, or uniformly filled with white, which is not limited herein. Still referring to fig. 2, when the target image is a circle and the image to be registered is a rectangle, a rectangle that contains the circle but is not tangent to it may also be obtained; the circle within that rectangle is the target image, and the pixel points between the circle and the rectangle may take any pixel value, thereby yielding the reference image. That is, the rectangle containing the circle is not limited to the circumscribed rectangle. Cases where the target image or the image to be registered has another shape can be handled by analogy and are not enumerated here. In addition, a sub-image that is inscribed in the target image (or contained in the target image without being inscribed) and has the same shape as the image to be registered may also be used as the reference image, which is not limited herein.
Still taking the target image as a circle and the image to be registered as a rectangle as an example, the image data within the rectangle inscribed in the circle may be used as the reference image; alternatively, the image data within a rectangle contained in the circle but not inscribed in it may be used as the reference image, which is not limited herein. It should be noted that, to improve registration robustness, the target image may instead be directly expanded into a reference image with the same shape as the image to be registered.
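The expansion described above can be sketched minimally for the circular example of fig. 2 (the function name, the inscribed-circle layout, and the black fill value are illustrative assumptions):

```python
import numpy as np

def circle_to_reference(target_square, fill=0):
    """Embed a circular target image in its circumscribed square.

    target_square : H x H array holding the circular target; H is the
                    circle's diameter, so the circle is inscribed in
                    the square.
    Returns the rectangular reference image with the four corner
    regions (pixels outside the circle) uniformly filled, as in
    FIG. 2, together with a mask marking which pixels belong to the
    target image.
    """
    h = target_square.shape[0]
    yy, xx = np.mgrid[0:h, 0:h]
    r = (h - 1) / 2.0
    # True inside the inscribed circle, False in the corners.
    inside = (xx - r) ** 2 + (yy - r) ** 2 <= r ** 2
    ref = np.where(inside, target_square, fill)
    return ref, inside
```

The returned mask is exactly the per-pixel membership information that the attribute records of the next paragraph make explicit.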
In another specific implementation scenario, in the case that the shape of the target image is any shape other than a rectangle, the target image may be expanded into a reference image with the same shape as the image to be registered, and on this basis, attribute information of a pixel point in the reference image may be further recorded, where the attribute information is used to determine whether the pixel point belongs to the target image. Specifically, the attribute information may specifically include an attribute value of each pixel in the reference image, and when the attribute value of a pixel is a first numerical value (e.g., 1), it indicates that the pixel belongs to the target image, and when the attribute value of a pixel is a second numerical value (e.g., 0), it indicates that the pixel does not belong to the target image. Referring to fig. 2, the pixels in the circular area in the reference image belong to the target image, so the attribute values of the pixels can be set to a first value (e.g., 1), and the pixels in the reference image between the circular area and the rectangular area do not belong to the target image, so the attribute values of the pixels can be set to a second value (e.g., 0). Other cases may be analogized, and no one example is given here.
In another specific implementation scenario, to facilitate subsequent acquisition of the attribute information of pixel points in the reference image during registration, a template image with the same size as the reference image may be generated based on the attribute information, the pixel value of each pixel point in the template image indicating whether the corresponding pixel point in the reference image belongs to the target image. Specifically, when the pixel value of a pixel point in the template image is the first value (e.g., 1), the corresponding pixel point in the reference image belongs to the target image; when it is the second value (e.g., 0), the corresponding pixel point does not belong to the target image. Here, "corresponding" means located at the same position in the image. For example, a first coordinate system is established with the pixel point at the upper left corner of the template image as the origin, the horizontal rightward direction as the positive x axis, and the vertical downward direction as the positive y axis; a second coordinate system is established likewise with the pixel point at the upper left corner of the reference image as the origin, the horizontal rightward direction as the positive x axis, and the vertical downward direction as the positive y axis. The pixel point (u0, v0) in the template image then corresponds to the pixel point (u0, v0) in the reference image.
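The template image described above can be illustrated with a small sketch (the `mask` input and the function names are hypothetical). Note that with the stated coordinate systems, the x coordinate u0 indexes columns and the y coordinate v0 indexes rows:

```python
import numpy as np

def make_template(mask):
    """Convert a boolean membership mask (True where the reference
    pixel belongs to the target image) into the template image whose
    pixel values are the first value 1 or the second value 0."""
    return mask.astype(np.uint8)

def pixel_in_target(template, u0, v0):
    """Look up whether reference pixel (u0, v0) belongs to the target
    image. Both images share a coordinate system with the origin at
    the top-left pixel, x rightward and y downward, so template pixel
    (u0, v0) corresponds directly to reference pixel (u0, v0); the
    row index is v0 (y) and the column index is u0 (x)."""
    return template[v0, u0] == 1
```

During registration, this lookup replaces repeated recomputation of the attribute information.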
Step S12: a first deflection angle of the first image area is determined and a second deflection angle of the second image area is determined.
In the embodiment of the disclosure, at least part of the pixel points of the first image area are in the reference image, and at least part of the pixel points of the second image area are in the image to be registered. For example, some pixel points at the upper left of the first image region may lie outside the reference image while the others lie inside it; likewise, some pixel points at the upper left of the second image region may lie outside the image to be registered while the others lie inside it, which is not limited herein. In addition, to reduce the interference with image registration caused by pixel points lying outside the reference image or the image to be registered, the pixel points of the first image region may all lie in the reference image, and the pixel points of the second image region may all lie in the image to be registered, which can improve the robustness of image registration.
In one implementation scenario, the shapes of the first image region and the second image region may both be set to a preset shape. The preset shape may include, but is not limited to: rectangular, circular, etc., without limitation.
In another implementation scenario, the first image region may be sized to a first size and the second image region may be sized to a second size. The first size and the second size may be set according to practical applications, for example, in the case that the first image area and the second image area are set to be rectangular, the first size, the second size may be set to be 16 × 16, 32 × 32, and so on, which is not limited herein; alternatively, in the case where the first image region and the second image region are set to be circular, the first image region and the second image region may be specifically set to be circular with a radius of 16, 32, or the like, which is not limited herein.
In yet another implementation scenario, to simplify acquisition of the deflection angle, the center of the first image region may be set as the center of the reference image.
In yet another implementation scenario, the first deflection angle is the directed angle between the predetermined direction and the line connecting the centroid of the first image region with the center of the first image region, and the second deflection angle is the directed angle between the predetermined direction and the line connecting the centroid of the second image region with the center of the second image region. Specifically, the directed angle may be the angle by which the connection line deflects from the predetermined direction in the clockwise direction, or the angle by which it deflects in the counterclockwise direction, which is not limited herein. For example, the sign of the directed angle may be defined as "-" (i.e., a negative sign) when deflecting in the clockwise direction and "+" (i.e., a positive sign) when deflecting in the counterclockwise direction, which is not limited herein.
In a specific implementation scenario, please refer to fig. 3 in combination, and fig. 3 is a schematic diagram of an embodiment of a deflection angle obtaining manner. As shown in fig. 3, a solid rectangle represents a reference image (or an image to be registered), a dashed rectangle in the solid rectangle represents a first image region (or a second image region), P is a centroid of the first image region (or the second image region), a rectangular coordinate system is established with a center of the first image region as a coordinate origin O, a connection line between the centroid P of the first image region (or the second image region) and the center of the first image region (or the second image region) is OP, a predetermined direction may be an x-axis of the rectangular coordinate system, and the directed angle may be an angle θ from the predetermined direction to a counterclockwise direction of the connection line. Other cases may be analogized, and no one example is given here.
In another embodiment, please continue with FIG. 3; the centroid (cx, cy) can be expressed as:

cx = Σ x·I(x, y) / Σ I(x, y), cy = Σ y·I(x, y) / Σ I(x, y) ……(1)
In the above formula (1), (x, y) represents the offset of a pixel point in the first image region (or the second image region) relative to the center of that region, I(x, y) represents the pixel value of the pixel point, and Σ represents the summation symbol, where the summation ranges over the pixel points in the first image region (or the second image region). It should be noted that, in the case that the attribute information of a pixel point in the first image region (or the second image region) indicates that it does not belong to the target image, that is, in the case that its attribute value is the second value, the pixel point may be excluded from the summation; that is, the pixel point is not considered in the centroid calculation.
In yet another specific implementation scenario, the first deflection angle (or the second deflection angle) θ can be directly obtained by the following formula:
θ=arctan(∑yI(x,y),∑xI(x,y))……(2)
In the above formula (2), (x, y) represents the offset of a pixel point in the first image region (or the second image region) relative to the center of that region, I(x, y) represents the pixel value of the pixel point, and Σ represents the summation symbol, where the summation ranges over the pixel points in the first image region (or the second image region). It should be noted that, in the case that the attribute information of a pixel point in the first image region (or the second image region) indicates that it does not belong to the target image, that is, in the case that its attribute value is the second value, the pixel point may be excluded from the summation; that is, the pixel point is not considered in the deflection angle calculation.
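As a minimal illustration of formulas (1) and (2) (the function name, the mask convention, and taking the x axis as the predetermined direction are assumptions made for the sketch):

```python
import numpy as np

def deflection_angle(region, mask=None):
    """Deflection angle of an image region per formula (2).

    Offsets (x, y) are taken relative to the region center; pixels whose
    template value is 0 (i.e., not belonging to the target image) are
    excluded from the sums, matching the note under formulas (1)-(2).
    Returns the directed angle from the x axis, measured toward the
    positive y axis of the image coordinate frame.
    """
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = xs - (w - 1) / 2.0  # offset from the region center
    y = ys - (h - 1) / 2.0
    I = region.astype(float)
    if mask is not None:
        I = I * (mask != 0)  # drop pixels outside the target image
    # formula (2): theta = arctan2(sum y*I(x,y), sum x*I(x,y))
    return float(np.arctan2((y * I).sum(), (x * I).sum()))
```

A region whose mass lies to the right of the center yields an angle of 0; mass straight below the center yields π/2.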
Step S13: and obtaining candidate registration parameters based on the first deflection angle and the second deflection angle.
In an implementation scene, an estimated region of a reference image in an image to be registered can be obtained, an estimated scaling can be obtained based on the estimated region and the size of the reference image, and on the basis, a candidate registration parameter can be obtained based on the estimated scaling, a first deflection angle and a second deflection angle, so that the efficiency of image registration can be improved.
In a specific implementation scenario, a user may be prompted to mark an estimated region of a reference image in an image to be registered, so that the estimated region of the reference image in the image to be registered may be obtained when the user determines that the marking is completed. For example, the user may mark a predicted area of 48 × 48 at a certain position of the image to be registered, so that in the case of an image size of 32 × 32 of the reference image, a predicted scaling of 32/48 ≈ 0.67 may be obtained, and so on, which will not be exemplified herein.
In another specific implementation scenario, an image registration network may also be trained in advance, and specifically, the image registration network may include but is not limited to: regression HomographyNet, DIRNet, etc., without limitation. On the basis, the reference image and the image to be registered can be input into an image registration network to obtain an estimated registration parameter between the reference image and the image to be registered, and an estimated region of the reference image in the image to be registered is obtained by using the estimated registration parameter. For example, first position coordinates of four corner points of the reference image may be obtained, and the position coordinates of the four corner points are converted into the image to be registered by using the estimated registration parameters to obtain corresponding second position coordinates, so that the estimated region may be obtained by the second position coordinates of the four corner points in the image to be registered.
In yet another specific implementation scenario, a first feature point in the reference image may be extracted, a first feature representation of the first feature point may be determined based on the attribute information, and a second feature point and its second feature representation may be extracted from the image to be registered, so that an estimated registration parameter between the reference image and the image to be registered may be obtained based on the first feature point and the first feature representation, and the second feature point and the second feature representation. In addition, when the template image is generated based on the attribute information, the first feature representation of the first feature point may be determined directly based on the template image. Specifically, the first feature point and the second feature point may be extracted using feature extraction methods such as ORB (Oriented FAST and Rotated BRIEF), SIFT (Scale-Invariant Feature Transform), and the like; the specific extraction process may refer to the technical details of feature extraction methods such as ORB and SIFT, which are not described herein again. On this basis, the estimated region of the reference image in the image to be registered can be obtained by using the estimated registration parameter, for which reference may be made to the foregoing description; details are not repeated here.
It should be noted that, in the process of determining the first feature representation of the first feature point based on the attribute information, a preset number of groups of pixel point pairs may be selected from an image region containing the first feature point in the reference image, and the first pixel comparison value of each pixel point pair is obtained based on the attribute information of the pair, so that the first feature representation of the first feature point may be obtained from the first pixel comparison values of the preset number of groups of pixel point pairs. Specifically, in the case that the attribute information of a pixel point pair indicates that the pair does not belong to the target image, the first pixel comparison value of the pair may be set to a preset invalid character, and in the case that the attribute information indicates that the pair belongs to the target image, the first pixel comparison value may be obtained based on the size relationship between the pixel values of the pair. Referring to fig. 4, fig. 4 is a schematic diagram of obtaining a first feature representation. As shown in fig. 4, a circular area represents the image region containing the first feature point, and each square area in the circular area represents a pixel point in the image region, where the square area filled with grid lines represents the first feature point, and the pixel point pairs filled with the same shading and connected by dotted lines are the preset number of pixel point pairs selected in the image region. For convenience of description, fig. 4 schematically shows 4 pixel point pairs; the pixel point on the arrow side of a dotted line is denoted as A, and the pixel point on the dot side is denoted as B, so that the pixel point pair filled with right oblique lines may be denoted as P1(A, B), the pixel point pair filled with points as P2(A, B), the pixel point pair filled with horizontal lines as P3(A, B), and the pixel point pair filled with left oblique lines as P4(A, B). If either of pixel point A and pixel point B in a pair does not belong to the target image, the first pixel comparison value of the pair may be set to a preset invalid character (e.g., #); otherwise, the first pixel comparison value may be obtained based on the size relationship between the pixel value of pixel point A and the pixel value of pixel point B. For example, in the case that the pixel value of pixel point A is greater than the pixel value of pixel point B, the first pixel comparison value of the pair may be set to a first value (e.g., 1), and in the case that it is not greater, to a second value (e.g., 0). For example, for the pixel point pair P1(A, B), the pixel value of pixel point A is greater than that of pixel point B, so the first pixel comparison value of P1(A, B) is set to 1; for P2(A, B), the pixel value of A is not greater than that of B, so the first pixel comparison value of P2(A, B) is set to 0; for P3(A, B), the pixel value of A is greater than that of B, so the first pixel comparison value of P3(A, B) is set to 1; and for P4(A, B), the pixel value of A is greater than that of B, so the first pixel comparison value of P4(A, B) is set to 1. The first feature representation of the first feature point shown in fig. 4 can therefore be written as [1 0 1 1]. Other cases may be deduced by analogy, and examples are not enumerated here.
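The pixel-pair comparison above can be sketched as follows (the function name and the choice of '#' as the preset invalid character are illustrative, not prescribed by the method):

```python
def first_feature_representation(image, mask, pairs):
    """First feature representation from pixel point pairs (A, B).

    pairs: list of ((rowA, colA), (rowB, colB)) coordinates.
    Emits '#' (the preset invalid character) when either pixel of a pair
    does not belong to the target image (mask value 0); otherwise 1 when
    I(A) > I(B) (the first value) and 0 otherwise (the second value).
    """
    rep = []
    for (ra, ca), (rb, cb) in pairs:
        if mask[ra][ca] == 0 or mask[rb][cb] == 0:
            rep.append('#')
        elif image[ra][ca] > image[rb][cb]:
            rep.append(1)
        else:
            rep.append(0)
    return rep
```

During matching, positions holding the invalid character are simply skipped when computing feature similarity, as described below.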
In addition, in the process of obtaining the estimated registration parameter between the reference image and the image to be registered based on the first feature point and the first feature representation, and the second feature point and the second feature representation, the feature similarity between the first feature point and the second feature point may specifically be obtained based on the first pixel comparison values in the first feature representation other than the preset invalid character and the corresponding second pixel comparison values in the second feature representation. On this basis, a first feature point and a second feature point whose feature similarity satisfies a preset condition may be used as a feature point pair, and the estimated registration parameter between the target image and the image to be registered is obtained based on the feature point pairs. Specifically, after a plurality of groups of feature point pairs are obtained, they may be processed by RANSAC (RANdom SAmple Consensus) to obtain the estimated registration parameter. The specific processing procedure may refer to the technical details of RANSAC and is not described herein again.
In yet another specific implementation scenario, in response to the estimated region not being available, a candidate scaling may be taken as the estimated scaling. Specifically, the candidate scalings may be set according to the actual application; for example, they may include but are not limited to: 0.2, 0.4, 0.6, 0.8, etc., without limitation. In this case, at least one of the candidate scalings may be selected as the estimated scaling, and which candidate scaling is specifically selected is not limited herein.
In yet another specific implementation scenario, after the estimated scaling is obtained, the angle difference between the first deflection angle and the second deflection angle may be obtained, so that the candidate registration parameter may be obtained based on the angle difference and the estimated scaling. For convenience of description, the estimated scaling is denoted as s, the angle difference as α, the center of the second image region as (u, v), and the center of the reference image as (Ox, Oy). The candidate registration parameter H can then be expressed as:

H = [ s·cos α   -s·sin α   u - s·(Ox·cos α - Oy·sin α) ]
    [ s·sin α    s·cos α   v - s·(Ox·sin α + Oy·cos α) ]
    [ 0          0         1                           ] ……(3)
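Under the assumption that the candidate registration parameter is a similarity transform built from the estimated scaling s and the angle difference α, mapping the reference-image center to the second-region center, it can be constructed as follows (a sketch; names are illustrative):

```python
import numpy as np

def candidate_parameter(s, alpha, center_ref, center_region):
    """Candidate registration parameter H as a 3x3 similarity transform.

    Scales by s, rotates by the angle difference alpha, and translates so
    that the reference-image center (Ox, Oy) maps onto the center (u, v)
    of the second image region.
    """
    Ox, Oy = center_ref
    u, v = center_region
    c, si = np.cos(alpha), np.sin(alpha)
    A = s * np.array([[c, -si], [si, c]])            # rotation-and-scale block
    t = np.array([u, v]) - A @ np.array([Ox, Oy])    # center-aligning shift
    H = np.eye(3)
    H[:2, :2] = A
    H[:2, 2] = t
    return H
```

By construction, applying H to the homogeneous reference-image center (Ox, Oy, 1) yields exactly (u, v, 1).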
in another implementation scenario, a preset number of candidate positions may be selected from the image to be registered, and a second image region may be determined based on each candidate position, so that the preset number of second image regions may be obtained. On this basis, the second deflection angle of each second image region can be determined in the manner described above, and a preset number of candidate registration parameters are obtained based on the first deflection angle and each second deflection angle.
In a specific implementation scenario, in the case that the second image regions are circular regions with a radius of R, a preset number of second image regions may be determined with R as a radius and a preset number of candidate positions as centers, respectively; alternatively, in the case that the second image regions are square regions with a side length D, the preset number of second image regions may be determined respectively with the preset number of candidate positions as centers and with D as the side length of the square. Other cases may be analogized, and no one example is given here.
In another specific implementation scenario, in the case that there are two second deflection angles, a candidate registration parameter 01 may be obtained based on the first deflection angle and the second deflection angle 01; alternatively, based on the first deflection angle and the second deflection angle 02, candidate registration parameters 02 may be derived. Other cases may be analogized, and no one example is given here.
In yet another specific implementation scenario, as mentioned above, in order to reduce the interference of pixel points that are not in the reference image or not in the image to be registered with the image registration, the pixel points in the second image region determined by each candidate position may all be in the image to be registered. That is, the second image region does not extend beyond the image to be registered, which can help improve the robustness of image registration.
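A minimal sketch of choosing candidate positions whose radius-R circular regions lie entirely inside the image to be registered (the stride parameter and function name are assumptions of this illustration):

```python
def candidate_centers(img_h, img_w, R, stride):
    """Candidate positions whose radius-R circular region stays fully
    inside an img_h x img_w image, so the second image region never
    reaches outside the image to be registered."""
    return [(y, x)
            for y in range(R, img_h - R, stride)
            for x in range(R, img_w - R, stride)]
```

The square-region case is analogous, with D/2 in place of R.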
Step S14: and obtaining a final registration parameter between the target image and the image to be registered based on the candidate registration parameter.
In one implementation scenario, in the case that there is only one candidate registration parameter, the candidate registration parameter may be directly used as a final registration parameter between the target image and the image to be registered.
In another implementation scenario, in order to improve the accuracy of the final registration parameter when there is only one candidate registration parameter, a preset optimization manner may also be used to optimize the candidate registration parameter, and the optimized candidate registration parameter is used as the final registration parameter. Specifically, the preset optimization manner may include, but is not limited to: Gauss-Newton, Levenberg-Marquardt, etc., without limitation.
In yet another implementation scenario, as mentioned above, there may be a preset number of candidate registration parameters, in which case, for each candidate registration parameter, the candidate registration parameter may be utilized to determine a similarity between the image to be registered and the target image to obtain a first scoring result of the candidate registration parameter, and obtain a final registration parameter based on the first scoring result of the preset number of candidate registration parameters. Therefore, the accuracy of image registration can be further improved.
In a specific implementation scenario, the first score result represents the accuracy of the candidate registration parameter, and the higher the similarity between the image to be registered and the target image is, the higher the accuracy of the candidate registration parameter is, whereas the lower the similarity between the image to be registered and the target image is, the lower the accuracy of the candidate registration parameter is.
In another specific implementation scenario, a candidate registration parameter with the highest accuracy may be specifically selected as a final registration parameter according to the first scoring result.
In an implementation scenario, after the final registration parameter is obtained, the target image may be processed by using the final registration parameter, so as to obtain a registration position of the target image in the image to be registered. For example, pixel points located on the edge line of the target image may be obtained, and the positions of the pixel points on the edge line in the image to be registered are determined by using the final registration parameters, so as to obtain the registration positions of the target image in the image to be registered.
According to the scheme, the image to be registered and the reference image containing the target image are obtained, the first deflection angle of the first image area is determined, the second deflection angle of the second image area is determined, at least part of pixel points of the first image area are in the reference image, at least part of pixel points of the second image area are in the image to be registered, on the basis, the candidate registration parameters are obtained based on the first deflection angle and the second deflection angle, the final registration parameters between the target image and the image to be registered are obtained based on the candidate registration parameters, registration is not required to be carried out through extraction of feature points and feature representation, therefore, registration cannot be affected by the proportion of the target image in the image to be registered, and the accuracy of image registration can be improved.
Referring to fig. 5, fig. 5 is a flowchart illustrating an embodiment of step S14 in fig. 1. In the embodiment of the present disclosure, a preset number of candidate registration parameters exist, and the process of acquiring the preset number of candidate registration parameters may specifically refer to the relevant description in the foregoing embodiment, which is not described herein again. Specifically, the embodiments of the present disclosure may include the following steps:
step S51: for each candidate registration parameter: and determining the similarity between the image to be registered and the target image by using the candidate registration parameters to obtain a first grading result of the candidate registration parameters.
Specifically, for a pixel point (x, y) in the reference image and a pixel point (x′, y′) in the image to be registered, the following conversion relationship exists:

(x′, y′, 1)ᵀ = H · (x, y, 1)ᵀ ……(4)
in the above equation (4), H represents a candidate registration parameter. That is to say, the candidate registration parameters may be used to perform coordinate transformation on the pixel point (x, y) in the reference image, so as to obtain a pixel point corresponding to the pixel point in the image to be registered, and further, the two pixel points from the reference image and the image to be registered may be used as a set of matching point pairs.
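The conversion relationship described by equation (4) can be sketched as an illustrative helper (names assumed):

```python
import numpy as np

def match_point(H, x, y):
    """Map a reference-image pixel (x, y) into the image to be registered
    using candidate registration parameter H (3x3); the input point and
    the returned point form one matching point pair."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

For a pure similarity transform the homogeneous coordinate stays 1, but dividing by p[2] keeps the helper valid for general projective H as well.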
In an implementation scenario, on the basis of obtaining the matching point pair, the similarity between the image to be registered and the target image may be obtained by using the pixel values of the matching point pair.
In a specific implementation scenario, if a pixel point from the reference image in the matching point pair does not belong to the target image, the accuracy of the similarity between the image to be registered and the target image is reduced, so that before the similarity is calculated, whether the matching point pair is rejected or not can be determined based on the attribute information of the pixel point from the reference image in the matching point pair, the matching point pair can be rejected under the condition that the attribute information of the pixel point from the reference image in the matching point pair indicates that the pixel point does not belong to the target image, and conversely, the matching point pair can be retained under the condition that the attribute information of the pixel point from the reference image in the matching point pair indicates that the pixel point belongs to the target image.
In another specific implementation scenario, the SSD (Sum of Squared Differences) may be used to process the pixel values of the matching point pairs, so as to obtain the similarity between the image to be registered and the target image. For convenience of description, the pixel value of the pixel point (x, y) from the target image T in a matching point pair may be denoted as T(x, y), and the pixel value of the pixel point (x′, y′) from the image F to be registered may be denoted as F(x′, y′), so that the similarity SSD(T, F) between the image to be registered and the target image may be expressed as:
SSD(T,F)=∑x,y(T(x,y)-F(x′,y′))2……(5)
In the above equation (5), the summation over (x, y) denotes summing the squared errors of the pixel values over the matching point pairs consisting of a pixel point (x, y) in the target image and the corresponding pixel point (x′, y′) determined in the image to be registered by the candidate registration parameter H. Therefore, the smaller SSD(T, F) is, the higher the similarity between the target image and the image to be registered; conversely, the larger SSD(T, F) is, the lower the similarity between the target image and the image to be registered.
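Formula (5), including the rejection of matching pairs whose reference pixel lies outside the target image, can be sketched as:

```python
import numpy as np

def ssd(T, F_sampled, mask=None):
    """Sum of squared differences per formula (5); smaller means more
    similar. F_sampled holds, for each target pixel (x, y), the value
    F(x', y') sampled at the matching point. Pairs whose reference pixel
    does not belong to the target image (mask value 0) are rejected."""
    d = (np.asarray(T, float) - np.asarray(F_sampled, float)) ** 2
    if mask is not None:
        d = d[np.asarray(mask) != 0]
    return float(d.sum())
```

Identical inputs score 0; the score grows as the images diverge, so lower is better.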
In another specific implementation scenario, the pixel values of the matching point pairs may also be processed by NCC (Normalized Cross-Correlation), so as to obtain the similarity between the image to be registered and the target image. For convenience of description, the pixel value of the pixel point (x, y) from the target image T in a matching point pair may be denoted as T(x, y), and the pixel value of the pixel point (x′, y′) from the image F to be registered may be denoted as F(x′, y′), so that the similarity NCC(T, F) between the image to be registered and the target image may be expressed as:

NCC(T, F) = Σx,y (T(x, y) − T̄)·(F(x′, y′) − F̄) / √( Σx,y (T(x, y) − T̄)² · Σx,y (F(x′, y′) − F̄)² ) ……(6)
In the above formula (6), Σx,y denotes performing normalized cross-correlation over the pixel values of the matching point pairs consisting of a pixel point (x, y) in the target image and the corresponding pixel point (x′, y′) determined in the image to be registered by the candidate registration parameter H. In addition, T̄ represents the average of the pixel values of the pixel points (x, y) in the target image, and F̄ represents the average of the pixel values of the pixel points (x′, y′) in the image to be registered. It should be noted that the range of NCC(T, F) is -1 to 1, and the closer NCC(T, F) is to 1, the higher the similarity between the target image and the image to be registered.
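Formula (6) can be sketched as (illustrative names; F_sampled again holds the values sampled at the matched points):

```python
import numpy as np

def ncc(T, F_sampled):
    """Normalized cross-correlation per formula (6); the result lies in
    [-1, 1], and values closer to 1 mean higher similarity."""
    t = np.asarray(T, float)
    f = np.asarray(F_sampled, float)
    t = t - t.mean()   # subtract the target-image mean
    f = f - f.mean()   # subtract the sampled-image mean
    return float((t * f).sum() / np.sqrt((t * t).sum() * (f * f).sum()))
```

Unlike SSD, NCC is invariant to affine changes of image brightness, which is one reason to prefer it as a scoring function.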
Step S52: and obtaining a final registration parameter based on the first grading result of the preset number of candidate registration parameters.
In one implementation scenario, as described above, the first scoring result represents the accuracy of the candidate registration parameter, and on this basis, the candidate registration parameter with the highest accuracy may be directly selected as the final registration parameter.
In another implementation scenario, as described above, the first scoring result represents accuracy of the candidate registration parameter, and on this basis, the candidate registration parameter of which the first scoring result satisfies the preset condition may be selected as the coarse registration parameter, and the coarse registration parameter is optimized in a preset optimization manner, so that the optimized coarse registration parameter is used to determine similarity between the image to be registered and the target image, so as to obtain a second scoring result of the optimized coarse registration parameter, and the optimized coarse registration parameter is selected as the final registration parameter based on the second scoring result of the optimized coarse registration parameter. Therefore, the final registration parameters can be obtained by screening the candidate registration parameters in two stages, and the accuracy of image registration can be further improved.
In a specific implementation scenario, the preset condition may include that the accuracy of the candidate registration parameter is higher than a preset threshold. The preset threshold may be specifically set according to an actual application situation, for example, in a case where the requirement on the accuracy of the final registration parameter is high, the preset threshold may be set slightly larger, and in a case where the requirement on the accuracy of the final registration parameter is relatively loose, the preset threshold may be set slightly smaller, and a specific numerical value of the preset threshold is not limited herein.
In another specific implementation scenario, the candidate registration parameters may also be ranked in order of high accuracy to low accuracy, and on this basis, the preset condition may also include a preset number of candidate registration parameters located at the top. For example, the preset condition may include candidate registration parameters located in the top 5 bits; alternatively, the preset condition may also include candidate registration parameters located in the first 4 bits, which is not limited herein.
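Selecting the coarse registration parameters under the top-k form of the preset condition can be sketched as follows (assuming, for this illustration, that a higher first scoring result means higher accuracy):

```python
def select_coarse(candidates, scores, k=5):
    """Keep the top-k candidate registration parameters ranked by their
    first scoring result (higher score taken as higher accuracy here)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [candidates[i] for i in order[:k]]
```

With an SSD-based score the ranking direction would simply be reversed, since smaller SSD means higher similarity.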
In another specific implementation scenario, at least one group of matching point pairs may be determined by using the optimized rough registration parameter, and two pixel points in the matching point pairs are respectively from the target image in the image to be registered and the reference image, and based on pixel values of the at least one group of matching point pairs, a similarity between the image to be registered and the target image is obtained, so as to obtain a second scoring result of the optimized rough registration parameter. For details, reference may be made to the related description of the first scoring result, which is not repeated herein.
In yet another specific implementation scenario, the second scoring result is obtained in a different manner than the first scoring result. For example, in the case where the first scoring result is obtained using SSD, the second scoring result may be obtained using NCC; alternatively, in the case where the first scoring result is obtained using NCC, the second scoring result may be obtained using SSD. The specific calculation process of SSD and NCC can refer to the related description, and is not described herein.
In yet another specific implementation scenario, the second scoring result represents the accuracy of the optimized rough registration parameter, and the higher the similarity between the image to be registered and the target image is, the higher the accuracy of the optimized rough registration parameter is, whereas the lower the similarity between the image to be registered and the target image is, the lower the accuracy of the optimized rough registration parameter is. On the basis, the optimized rough registration parameter with the highest accuracy can be selected as the final registration parameter.
In yet another specific implementation scenario, since the similarity measures described above, such as NCC and SSD, are computed in a continuous manner, the coarse registration parameters may be optimized by Gauss-Newton, Levenberg-Marquardt, and the like.
In another specific implementation scenario, in the case of calculating the similarity using NCC, NCC(T, F) may be used as the objective function and H as the optimization object for iterative optimization; or, in the case of calculating the similarity using SSD, SSD(T, F) may be used as the objective function and H as the optimization object for iterative optimization; other cases may be deduced by analogy and are not illustrated here. For convenience of description, the target image may be denoted as T, the image to be registered as F, and the coarse registration parameter as H; the objective function may then be expressed as:

max over H of f(T, F(H⁻¹)) ……(7)
In the above formula (7), F(H⁻¹) represents the result of transforming the image to be registered with the coarse registration parameter H, and the function f is used to calculate the similarity between T and F(H⁻¹); it may specifically be an SSD function or an NCC function, which is not limited herein. By maximizing f, the optimized coarse registration parameter H can be obtained.
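As a hedged illustration of the refinement in formula (7), a toy local grid search over the scale and angle stands in for the Gauss-Newton or Levenberg-Marquardt optimization the text describes; all names, step sizes, and the (s, α) parameterization are assumptions of the sketch:

```python
import numpy as np

def refine(f, s0, a0, ds=0.05, da=0.05, steps=21):
    """Locally maximize a similarity objective f(s, alpha) around the
    coarse estimate (s0, a0). A production system would instead run
    Gauss-Newton or Levenberg-Marquardt on the continuous objective."""
    best_s, best_a, best_v = s0, a0, f(s0, a0)
    for s in np.linspace(s0 - ds, s0 + ds, steps):
        for a in np.linspace(a0 - da, a0 + da, steps):
            v = f(s, a)
            if v > best_v:
                best_s, best_a, best_v = s, a, v
    return best_s, best_a, best_v
```

In practice f(s, α) would build the candidate H from (s, α), warp F, and return NCC (or the negated SSD) against T.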
Different from the foregoing embodiment, there are a preset number of second image regions and candidate registration parameters, and each candidate registration parameter is obtained based on the first deflection angle and the second deflection angle corresponding to one of the second image regions. On this basis, for each candidate registration parameter, the similarity between the image to be registered and the target image is determined using that candidate registration parameter to obtain its first scoring result, and the final registration parameter is obtained based on the first scoring results of the preset number of candidate registration parameters. The final registration parameter can thus be selected from the preset number of candidate registration parameters through similarity-based scoring, which can further improve the accuracy of image registration.
Referring to fig. 6, fig. 6 is a schematic frame diagram of an embodiment of an image registration apparatus 60 according to the present application. The image registration apparatus 60 includes an image acquisition module 61, an angle determination module 62, a candidate parameter acquisition module 63 and a final parameter acquisition module 64. The image acquisition module 61 is configured to acquire an image to be registered and a reference image containing a target image; the angle determination module 62 is configured to determine a first deflection angle of a first image region and a second deflection angle of a second image region, where at least part of the pixel points of the first image region are in the reference image and at least part of the pixel points of the second image region are in the image to be registered; the candidate parameter acquisition module 63 is configured to obtain candidate registration parameters based on the first deflection angle and the second deflection angle; and the final parameter acquisition module 64 is configured to obtain a final registration parameter between the target image and the image to be registered based on the candidate registration parameters.
According to the above scheme, an image to be registered and a reference image containing a target image are acquired, and a first deflection angle of a first image region and a second deflection angle of a second image region are determined, where at least part of the pixel points of the first image region are in the reference image and at least part of the pixel points of the second image region are in the image to be registered. On this basis, candidate registration parameters are obtained from the first deflection angle and the second deflection angle, and the final registration parameter between the target image and the image to be registered is obtained from the candidate registration parameters. Since registration does not rely on extracting and describing feature points, it is not affected by the proportion of the image to be registered that the target image occupies, and the accuracy of image registration can be improved.
In some disclosed embodiments, there are a preset number of second image regions and candidate registration parameters, and each candidate registration parameter is obtained based on the first deflection angle and the second deflection angle corresponding to one of the second image regions. The final parameter acquisition module 64 includes a first scoring result acquisition sub-module configured to, for each candidate registration parameter, determine the similarity between the image to be registered and the target image using that candidate registration parameter to obtain a first scoring result of the candidate registration parameter; and the final parameter acquisition module 64 includes a final registration parameter acquisition sub-module configured to obtain the final registration parameter based on the first scoring results of the preset number of candidate registration parameters.
Different from the foregoing embodiment, there are a preset number of second image regions and candidate registration parameters, and each candidate registration parameter is obtained based on the first deflection angle and the second deflection angle corresponding to one of the second image regions. On this basis, for each candidate registration parameter, the similarity between the image to be registered and the target image is determined using that candidate registration parameter to obtain its first scoring result, and the final registration parameter is obtained based on the first scoring results of the preset number of candidate registration parameters. The final registration parameter can thus be selected from the preset number of candidate registration parameters through similarity-based scoring, which can further improve the accuracy of image registration.
In some disclosed embodiments, the final registration parameter acquisition sub-module includes a coarse registration parameter selection unit configured to select, as rough registration parameters, the candidate registration parameters whose first scoring results meet preset conditions; a rough registration parameter optimization unit configured to optimize the rough registration parameters using a preset optimization mode; a second scoring result acquisition unit configured to determine the similarity between the image to be registered and the target image using the optimized rough registration parameters, so as to obtain a second scoring result of each optimized rough registration parameter; and a final registration parameter selection unit configured to select an optimized rough registration parameter as the final registration parameter based on the second scoring results of the optimized rough registration parameters.
Different from the foregoing embodiment, the candidate registration parameters whose first scoring results meet the preset conditions are selected as rough registration parameters and optimized using the preset optimization mode; the similarity between the image to be registered and the target image is then determined using the optimized rough registration parameters to obtain their second scoring results, and an optimized rough registration parameter is selected as the final registration parameter based on these second scoring results. The final registration parameter is thus obtained by screening the candidate registration parameters in two stages, which can further improve the accuracy of image registration.
In some disclosed embodiments, the first scoring result acquisition sub-module includes a matching point pair determination unit configured to determine at least one group of matching point pairs using the candidate registration parameter, where the two pixel points in each matching point pair come from the image to be registered and from the target image in the reference image, respectively; and a similarity acquisition unit configured to obtain the similarity between the image to be registered and the target image based on the pixel values of the at least one group of matching point pairs. Alternatively, the second scoring result acquisition unit is specifically configured to determine at least one group of matching point pairs using the optimized rough registration parameter, where the two pixel points in each matching point pair come from the image to be registered and from the target image in the reference image, respectively, and to obtain the similarity between the image to be registered and the target image based on the pixel values of the at least one group of matching point pairs.
Different from the foregoing embodiment, at least one group of matching point pairs is determined using the candidate registration parameters or the optimized rough registration parameters, the two pixel points in each matching point pair coming from the image to be registered and from the target image in the reference image, respectively, and the similarity between the image to be registered and the target image is obtained based on the pixel values of the at least one group of matching point pairs. Interference from pixel points not belonging to the target image can thereby be excluded from the similarity calculation, which can improve the accuracy of the similarity.
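The restriction of the similarity to target-image pixels can be sketched as follows, assuming (purely for illustration) a 2x3 affine parameterization of the registration parameter, a boolean target mask, and nearest-neighbour sampling, none of which are mandated by the text:

```python
import numpy as np

def score_candidate(ref, mask, img, H):
    """Score a candidate registration parameter H (assumed 2x3 affine).

    ref  : reference image containing the target image
    mask : True where a reference pixel belongs to the target image
    img  : image to be registered
    Target-image pixels are mapped into img by H; each valid (reference
    pixel, mapped pixel) pair contributes to an NCC score, so pixels not
    belonging to the target image never influence the similarity.
    """
    ys, xs = np.nonzero(mask)
    pts = H @ np.stack([xs, ys, np.ones_like(xs)]).astype(np.float64)
    u = np.rint(pts[0]).astype(int)  # nearest-neighbour sampling for brevity
    v = np.rint(pts[1]).astype(int)
    ok = (u >= 0) & (u < img.shape[1]) & (v >= 0) & (v < img.shape[0])
    a = ref[ys[ok], xs[ok]].astype(np.float64)
    b = img[v[ok], u[ok]].astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

Scoring every candidate this way and keeping the best-scoring ones corresponds to the first-scoring stage described above.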
In some disclosed embodiments, the image registration apparatus 60 includes a scaling acquisition module configured to obtain an estimated region of the reference image in the image to be registered and to obtain an estimated scaling based on the sizes of the estimated region and of the reference image, and the candidate parameter acquisition module 63 is specifically configured to obtain the candidate registration parameters based on the estimated scaling, the first deflection angle and the second deflection angle.
Different from the foregoing embodiment, an estimated region of the reference image in the image to be registered is obtained, and an estimated scaling is obtained based on the sizes of the estimated region and of the reference image; the candidate registration parameters are then obtained based on the estimated scaling, the first deflection angle and the second deflection angle, which can improve the efficiency of image registration.
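The text does not fix the exact formula for the estimated scaling. One plausible reading, averaging the per-axis ratios of the estimated region's size to the reference image's size, can be sketched as:

```python
def estimated_scaling(region_size, reference_size):
    # region_size   : (height, width) of the estimated region in the image
    #                 to be registered
    # reference_size: (height, width) of the reference image
    # Averaging the per-axis ratios is an assumption of this sketch; the
    # text only states that the scaling is derived from the two sizes.
    rh, rw = region_size
    th, tw = reference_size
    return 0.5 * (rh / th + rw / tw)
```

For example, an estimated region half the size of the reference image in both axes yields an estimated scaling of 0.5.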
In some disclosed embodiments, the candidate parameter obtaining module 63 comprises an angle difference obtaining sub-module for obtaining an angle difference between the first deflection angle and the second deflection angle, and the candidate parameter obtaining module 63 comprises a candidate parameter obtaining sub-module for obtaining the candidate registration parameter based on the angle difference and the estimated scaling.
Different from the foregoing embodiment, the candidate registration parameters are obtained by obtaining the angle difference between the first deflection angle and the second deflection angle and based on the angle difference and the estimated scaling, which is beneficial to reducing the complexity of image registration.
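Building a candidate registration parameter from the angle difference and the estimated scaling can be sketched as a similarity transform; rotation by the (hypothetical) angle difference combined with the scaling, with translation omitted for brevity, is an assumption of this sketch:

```python
import numpy as np

def candidate_parameter(first_angle_deg, second_angle_deg, scale):
    # Rotation by the difference between the second and first deflection
    # angles, combined with the estimated scaling; translation is omitted.
    d = np.deg2rad(second_angle_deg - first_angle_deg)
    c, s = np.cos(d), np.sin(d)
    return scale * np.array([[c, -s], [s, c]])
```

Because only one angle difference and one scale are needed, the parameter has far fewer degrees of freedom than a full homography, which is consistent with the stated goal of reducing registration complexity.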
In some disclosed embodiments, the scaling acquisition module is further configured to take a candidate scaling as the estimated scaling in response to the estimated region not being obtained.
Different from the foregoing embodiment, when the estimated region cannot be obtained, a candidate scaling is used as the estimated scaling, which can improve the robustness of image registration.
In some disclosed embodiments, the image acquisition module 61 is specifically configured to take the target image as a reference image; or processing the target image into a reference image with the same shape as the image to be registered, and recording attribute information of pixel points in the reference image, wherein the attribute information of the pixel points is used for determining whether the pixel points belong to the target image.
Different from the foregoing embodiment, the target image is used directly as the reference image, or the target image is processed into a reference image having the same shape as the image to be registered while the attribute information of the pixel points in the reference image is recorded, the attribute information indicating whether each pixel point belongs to the target image. Image registration can thus be completed even when the target image has an arbitrary shape, which improves the robustness of image registration.
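Processing the target image into a reference image of the required shape while recording per-pixel attribute information can be sketched as follows; centering the target on a padded canvas and using a boolean mask as the attribute information are assumptions of this sketch:

```python
import numpy as np

def make_reference(target, out_shape, fill=0):
    # Pad the target image into a canvas of out_shape (centered) and record,
    # per pixel, whether it belongs to the target image — the "attribute
    # information" described in the text.
    ref = np.full(out_shape, fill, dtype=target.dtype)
    mask = np.zeros(out_shape, dtype=bool)
    y0 = (out_shape[0] - target.shape[0]) // 2
    x0 = (out_shape[1] - target.shape[1]) // 2
    ref[y0:y0 + target.shape[0], x0:x0 + target.shape[1]] = target
    mask[y0:y0 + target.shape[0], x0:x0 + target.shape[1]] = True
    return ref, mask
```

The mask is what later allows the similarity calculation to ignore padded pixels that do not belong to the target image.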
In some disclosed embodiments, the pixel points of the first image region are all in the reference image; and/or the pixel points of the second image area are all in the image to be registered.
Different from the foregoing embodiment, the accuracy of image registration can be improved by setting the pixel points of the first image region to be all in the reference image, and/or setting the pixel points of the second image region to be all in the image to be registered.
In some disclosed embodiments, the center of the first image region is the center of the reference image; and/or the first deflection angle is a directed included angle between a connecting line of the centroid of the first image area and the center of the first image area and a preset direction; the second deflection angle is a directed included angle between a connecting line of the centroid of the second image area and the center of the second image area and the preset direction.
Different from the foregoing embodiment, the center of the first image region is set as the center of the reference image, the first deflection angle is the directed angle between the preset direction and the line connecting the centroid of the first image region with its center, and the second deflection angle is the directed angle between the preset direction and the line connecting the centroid of the second image region with its center, so that the accuracy of the first deflection angle and the second deflection angle can be improved.
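One plausible computation of such a deflection angle is sketched below; the intensity-weighted centroid and image coordinates with the y-axis pointing down are both assumptions of this sketch, not requirements of the text:

```python
import numpy as np

def deflection_angle(region, preset_direction_deg=0.0):
    # Directed angle (in degrees) between the preset direction and the line
    # from the region's center to its intensity-weighted centroid.
    # Assumes region.sum() > 0 and image coordinates with y pointing down.
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    total = region.sum()
    cy = (ys * region).sum() / total
    cx = (xs * region).sum() / total
    ctr_y, ctr_x = (h - 1) / 2.0, (w - 1) / 2.0
    ang = np.degrees(np.arctan2(cy - ctr_y, cx - ctr_x))
    return float(ang - preset_direction_deg)
```

Because the centroid-to-center direction rotates with the image content, the difference between the two regions' deflection angles estimates the relative rotation between the reference image and the image to be registered.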
Referring to fig. 7, fig. 7 is a schematic frame diagram of an embodiment of an electronic device 70 according to the present application. The electronic device 70 includes a memory 71 and a processor 72 coupled to each other, and the processor 72 is configured to execute program instructions stored in the memory 71 to implement the steps of any of the image registration method embodiments described above. In one specific implementation scenario, the electronic device 70 may include, but is not limited to, a microcomputer or a server; the electronic device 70 may also be a mobile device such as a notebook computer or a tablet computer, which is not limited herein.
Specifically, the processor 72 is configured to control itself and the memory 71 to implement the steps of any of the image registration method embodiments described above. The processor 72 may also be referred to as a CPU (Central Processing Unit), and may be an integrated circuit chip having signal processing capability. The processor 72 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 72 may be implemented jointly by a plurality of integrated circuit chips.
The above scheme can help improve the accuracy of image registration.
Referring to fig. 8, fig. 8 is a schematic frame diagram of an embodiment of a computer-readable storage medium 80 according to the present application. The computer-readable storage medium 80 stores program instructions 801 executable by a processor, and the program instructions 801 are used to implement the steps of any of the image registration method embodiments described above.
The above scheme can help improve the accuracy of image registration.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a unit or a component may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.