Three-dimensional image cutting method and electronic equipment
1. A method of cutting a three-dimensional image, the method comprising:
receiving cutting parameters sent by a terminal device, wherein the cutting parameters at least comprise a target organ to be cut and two-dimensional cutting area information;
determining a two-dimensional cutting area based on the two-dimensional cutting area information;
determining a first data set belonging to the two-dimensional cutting area in the three-dimensional surface data of the target organ;
dividing the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region with a boundary represented by the first data set as a division reference; wherein the three-dimensional volume data included within the outer surface of the first data set is the first partial organ region;
screening out the three-dimensional volume data in the smaller organ region from the first partial organ region and the second partial organ region, and comparing the three-dimensional volume data with the two-dimensional cutting area to obtain a second data set contained in the two-dimensional cutting area in the three-dimensional volume data as a target cutting area;
and cutting the target cutting area from the three-dimensional volume data.
2. The method of claim 1, wherein the cutting parameters further comprise a projective transformation matrix for projecting the three-dimensional data of the target organ into a two-dimensional space for display;
the determining a first data set belonging to the two-dimensional cutting region in the three-dimensional surface data of the target organ includes:
based on the projective transformation matrix, projecting the three-dimensional surface data to the two-dimensional space to obtain a two-dimensional image;
determining a two-dimensional image portion of the two-dimensional image located in the two-dimensional cutting area;
and determining the three-dimensional surface data corresponding to the two-dimensional image part to obtain the first data set.
3. The method of claim 1, wherein said dividing the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region with a boundary represented by the first data set as a division reference comprises:
determining a third data set which does not belong to the two-dimensional cutting area in the three-dimensional surface data of the target organ;
respectively determining a first three-dimensional space range corresponding to the first data set and a second three-dimensional space range corresponding to the third data set;
defining data points within said three-dimensional volume data that are contained within said first three-dimensional spatial range as said first partial organ region;
and taking data points contained in the second three-dimensional space range in the three-dimensional volume data as the second partial organ area.
4. The method of claim 3, wherein the separately determining a first three-dimensional spatial extent corresponding to the first data set and a second three-dimensional spatial extent corresponding to the third data set comprises:
determining the length of the first data set in each coordinate axis direction, and determining a first cube whose side lengths are the lengths in the respective coordinate axis directions as the first three-dimensional space range;
and determining the length of the third data set in each coordinate axis direction, and determining a second cube whose side lengths are the lengths in the respective coordinate axis directions as the second three-dimensional space range.
5. The method of claim 4, wherein the screening out of the three-dimensional volume data in the smaller organ region from the first partial organ region and the second partial organ region comprises:
determining the number of pixel points of three-dimensional volume data contained within the first cube and the number of pixel points of three-dimensional volume data contained within the second cube based on pixel spacing;
comparing the number of pixel points of the three-dimensional volume data accommodated within the first cube with the number of pixel points of the three-dimensional volume data accommodated within the second cube;
if the number of pixel points of the three-dimensional volume data accommodated in the first cube is smaller than the number of pixel points of the three-dimensional volume data accommodated in the second cube, taking the first partial organ region as the smaller organ region;
and if the number of pixel points of the three-dimensional volume data accommodated in the first cube is not less than the number of pixel points of the three-dimensional volume data accommodated in the second cube, taking the second partial organ region as the smaller organ region.
6. The method according to claim 1, wherein the two-dimensional cutting region information includes a two-dimensional image of a specified size, and closed curve position information preset in the two-dimensional image; the determining a two-dimensional cutting region based on the two-dimensional cutting region information includes:
constructing a two-dimensional image of the specified size;
and determining the two-dimensional cutting area in the two-dimensional image based on the position information of a preset closed curve in the two-dimensional image.
7. The method according to claim 1, wherein for the case that the first partial organ region is a smaller organ region, the comparing the three-dimensional volume data in the first partial organ region with the two-dimensional cutting region to obtain a second data set contained in the two-dimensional cutting region in the three-dimensional volume data as a target cutting region comprises:
projecting each data point of the three-dimensional volume data within the first partial organ region into a two-dimensional space based on a projective transformation matrix, wherein the projective transformation matrix is used for projecting the three-dimensional data of the target organ into the two-dimensional space for display;
and taking the second data set, consisting of the three-dimensional volume data points projected into the two-dimensional space and located within the two-dimensional cutting region, as the target cutting region.
8. The method according to claim 7, wherein for the case that the second partial organ region is a smaller organ region, the comparing the three-dimensional volume data in the second partial organ region with the two-dimensional cutting region to obtain a second data set contained in the two-dimensional cutting region in the three-dimensional volume data as a target cutting region comprises:
projecting each data point of the three-dimensional volume data within the second partial organ region into the two-dimensional space based on the projective transformation matrix;
determining the three-dimensional volume data points projected into the two-dimensional space and located outside the two-dimensional cutting region as a fourth data set;
and taking the second data set, consisting of the three-dimensional volume data points of the three-dimensional volume data other than the fourth data set, as the target cutting region.
9. A method of cutting a three-dimensional image, the method comprising:
acquiring three-dimensional surface data of a target organ from a server;
projecting the three-dimensional surface data to a two-dimensional space to obtain a display coordinate of the three-dimensional surface data;
displaying the target organ based on the display coordinates;
re-rendering the target organ based on rotation and zoom operations;
and generating cutting parameters based on a setting operation of a cutting area for the target organ in the display interface, and sending the cutting parameters to the server.
10. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of cutting a three-dimensional image according to any one of claims 1 to 9.
Background
With the development of medical imaging technology, CT (computed tomography), MR (magnetic resonance) and other devices are widely used in daily clinical diagnosis. A doctor inputs a two-dimensional image sequence captured by CT, MR or other equipment into a terminal device, reconstructs a three-dimensional image of a human tissue or organ on the terminal device, cuts the three-dimensional image, and displays it through a three-dimensional visualization technology. In this way, the distribution of tissues and organs around a lesion can be shown intuitively, a surgical plan can be formulated and simulated according to that distribution, and doctors can be helped to treat the tissues and organs better during surgery. How to cut the three-dimensional image that reconstructs human tissues and organs therefore plays an extremely important role in basic research, clinical application and other aspects.
Cutting methods for three-dimensional images in the prior art generally fall into the following two ways. In the first way, a closed curve drawn on the three-dimensional image is divided into a plurality of straight lines, and cutting is performed along these straight lines on a plane. However, this cutting method needs to traverse the whole three-dimensional volume data of the three-dimensional image to determine the cutting area, which involves a large amount of calculation and low cutting efficiency. In the second way, a cutting line is drawn on a two-dimensional image, and the target organ is then orthographically projected onto the two-dimensional image to determine the cutting area. However, the target organ needs to be orthographically projected after each cutting line is drawn, so the cutting efficiency is also low. For complex tissues and organs that need to be cut many times, neither way can meet the requirement on cutting efficiency, and a method capable of quickly cutting a three-dimensional image is needed.
Disclosure of Invention
The embodiment of the application provides a three-dimensional image cutting method and electronic equipment, which can be used for rapidly cutting a three-dimensional image.
In a first aspect, an embodiment of the present application provides a method for cutting a three-dimensional image, where the method includes:
receiving cutting parameters sent by a terminal device, wherein the cutting parameters at least comprise a target organ to be cut and two-dimensional cutting area information;
determining a two-dimensional cutting area based on the two-dimensional cutting area information;
determining a first data set belonging to the two-dimensional cutting area in the three-dimensional surface data of the target organ;
dividing the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region with a boundary represented by the first data set as a division reference; wherein the three-dimensional volume data included within the outer surface of the first data set is the first partial organ region;
screening out the three-dimensional volume data in the smaller organ region from the first partial organ region and the second partial organ region, and comparing the three-dimensional volume data with the two-dimensional cutting area to obtain a second data set contained in the two-dimensional cutting area in the three-dimensional volume data as a target cutting area;
and cutting the target cutting area from the three-dimensional volume data.
In an embodiment of the present application, the cutting parameters further include a projective transformation matrix, where the projective transformation matrix is used to project the three-dimensional data of the target organ into a two-dimensional space for display;
the determining a first data set belonging to the two-dimensional cutting region in the three-dimensional surface data of the target organ includes:
based on the projective transformation matrix, projecting the three-dimensional surface data to the two-dimensional space to obtain a two-dimensional image;
determining a two-dimensional image portion of the two-dimensional image located in the two-dimensional cutting area;
and determining the three-dimensional surface data corresponding to the two-dimensional image part to obtain the first data set.
In an embodiment of the application, the dividing the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region with a boundary represented by the first data set as a dividing reference includes:
determining a third data set which does not belong to the two-dimensional cutting area in the three-dimensional surface data of the target organ;
respectively determining a first three-dimensional space range corresponding to the first data set and a second three-dimensional space range corresponding to the third data set;
defining data points within said three-dimensional volume data that are contained within said first three-dimensional spatial range as said first partial organ region;
and taking data points contained in the second three-dimensional space range in the three-dimensional volume data as the second partial organ area.
In an embodiment of the application, the determining a first three-dimensional spatial range corresponding to the first data set and a second three-dimensional spatial range corresponding to the third data set respectively includes:
determining the length of the first data set in each coordinate axis direction, and determining a first cube whose side lengths are the lengths in the respective coordinate axis directions as the first three-dimensional space range;
and determining the length of the third data set in each coordinate axis direction, and determining a second cube whose side lengths are the lengths in the respective coordinate axis directions as the second three-dimensional space range.
In an embodiment of the present application, the screening out of the smaller organ region from the first partial organ region and the second partial organ region comprises:
determining the number of pixel points of three-dimensional volume data contained within the first cube and the number of pixel points of three-dimensional volume data contained within the second cube based on pixel spacing;
comparing the number of pixel points of the three-dimensional volume data accommodated within the first cube with the number of pixel points of the three-dimensional volume data accommodated within the second cube;
if the number of pixel points of the three-dimensional volume data accommodated in the first cube is smaller than the number of pixel points of the three-dimensional volume data accommodated in the second cube, taking the first partial organ region as the smaller organ region;
and if the number of pixel points of the three-dimensional volume data accommodated in the first cube is not less than the number of pixel points of the three-dimensional volume data accommodated in the second cube, taking the second partial organ region as the smaller organ region.
In an embodiment of the application, the two-dimensional cutting area information includes a two-dimensional image with a specified size, and position information of a preset closed curve in the two-dimensional image; the determining a two-dimensional cutting region based on the two-dimensional cutting region information includes:
constructing a two-dimensional image of the specified size;
and determining the two-dimensional cutting area in the two-dimensional image based on the position information of a preset closed curve in the two-dimensional image.
In an embodiment of the application, for a case that the first partial organ region is a smaller organ region, the comparing the three-dimensional volume data in the first partial organ region with the two-dimensional cutting region to obtain a second data set included in the two-dimensional cutting region in the three-dimensional volume data as a target cutting region includes:
projecting each data point of the three-dimensional volume data within the first partial organ region into a two-dimensional space based on a projective transformation matrix, wherein the projective transformation matrix is used for projecting the three-dimensional data of the target organ into the two-dimensional space for display;
and taking the second data set, consisting of the three-dimensional volume data points projected into the two-dimensional space and located within the two-dimensional cutting region, as the target cutting region.
In an embodiment of the application, for a case that the second partial organ region is a smaller organ region, the comparing the three-dimensional volume data in the second partial organ region with the two-dimensional cutting region to obtain a second data set included in the two-dimensional cutting region in the three-dimensional volume data as a target cutting region includes:
projecting each data point of the three-dimensional volume data within the second partial organ region into the two-dimensional space based on the projective transformation matrix;
determining the three-dimensional volume data points projected into the two-dimensional space and located outside the two-dimensional cutting region as a fourth data set;
and taking the second data set, consisting of the three-dimensional volume data points of the three-dimensional volume data other than the fourth data set, as the target cutting region.
In a second aspect, an embodiment of the present application provides a method for cutting a three-dimensional image, where the method includes:
acquiring three-dimensional surface data of a target organ from a server;
projecting the three-dimensional surface data to a two-dimensional space to obtain a display coordinate of the three-dimensional surface data;
displaying the target organ based on the display coordinates;
re-rendering the target organ based on rotation and zoom operations;
and generating cutting parameters based on a setting operation of a cutting area for the target organ in the display interface, and sending the cutting parameters to the server.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement any of the methods as provided in the first or second aspects of the present application.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform any one of the methods as provided in the first or second aspects of the present application.
In a fifth aspect, an embodiment of the present application provides a computer program product comprising computer programs/instructions which, when executed by a processor, implement any of the methods as provided in the first or second aspects of the present application.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
Compared with the prior art, in which every pixel point in the three-dimensional data of the target organ needs to be traversed to determine the cutting area, resulting in a large amount of calculation and low cutting efficiency, the present scheme divides the target organ into organ regions and determines the cutting area from the pixel points in the smaller organ region only, so that the cutting area can be determined quickly and the three-dimensional image can be cut quickly.
Specifically, after the cutting parameters sent by the terminal device are received, a two-dimensional cutting area is determined in the server, and a first data set belonging to the two-dimensional cutting area in the three-dimensional surface data of the target organ is determined. With the boundary represented by the first data set as a division reference, the three-dimensional volume data of the target organ is divided into a first partial organ region and a second partial organ region. The three-dimensional volume data in the smaller organ region is compared with the two-dimensional cutting area, and the second data set contained in the two-dimensional cutting area in the compared three-dimensional volume data is taken as the target cutting area. Finally, the target cutting area is cut from the three-dimensional volume data.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario of a three-dimensional image cutting method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for cutting a three-dimensional image according to an embodiment of the present disclosure;
FIG. 3a is a schematic diagram of each organ displayed in a display interface according to an embodiment of the present application;
FIG. 3b is a schematic diagram of a target organ, namely a liver, displayed in a display interface according to an embodiment of the present application;
fig. 3c is a schematic diagram of a target organ liver with an adjusted display angle displayed in a display interface according to an embodiment of the present application;
fig. 3d is a schematic diagram illustrating a target organ, liver, with an adjusted display angle, displayed after a closed region is marked in a display interface according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a method for cutting a three-dimensional image according to an embodiment of the present application;
FIG. 5a is a schematic diagram of a server constructing a two-dimensional image of a specified size according to an embodiment of the present application;
fig. 5b is a schematic view of each organ displayed in a display interface after a cutting process is performed on a target cutting region according to an embodiment of the present application;
fig. 5c is a schematic view illustrating a target organ, a liver, after a cutting process is performed on a target cutting region according to an embodiment of the present application, displayed in a display interface;
fig. 6 is a schematic flowchart of a cutting method for displaying a three-dimensional image on a web page according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Hereinafter, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
(1) In the embodiments of the present application, the term "plurality" means two or more, and other terms are similar thereto.
(2) "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
(3) A server is a device that serves the terminal, for example by providing resources to the terminal or storing terminal data; the server corresponds to an application program installed on the terminal and runs in cooperation with that application program.
(4) A terminal device may refer to a software APP (application) or to a client. It has a visual display interface and can interact with a user; it corresponds to a server and provides local services for the user. Except for applications that run only locally, software applications are generally installed on an ordinary client terminal and need to run in cooperation with a server terminal. Since the development of the Internet, common applications have included e-mail clients for sending and receiving e-mail as well as instant messaging clients. For such applications, a corresponding server and service program are required in the network to provide services such as database services and configuration parameter services, so a specific communication connection needs to be established between the client terminal and the server terminal to ensure the normal operation of the application program.
(5) The three-dimensional volume data refers to a data aggregate with three-dimensional space coordinates as a position function, for example, each pixel point in the three-dimensional volume data of the target organ constitutes a three-dimensional structural shape of the target organ.
(6) The three-dimensional surface data is an aggregate of external surface data of the three-dimensional volume data, for example, each pixel point in the three-dimensional surface data of the target organ constitutes a three-dimensional external surface structural shape of the target organ.
With the development of medical imaging technology, CT (computed tomography), MR (magnetic resonance) and other devices are widely used in daily clinical diagnosis. A doctor inputs a two-dimensional image sequence captured by CT, MR or other equipment into a terminal device, reconstructs a three-dimensional image of a human tissue or organ on the terminal device, cuts the three-dimensional image, and displays it through a three-dimensional visualization technology. In this way, the distribution of tissues and organs around a lesion can be shown intuitively, a surgical plan can be formulated and simulated according to that distribution, and doctors can be helped to treat the tissues and organs better during surgery. How to cut the three-dimensional image that reconstructs human tissues and organs therefore plays an extremely important role in basic research, clinical application and other aspects.
Cutting methods for three-dimensional images in the prior art generally fall into the following two cases. In the first case, a closed curve drawn on the three-dimensional image is divided into a plurality of straight lines, and cutting is performed along these straight lines on a plane. However, this cutting method needs to traverse the whole three-dimensional volume data of the three-dimensional image to determine the cutting area, which involves a large amount of calculation and low cutting efficiency. In the second case, a cutting line is drawn on a two-dimensional image, and the target organ is then orthographically projected onto the two-dimensional image to determine the cutting area. However, the target organ needs to be orthographically projected after each cutting line is drawn, so the cutting efficiency is also low. For complex tissues and organs that need to be cut many times, neither case can meet the requirement on cutting efficiency, and a method capable of quickly cutting a three-dimensional image is needed.
Therefore, the present application provides a three-dimensional image cutting method. First, a two-dimensional cutting area is determined on the three-dimensional image displayed by a terminal device, and cutting parameters are determined according to the two-dimensional cutting area and sent to a server. The two-dimensional cutting area is then reconstructed in the server using the cutting parameters, and a first data set belonging to the two-dimensional cutting area in the three-dimensional surface data of the target organ is determined. Next, the three-dimensional volume data of the target organ is divided into a first partial organ region and a second partial organ region according to the first data set, and the target cutting area is determined by comparing the three-dimensional volume data in the screened-out smaller organ region with the two-dimensional cutting area. Compared with the prior art, in which the cutting area is determined by traversing the whole three-dimensional volume data of the target organ, the method only traverses the pixel points in the smaller of the divided organ regions of the target organ to determine the cutting area, thereby reducing the amount of data to be calculated and improving the cutting efficiency.
After introducing the design concept of the embodiment of the present application, some simple descriptions are provided below for application scenarios to which the technical solution of the embodiment of the present application can be applied, and it should be noted that the application scenarios described below are only used for describing the embodiment of the present application and are not limited. In specific implementation, the technical scheme provided by the embodiment of the application can be flexibly applied according to actual needs.
Fig. 1 is a schematic view of an application scenario of a three-dimensional image cutting method according to an embodiment of the present application. The application scenario includes a plurality of terminal devices 101 (including terminal device 101-1, terminal device 101-2, … … terminal device 101-n), server 102. The terminal device 101 and the server 102 are connected via a wireless or wired network, and the terminal device 101 includes but is not limited to a desktop computer, a mobile phone, a mobile computer, a tablet computer, a media player, a smart wearable device, a smart television, and other electronic devices. The server 102 may be a server, a server cluster composed of several servers, or a cloud computing center. The server 102 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like.
The terminal device 101 acquires the three-dimensional surface data of the target organ from the server 102, projects the three-dimensional surface data to the two-dimensional space of the terminal device 101 for display, re-renders the target organ in the two-dimensional space of the terminal device 101 based on the rotation and scaling operations of the target organ by the user, generates cutting parameters for a cutting area drawn in the target organ by the user in a rendered display interface, and sends the cutting parameters to the server 102.
The server 102 receives the cutting parameters sent by the terminal device 101, determines a two-dimensional cutting area in the server 102, and determines a first data set belonging to the two-dimensional cutting area in the three-dimensional surface data of the target organ. Taking the boundary represented by the first data set as a division reference, the server divides the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region, compares the three-dimensional volume data in the smaller organ region with the two-dimensional cutting area, and takes the second data set contained in the two-dimensional cutting area in the compared three-dimensional volume data as the target cutting area. Finally, the server cuts the target cutting area from the three-dimensional volume data and sends the three-dimensional surface data corresponding to the cut three-dimensional volume data to the terminal device 101, and the terminal device 101 renders and displays the received cut three-dimensional surface data.
Of course, the method provided in the embodiment of the present application is not limited to be used in the application scenario shown in fig. 1, and may also be used in other possible application scenarios, and the embodiment of the present application is not limited. The functions that can be implemented by each device in the application scenario shown in fig. 1 will be described in the following method embodiments, and will not be described in detail herein.
To further illustrate the technical solutions provided by the embodiments of the present application, the following detailed description is made with reference to the accompanying drawings and the detailed description. Although the embodiments of the present application provide the method operation steps as shown in the following embodiments or figures, more or less operation steps may be included in the method based on the conventional or non-inventive labor. In steps where no necessary causal relationship exists logically, the order of execution of the steps is not limited to that provided by the embodiments of the present application.
The following describes the technical solution provided in the embodiment of the present application with reference to the application scenario shown in fig. 1.
Referring to fig. 2, an embodiment of the present application provides a method for cutting a three-dimensional image, including the following steps:
S201, acquiring three-dimensional surface data of a target organ from a server;
Optionally, a two-dimensional image sequence of the user is acquired with a CT, MR or other device, wherein the two-dimensional image sequence contains different organs of the user. The two-dimensional image sequence is input into a pre-trained neural network model in the server, so that the neural network model outputs the three-dimensional volume data corresponding to each organ. The three-dimensional volume data of the target organ is then input into the Marching Cubes algorithm to obtain the three-dimensional surface data of the target organ. The training process of the neural network model is not limited here and may be performed according to the actual application. Of course, the method of obtaining the three-dimensional surface data is not limited either; any method capable of obtaining three-dimensional surface data is applicable to the embodiments of the present application.
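Illustratively, the conversion from three-dimensional volume data to three-dimensional surface data can be sketched with the Marching Cubes implementation in scikit-image. The function below is only a minimal sketch; it assumes the segmented organ is available as a binary NumPy mask, and the variable names are hypothetical rather than prescribed by the embodiment.

```python
import numpy as np
from skimage import measure

def extract_surface(volume: np.ndarray, iso_level: float = 0.5):
    """Extract three-dimensional surface data (a triangle mesh) from the
    three-dimensional volume data of one organ using Marching Cubes.
    `volume` is assumed to be a binary (0/1) organ mask output by the
    segmentation network; `iso_level` is the iso-surface threshold."""
    verts, faces, normals, values = measure.marching_cubes(volume, level=iso_level)
    return verts, faces  # verts: (N, 3) surface points; faces: triangle vertex indices

# Hypothetical usage: `liver_mask` would be the segmented volume of the target organ.
# surface_points, surface_faces = extract_surface(liver_mask)
```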
S202, projecting the three-dimensional surface data to a two-dimensional space to obtain a display coordinate of the three-dimensional surface data;
S203, displaying the target organ based on the display coordinates;
Here, a projection transformation matrix between the three-dimensional space coordinates of the three-dimensional surface data and the two-dimensional space coordinates of the two-dimensional space is established in advance, and the display coordinates are obtained by projecting the three-dimensional surface data to the two-dimensional space according to the projection transformation matrix.
Illustratively, as shown in FIG. 3a, a schematic diagram of a two-dimensional image sequence with each organ projected into a two-dimensional space for display is shown. Assuming that the target organ is a liver, selecting the target organ in the interface shown in fig. 3a, and closing the screen display of other organs, as shown in fig. 3b, projecting the three-dimensional surface data of the target organ liver to the two-dimensional space in the display interface of the terminal device, and displaying the target organ liver in the display interface.
Taking the projection transformation matrix as a 4 × 4 matrix as an example, assume that the three-dimensional surface data A of the target organ liver is {[ax1, ay1, az1], ..., [axn, ayn, azn]}, where ax1, ay1 and az1 are the positions of the first three-dimensional surface data point of the target organ liver on the x, y and z coordinate axes of the three-dimensional space, and axn, ayn and azn are the positions of the nth three-dimensional surface data point on the x, y and z coordinate axes of the three-dimensional space. The display coordinates of each three-dimensional surface data point are calculated according to the following Equations 1 to 8; the projection transformation is described taking the first three-dimensional surface data point [ax1, ay1, az1] of the target organ liver as an example:
view[0] = ax1 * matrix[0] + ay1 * matrix[1] + az1 * matrix[2] + matrix[3];    (Equation 1)
view[1] = ax1 * matrix[4] + ay1 * matrix[5] + az1 * matrix[6] + matrix[7];    (Equation 2)
view[2] = ax1 * matrix[8] + ay1 * matrix[9] + az1 * matrix[10] + matrix[11];    (Equation 3)
view[3] = ax1 * matrix[12] + ay1 * matrix[13] + az1 * matrix[14] + matrix[15];    (Equation 4)
view[0] /= view[3];    (Equation 5)
view[1] /= view[3];    (Equation 6)
x = (view[0] + 1.0) * win_width / 2;    (Equation 7)
y = -(view[1] + 1.0) * win_height / 2;    (Equation 8)
Here, matrix[0] to matrix[15] are the values of the 4 × 4 projection transformation matrix read from left to right and from top to bottom, view[0] to view[3] are intermediate variables in the calculation of the display coordinates, x and y are the two-dimensional space coordinates in the two-dimensional space, win_width is the width of the display interface of the terminal device, and win_height is the height of the display interface of the terminal device.
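For readability, Equations 1 to 8 can be collected into a single routine. The sketch below assumes the 4 × 4 projection transformation matrix is stored row by row in a flat array `matrix`, exactly as in the description above; it is an illustration rather than a prescribed implementation.

```python
def project_point(point, matrix, win_width, win_height):
    """Project one three-dimensional point [ax, ay, az] to two-dimensional
    display coordinates (x, y) according to Equations 1-8."""
    ax, ay, az = point
    view = [
        ax * matrix[0]  + ay * matrix[1]  + az * matrix[2]  + matrix[3],   # Equation 1
        ax * matrix[4]  + ay * matrix[5]  + az * matrix[6]  + matrix[7],   # Equation 2
        ax * matrix[8]  + ay * matrix[9]  + az * matrix[10] + matrix[11],  # Equation 3
        ax * matrix[12] + ay * matrix[13] + az * matrix[14] + matrix[15],  # Equation 4
    ]
    view[0] /= view[3]                         # Equation 5
    view[1] /= view[3]                         # Equation 6
    x = (view[0] + 1.0) * win_width / 2        # Equation 7
    y = -(view[1] + 1.0) * win_height / 2      # Equation 8
    return x, y

# Hypothetical usage: project every surface point of the target organ liver.
# display_coords = [project_point(p, matrix, win_width, win_height) for p in surface_points]
```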
S204, re-rendering the target organ based on rotation and scaling operations;
Exemplarily, as shown in fig. 3c, after the three-dimensional surface data of the target organ liver is projected to the two-dimensional space in the display interface of the terminal device, the target organ liver is re-rendered and displayed after the user performs rotation and zoom operations on it in the terminal device. The specific rotation and zoom operations of the user are not limited and may be adjusted according to the actual application.
After the display angle of the target organ liver is adjusted in the display interface, the projection transformation matrix changes adaptively according to the adjustment of the target organ.
And S205, generating cutting parameters based on a setting operation of a cutting area for the target organ in the display interface, and sending the cutting parameters to the server.
For example, as shown in fig. 3d, after the display angle of the target organ liver has been adjusted, cutting parameters are generated based on the user's setting operation of the cutting area for the target organ liver in the display interface, that is, the closed curve drawn by the user in fig. 3d. The cutting parameters at least comprise an identification of the target organ to be cut, namely the liver, and two-dimensional cutting area information. The two-dimensional cutting area information may be a two-dimensional image of a specified size and closed-curve position information, where the closed-curve position information may be the coordinate information of each point of the closed curve in the two-dimensional space coordinates of the two-dimensional image.
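As a purely illustrative example, the cutting parameters sent to the server could be packaged as follows; all field names are hypothetical and only reflect the information enumerated above (organ identification, image size, closed-curve points and the projection transformation matrix).

```python
# Hypothetical payload for the cutting parameters; field names are illustrative only.
cutting_parameters = {
    "target_organ": "liver",                 # identification of the target organ to be cut
    "image_size": [900, 1000],               # specified size of the two-dimensional image
    "closed_curve": [[120, 340], [125, 352], [131, 360]],  # closed-curve point coordinates (truncated)
    "projection_matrix": [1.0, 0.0, 0.0, 0.0,              # 4 x 4 projection transformation
                          0.0, 1.0, 0.0, 0.0,              # matrix stored row by row
                          0.0, 0.0, 1.0, 0.0,              # (identity used as a placeholder)
                          0.0, 0.0, 0.0, 1.0],
}
# The terminal device would serialize this payload (e.g., as JSON) and send it to the server.
```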
The cutting-area setting operation for the target organ is performed in the terminal device, and the generated cutting parameters are sent to the server, so that the target organ is cut in the server using the cutting parameters and the cut target organ is then displayed in the terminal device. Compared with the prior art, in which the target organ is cut in the terminal device itself, this reduces the hardware resources occupied on the terminal device and avoids lag or application crashes on the terminal device.
After a cutting area is drawn in a terminal device and cutting parameters are sent to a server, referring to fig. 4, a method for cutting a three-dimensional image at the server end is shown, which includes the following steps:
S401, receiving cutting parameters sent by the terminal device, wherein the cutting parameters at least comprise the target organ to be cut and two-dimensional cutting area information;
S402, determining a two-dimensional cutting area based on the two-dimensional cutting area information;
Specifically, as described above, the two-dimensional cutting area information may be a two-dimensional image of a specified size and closed-curve position information; a two-dimensional image of the specified size is constructed in the server, and the two-dimensional cutting area is determined in the two-dimensional image based on the closed-curve position information preset in the two-dimensional image.
Illustratively, as shown in fig. 5a, assuming that the two-dimensional image of the specified size is 900 × 1000, a blank 900 × 1000 two-dimensional image is first created in the server. Then, the two-dimensional cutting area is determined in the two-dimensional image using the position information of the preset closed curve in the two-dimensional image. As above, the preset closed-curve position information in the two-dimensional image may be the coordinate information of each point of the closed curve in the two-dimensional space coordinates of the two-dimensional image; this is only an example and does not limit the specific position information of the closed curve preset in the two-dimensional image. A seed-point growing algorithm is then adopted: the pixel value of each pixel point inside the closed-curve area in the two-dimensional image is set to a first color value A, and the pixel value of each pixel point outside the closed-curve area is set to a second color value B. For example, A is set to 1 and B is set to 0.
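The construction of the two-dimensional cutting area can be sketched as below. The sketch uses polygon rasterization from scikit-image as a stand-in for the seed-point growing step described above (the result is the same binary mask), the values A = 1 and B = 0 follow the example, and all names are illustrative.

```python
import numpy as np
from skimage.draw import polygon

def build_cutting_mask(width, height, curve_points, value_a=1, value_b=0):
    """Build the two-dimensional cutting area: pixel points inside the closed
    curve get the first color value A, those outside get the second color value B.
    `curve_points` is a list of (x, y) vertices of the preset closed curve."""
    mask = np.full((height, width), value_b, dtype=np.uint8)
    xs = [p[0] for p in curve_points]
    ys = [p[1] for p in curve_points]
    rr, cc = polygon(ys, xs, shape=mask.shape)  # rows correspond to y, columns to x
    mask[rr, cc] = value_a
    return mask

# Hypothetical usage with the 900 x 1000 example above:
# cutting_mask = build_cutting_mask(900, 1000, closed_curve_points)
```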
By constructing a two-dimensional image with the same size as that of the terminal equipment in the server and determining a two-dimensional cutting area with the same size, a more accurate target cutting area can be obtained and cutting processing can be carried out.
S403, determining a first data set belonging to a two-dimensional cutting area in the three-dimensional surface data of the target organ;
Optionally, the cutting parameters further include a projection transformation matrix, and the projection transformation matrix is used to project the three-dimensional data of the target organ into the two-dimensional space for display; here, the three-dimensional data may be three-dimensional surface data or three-dimensional volume data.
Based on the projection transformation matrix, the three-dimensional surface data is projected to the two-dimensional space to obtain a two-dimensional image, that is, the display coordinates; the two-dimensional image portion located in the two-dimensional cutting area is determined in the two-dimensional image; and the three-dimensional surface data corresponding to that two-dimensional image portion is determined, obtaining the first data set.
The process of projection by using the projective transformation matrix can refer to the above description, and therefore, the description thereof is omitted.
For example, the first data set, belonging to the two-dimensional cutting area, among the three-dimensional surface data A = {[ax1, ay1, az1], ..., [axn, ayn, azn]} of the target organ liver is denoted as B = {[bx1, by1, bz1], ..., [bxn, byn, bzn]}.
By projecting the three-dimensional surface data with the projection transformation matrix, each pixel point of the three-dimensional surface data can be located accurately, so that a more accurate target cutting area is obtained for the cutting process. A sketch of this step is given below.
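The sketch splits the surface points into the first data set B (projection inside the cutting area) and the third data set C (projection outside it). It reuses the hypothetical `project_point` and `build_cutting_mask` helpers from the earlier sketches and is not a prescribed implementation.

```python
def split_surface_points(surface_points, matrix, cutting_mask, win_width, win_height, value_a=1):
    """Return (first_data_set, third_data_set): surface points whose projection
    falls inside / outside the two-dimensional cutting area."""
    first_data_set, third_data_set = [], []
    height, width = cutting_mask.shape
    for p in surface_points:
        x, y = project_point(p, matrix, win_width, win_height)
        col, row = int(round(x)), int(round(y))
        if 0 <= row < height and 0 <= col < width and cutting_mask[row, col] == value_a:
            first_data_set.append(p)   # belongs to data set B (inside the cutting area)
        else:
            third_data_set.append(p)   # belongs to data set C (outside the cutting area)
    return first_data_set, third_data_set
```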
S404, dividing the three-dimensional volume data of the target organ into a first partial organ region and a second partial organ region with the boundary represented by the first data set as a division reference; wherein the three-dimensional volume data included within the outer surface of the first data set is the first partial organ region;
specifically, while the first data set is determined, a third data set which does not belong to the two-dimensional cutting region in the three-dimensional surface data of the target organ may be determined, then a first three-dimensional space range corresponding to the first data set and a second three-dimensional space range corresponding to the third data set are respectively determined, a data point included in the first three-dimensional space range in the three-dimensional volume data is used as a first partial organ region, and a data point included in the second three-dimensional space range in the three-dimensional volume data is used as a second partial organ region. Here, the determination process of the third data set may refer to the determination process of the first data set, and will not be described in detail here.
For example, the third data set, not belonging to the two-dimensional cutting area, among the three-dimensional surface data A = {[ax1, ay1, az1], ..., [axn, ayn, azn]} of the target organ liver is denoted as C = {[cx1, cy1, cz1], ..., [cxn, cyn, czn]}.
By combining the data set inside the two-dimensional cutting area with the data set outside it, the target organ can be divided more accurately into the first partial organ region and the second partial organ region, so that a more accurate target cutting area is obtained for the cutting process.
The length of the first data set in each coordinate axis direction is determined, and a first cube whose side lengths are those lengths is determined as the first three-dimensional space range; similarly, the length of the third data set in each coordinate axis direction is determined, and a second cube whose side lengths are those lengths is determined as the second three-dimensional space range.
That is, the lengths along the coordinate axes are obtained from the determined data set, and a space range is determined by the cube formed by these lengths, so that every data point in the data set is contained in the determined space range, and a more accurate target cutting area is obtained for the cutting process.
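A minimal sketch of determining such a three-dimensional space range is shown below; it simply takes the minimum and maximum of the data set along each coordinate axis, which is one way to realize the cube described above. The names are illustrative.

```python
import numpy as np

def spatial_range(points):
    """Return (mins, maxs), the extent of a data set along the x, y and z
    coordinate axes; maxs - mins gives the side lengths of the bounding cube."""
    pts = np.asarray(points, dtype=float)
    return pts.min(axis=0), pts.max(axis=0)   # (x_min, y_min, z_min), (x_max, y_max, z_max)

# Hypothetical usage:
# first_range = spatial_range(first_data_set)    # first three-dimensional space range
# second_range = spatial_range(third_data_set)   # second three-dimensional space range
```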
S405, screening out the three-dimensional volume data in the smaller organ region from the first partial organ region and the second partial organ region, and comparing the three-dimensional volume data with the two-dimensional cutting area to obtain a second data set contained in the two-dimensional cutting area in the three-dimensional volume data as the target cutting area;
and S406, cutting the target cutting area from the three-dimensional volume data.
Specifically, the number of pixel points of the three-dimensional volume data accommodated in the first cube and the number of pixel points of the three-dimensional volume data accommodated in the second cube are determined based on the pixel spacing, and the two numbers are compared. If the number of pixel points of the three-dimensional volume data accommodated in the first cube is smaller than that accommodated in the second cube, the first partial organ region is taken as the smaller organ region; otherwise, the second partial organ region is taken as the smaller organ region.
Illustratively, taking the first data set as an example to describe the calculation process, assume that the first data set B = {[bx1, by1, bz1], ..., [bxn, byn, bzn]} has minimum and maximum values of x_min and x_max in the x coordinate axis direction, y_min and y_max in the y coordinate axis direction, and z_min and z_max in the z coordinate axis direction, respectively.
The numbers of pixel points BX, BY and BZ in the x, y and z coordinate axis directions are calculated respectively according to the following Equations 9 to 11:
BX = (x_max - x_min) / spacing_x;    (Equation 9)
BY = (y_max - y_min) / spacing_y;    (Equation 10)
BZ = (z_max - z_min) / spacing_z;    (Equation 11)
Here, spacing_x, spacing_y and spacing_z are the pixel spacings of the two-dimensional images in the two-dimensional image sequence along the respective coordinate axis directions, and they can be obtained from the three-dimensional volume data of the target organ. Then BX × BY × BZ is determined as the number of pixel points of the three-dimensional volume data accommodated in the first cube. Similarly, the number of pixel points of the three-dimensional volume data accommodated in the second cube can be determined by the same calculation, and finally the two numbers are compared to determine the smaller organ region.
By calculating the number of pixel points of the three-dimensional volume data accommodated in each cube from the pixel spacing, the size of each organ region can be determined more accurately, and the target cutting area is then determined from the smaller organ region, so that a more accurate target cutting area is obtained for the cutting process.
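Equations 9 to 11 and the comparison of the two cubes can be sketched as follows; `spacing` stands for (spacing_x, spacing_y, spacing_z) read from the three-dimensional volume data, and `first_range` / `second_range` are the (mins, maxs) pairs from the previous sketch. The names are illustrative.

```python
def pixel_point_count(extent, spacing):
    """Number of pixel points accommodated in a cube, per Equations 9-11:
    (max - min) / spacing along each coordinate axis, multiplied together."""
    mins, maxs = extent
    bx = (maxs[0] - mins[0]) / spacing[0]   # Equation 9
    by = (maxs[1] - mins[1]) / spacing[1]   # Equation 10
    bz = (maxs[2] - mins[2]) / spacing[2]   # Equation 11
    return bx * by * bz

def pick_smaller_region(first_range, second_range, spacing):
    """Return 'first' if the first cube holds fewer pixel points, else 'second'."""
    if pixel_point_count(first_range, spacing) < pixel_point_count(second_range, spacing):
        return "first"
    return "second"
```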
In the process of determining the target cutting area, in order to reduce the calculation data volume, a smaller organ area is selected for traversal, so that the data volume needing to be traversed is reduced, and the speed of determining the target cutting area is increased.
The following describes the procedure of determining the target cutting area when the first partial organ region or the second partial organ region is the smaller organ region:
(I) The first partial organ region is the smaller organ region
Each data point of the three-dimensional volume data within the first partial organ region is projected into the two-dimensional space based on the projection transformation matrix, and the second data set, consisting of the three-dimensional volume data points projected into the two-dimensional space and located within the two-dimensional cutting area, is taken as the target cutting area.
The process of projection by using the projective transformation matrix can refer to the above description, and therefore, the description thereof is omitted.
For example, after the projection into the two-dimensional space, the pixel values of the pixel points, at the corresponding positions in the whole three-dimensional volume data of the target organ, of the three-dimensional volume data located within the two-dimensional cutting area may be set to B. Corresponding three-dimensional surface data is then generated from the whole three-dimensional volume data of the target organ with the adjusted pixel values and sent to the terminal device, where it is rendered. Fig. 5b shows the image displayed in the terminal device after the target cutting area has been cut from the three-dimensional volume data, and fig. 5c shows a schematic diagram of the result of the cutting process on the target cutting area based on the image displayed in fig. 3b.
Here, the three-dimensional volume data of the target organ after the cutting process is input into the Marching Cubes algorithm to obtain the three-dimensional surface data of the target organ after the cutting process; this is only an example and does not limit the specific method of converting the cut three-dimensional volume data into three-dimensional surface data.
For the case where the first partial organ region is the smaller organ region, whether each data point of the three-dimensional volume data in the first partial organ region is located within the two-dimensional cutting area after being projected to the two-dimensional space is judged, and the pixel values of the corresponding pixel points in the three-dimensional volume data of the target organ are adjusted, so that a more accurate target cutting area is obtained for the cutting process.
(II) The second partial organ region is the smaller organ region
Each data point of the three-dimensional volume data within the second partial organ region is projected into the two-dimensional space based on the projection transformation matrix; the three-dimensional volume data points projected into the two-dimensional space and located outside the two-dimensional cutting area are determined as a fourth data set; and the second data set, consisting of the three-dimensional volume data points other than the fourth data set in the three-dimensional volume data, is taken as the target cutting area.
Exemplarily, the pixel value of every pixel point in the whole three-dimensional volume data of the target organ is first set to B; then the pixel values of the pixel points, at the corresponding positions in the whole three-dimensional volume data of the target organ, of the three-dimensional volume data whose projection into the two-dimensional space falls outside the two-dimensional cutting area are set to A. Corresponding three-dimensional surface data is then generated from the whole three-dimensional volume data of the target organ with the adjusted pixel values and sent to the terminal device, where it is rendered; as shown in fig. 5b, this yields the image displayed in the terminal device after the target cutting area has been cut from the three-dimensional volume data.
For the case where the second partial organ region is the smaller organ region, whether each data point of the three-dimensional volume data in the second partial organ region is located outside the two-dimensional cutting area after being projected to the two-dimensional space is judged, and the pixel values of the corresponding pixel points in the three-dimensional volume data of the target organ are adjusted, so that a more accurate target cutting area is obtained for the cutting process.
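Both cases can be sketched with a single relabeling routine over the voxels of the smaller organ region. The sketch reuses the hypothetical `project_point` helper and the label values A and B from the examples above, and it assumes the voxel indices are expressed in the same coordinate system as the surface data (otherwise they would first be scaled by the pixel spacing); it only illustrates the relabeling described in cases (I) and (II).

```python
import numpy as np

def apply_cut(volume, region_indices, is_first_region, matrix,
              cutting_mask, win_width, win_height, value_a=1, value_b=0):
    """Relabel the three-dimensional volume data of the target organ so that the
    target cutting area is removed. `region_indices` lists the (i, j, k) voxel
    indices of the smaller organ region; `is_first_region` selects case (I) or (II)."""
    cut = np.array(volume, copy=True)
    height, width = cutting_mask.shape
    if not is_first_region:
        cut[:] = value_b   # case (II): first set every pixel point of the whole volume to B
    for (i, j, k) in region_indices:
        x, y = project_point((i, j, k), matrix, win_width, win_height)
        col, row = int(round(x)), int(round(y))
        inside = (0 <= row < height and 0 <= col < width
                  and cutting_mask[row, col] == value_a)
        if is_first_region and inside:
            cut[i, j, k] = value_b     # case (I): the voxel falls in the cutting area, remove it
        elif not is_first_region and not inside:
            cut[i, j, k] = value_a     # case (II): the voxel falls outside the cutting area, keep it
    return cut

# The relabeled volume would then be passed through Marching Cubes again to obtain
# the cut three-dimensional surface data sent back to the terminal device.
```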
In an embodiment of the present application, with the development of network transmission technology, treatment schemes that use remote diagnosis and treatment and remote consultation to address difficult and complicated diseases in medically underserved areas are becoming more and more popular with doctors. As shown in fig. 6, the method specifically includes the following steps:
s601, displaying a two-dimensional image of a target organ at a webpage end, and generating cutting parameters aiming at a two-dimensional cutting area of the target organ in a display interface;
S602, sending the cutting parameters to the server, wherein the cutting parameters at least comprise the size of the two-dimensional image, position information of a preset closed curve in the two-dimensional image, and a projection transformation matrix for projecting the three-dimensional surface data of the target organ to the web page end;
S603, the server constructs a blank two-dimensional image of the specified size according to the received cutting parameters, and determines the two-dimensional cutting region in the two-dimensional image based on the position information of the preset closed curve in the two-dimensional image.
S604, projecting the three-dimensional surface data to a two-dimensional space based on the projection transformation matrix to obtain a two-dimensional image; determining a two-dimensional image part positioned in a two-dimensional cutting area in the two-dimensional image; and determining three-dimensional surface data corresponding to the two-dimensional image part to obtain a first data set.
S605, determining a third data set which does not belong to the two-dimensional cutting area in the three-dimensional surface data of the target organ.
S606, determining the length of the first data set in each coordinate axis direction, and determining a first cube as the first three-dimensional space range with those lengths as the side lengths of the first cube; and determining the length of the third data set in each coordinate axis direction, and determining a second cube as the second three-dimensional space range with those lengths as the side lengths of the second cube.
S607, taking the data points contained in the first three-dimensional space range in the three-dimensional volume data as a first partial organ area; and taking data points contained in a second three-dimensional space range in the three-dimensional volume data as a second partial organ area.
S608, determining, based on the pixel pitch, the number of pixel points of the three-dimensional volume data contained in the first cube and the number contained in the second cube; if the number contained in the first cube is smaller than the number contained in the second cube, taking the first partial organ region as the smaller organ region; and if the number contained in the first cube is not less than the number contained in the second cube, taking the second partial organ region as the smaller organ region (a sketch of this comparison is given after these steps).
S609, for the case in which the first partial organ region is the smaller organ region, projecting each three-dimensional volume data point in the first partial organ region into the two-dimensional space based on the projection transformation matrix, the projection transformation matrix being used for projecting the three-dimensional data of the target organ into the two-dimensional space for display; and taking a second data set consisting of the three-dimensional volume data points that are projected into the two-dimensional space and located in the two-dimensional cutting region as the target cutting region.
S610, for the case in which the second partial organ region is the smaller organ region, projecting each three-dimensional volume data point in the second partial organ region into the two-dimensional space based on the projection transformation matrix; determining the three-dimensional volume data points that are projected into the two-dimensional space and located outside the two-dimensional cutting region as a fourth data set; and taking a second data set consisting of the three-dimensional volume data points other than the fourth data set as the target cutting region.
S611, cutting the target cutting region from the three-dimensional volume data, and sending the three-dimensional surface data corresponding to the cut three-dimensional volume data to the terminal device.
S612, displaying the cut target organ in the terminal device according to the cut three-dimensional surface data.
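By way of a non-limiting illustration, the following Python sketch mirrors steps S606 to S608: it builds an axis-aligned bounding cube around each of the first and third data sets, estimates from the pixel pitch how many pixel points of the three-dimensional volume data each cube contains, and selects the smaller organ region accordingly. The array layout, the helper names, and the exact way the pixel pitch enters the count are assumptions of this sketch.

import numpy as np

def bounding_cube(points: np.ndarray):
    # The side lengths of the cube are taken from the extent of the data set
    # along each coordinate axis direction (an axis-aligned bounding box).
    return points.min(axis=0), points.max(axis=0)

def contained_voxel_count(cube, pixel_pitch: np.ndarray) -> int:
    # Approximate the number of pixel points of the three-dimensional volume
    # data contained in the cube from its side lengths and the pixel pitch.
    min_corner, max_corner = cube
    sides = np.maximum(max_corner - min_corner, pixel_pitch)
    return int(np.prod(np.floor(sides / pixel_pitch).astype(int)))

def smaller_organ_region(first_data_set, third_data_set, pixel_pitch):
    # Compare the voxel counts of the two cubes and report which partial organ
    # region should be treated as the smaller one (step S608).
    n_first = contained_voxel_count(bounding_cube(first_data_set), pixel_pitch)
    n_second = contained_voxel_count(bounding_cube(third_data_set), pixel_pitch)
    return 'first' if n_first < n_second else 'second'

# Illustrative usage with an isotropic pixel pitch of 1.0:
# smaller = smaller_organ_region(first_data_set, third_data_set, np.array([1.0, 1.0, 1.0]))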
According to the method and the device, the two-dimensional cutting region is determined in the server from the cutting parameters sent by the terminal device, and the first data set belonging to the two-dimensional cutting region in the three-dimensional surface data of the target organ is determined. With the boundary represented by the first data set as a division reference, the three-dimensional volume data of the target organ is divided into the first partial organ region and the second partial organ region. The three-dimensional volume data in the smaller organ region is then compared with the two-dimensional cutting region, the second data set contained in the two-dimensional cutting region is taken as the target cutting region, and finally the target cutting region is cut from the three-dimensional volume data. Compared with determining the cutting region by traversing all of the three-dimensional volume data, the amount of data to be calculated is reduced and the cutting efficiency is improved.
After describing the method of cutting a three-dimensional image according to an exemplary embodiment of the present application, an electronic device according to another exemplary embodiment of the present application will be described next.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method, or program product. Accordingly, various aspects of the present application may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
In some possible implementations, an electronic device according to the present application may include at least one processor and at least one memory. The memory stores program code which, when executed by the processor, causes the processor to perform the steps of the method for cutting a three-dimensional image according to the various exemplary embodiments of the present application described above in this specification. For example, the processor may perform the steps of the three-dimensional image cutting method described above.
The electronic device 130 according to this embodiment of the present application is described below with reference to fig. 7. The electronic device 130 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, the network adapter 136 communicates with the other modules of the electronic device 130 over the bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment, a computer-readable storage medium comprising instructions, such as the memory 132 comprising instructions, executable by the processor 131 to perform the above-described method is also provided. Alternatively, the computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, there is also provided a computer program product comprising computer programs/instructions which when executed by the processor 131 implement any of the methods of cutting three-dimensional images as provided herein.
In an exemplary embodiment, aspects of a method for cutting a three-dimensional image provided by the present application may also be implemented in the form of a program product including program code for causing a computer device to perform the steps of a method for cutting a three-dimensional image according to various exemplary embodiments of the present application described above in this specification when the program product is run on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for cutting of three-dimensional images of the embodiments of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on an electronic device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of remote electronic devices, the remote electronic devices may be connected to the consumer electronic device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external electronic device (for example, through the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, according to embodiments of the application, the features and functions of two or more units described above may be embodied in one unit. Conversely, the features and functions of one unit described above may be further divided and embodied by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable three-dimensional image cutting apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable three-dimensional image cutting apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable three-dimensional image cutting apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable three-dimensional image cutting apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.