Optical lens alignment method and system
1. An optical lens alignment method, characterized by comprising the following steps:
moving an imaging sensor along the Z axis toward the lens until it reaches a focusing initial position, and then moving it away from the lens from the focusing initial position to perform a center focusing step;
after the center focusing is completed, moving a sensitive lens of the lens in the XY plane to perform lens alignment;
after the lens alignment is completed, returning the imaging sensor along the Z axis to the focusing initial position, and moving it away from the lens from the focusing initial position to perform a defocusing step.
2. The optical lens alignment method as claimed in claim 1, wherein: the imaging sensor is moved continuously toward the lens along the Z axis from its initial position, so that the image of the central area passes from blurred to sharp and back to blurred; the movement is stopped at the moment the central area of the image turns from sharp to blurred, and the current position is defined as the focusing initial position.
3. The optical lens alignment method as claimed in claim 1, wherein: in the center focusing step, a plurality of groups of image sharpness values and corresponding Z-axis positions are collected, the Z-axis position FocusBestPosition corresponding to the maximum image sharpness value is recorded, and the imaging sensor is moved to FocusBestPosition after focusing is completed.
4. The optical lens alignment method as claimed in claim 3, wherein: the center focusing step performs the following operations:
calculating a sharpness value FocusVal of the central area of the current image before each movement;
comparing the sharpness value FocusVal with a preset sharpness maximum value FocusRef to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to a threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to a threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset sharpness maximum value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is less than a threshold FocusDiff3;
repeatedly executing the above operations, and recording the maximum sharpness value FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which FocusMax occurs;
when the difference between the maximum sharpness value FocusMax and the current sharpness value FocusVal is greater than or equal to a threshold FocusDrop, focusing is completed; the imaging sensor is then moved to the position FocusBestPosition;
and outputting a center focusing curve with the Z-axis position or the movement distance during focusing as the abscissa and the central-area sharpness value FocusVal as the ordinate.
5. The optical lens alignment method as claimed in claim 3, wherein: the lens alignment step comprises a coarse adjustment step, a fine adjustment step and an ultra-fine adjustment step which are executed in sequence; wherein:
in the coarse adjustment step, with the position FocusBestPosition as the center, the sensitive lens is moved within a set movement area to obtain the image sharpness values corresponding to the coordinate points, and the position FocusWholeBestPosition1 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition1 after the coarse adjustment is completed;
in the fine adjustment step, with the position FocusWholeBestPosition1 as the center, the movement area of the sensitive lens is reduced to obtain the image sharpness values corresponding to the coordinate points in the area, and the position FocusWholeBestPosition2 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition2 after the fine adjustment is completed;
in the ultra-fine adjustment step, with the position FocusWholeBestPosition2 as the center, the movement area of the sensitive lens is further reduced to obtain the image sharpness values corresponding to the coordinate points in the area, and the position FocusWholeBestPosition3 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition3 after the ultra-fine adjustment is completed.
6. The optical lens alignment method as claimed in claim 5, wherein:
the coarse adjustment step performs the following operations:
with the current position FocusBestPosition as the plane center, setting the movement area of the X axis and the Y axis as [XRange1, YRange1] and the single-step movement amount of each axis as MoveStep1;
dividing the movement area [XRange1, YRange1] into a grid with the movement amount MoveStep1 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to a threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to a threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than a threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition1; after all the stop points have been traversed, the coarse adjustment is completed and the sensitive lens is moved to the position FocusWholeBestPosition1;
outputting a coarse adjustment curve with the traversal sequence number of each stop point in the coarse adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
the fine adjustment step performs the following operations:
with the current position FocusWholeBestPosition1 as the plane center, setting the movement area of the X axis and the Y axis as [XRange2, YRange2] and the single-step movement amount of each axis as MoveStep2, where XRange2 is smaller than XRange1, YRange2 is smaller than YRange1, and MoveStep2 is smaller than MoveStep1;
dividing the movement area [XRange2, YRange2] into a grid with the movement amount MoveStep2 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition2; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition2;
outputting a fine adjustment curve with the traversal sequence number of each stop point in the fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
the ultra-fine adjustment step performs the following operations:
with the current position FocusWholeBestPosition2 as the plane center, setting the movement area of the X axis and the Y axis as [XRange3, YRange3] and the single-step movement amount of each axis as MoveStep3, where XRange3 is smaller than XRange2, YRange3 is smaller than YRange2, and MoveStep3 is smaller than MoveStep2;
dividing the movement area [XRange3, YRange3] into a grid with the movement amount MoveStep3 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition3; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition3;
outputting an ultra-fine adjustment curve with the traversal sequence number of each stop point in the ultra-fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types.
7. The optical lens alignment method as claimed in claim 5 or 6, wherein: in the defocusing step, a plurality of groups of image sharpness values and corresponding Z-axis positions are collected, and the Z-axis position FocusWholeBestPosition4 corresponding to the maximum image sharpness value is recorded; the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
8. The optical lens alignment method as claimed in claim 7, wherein: the defocusing step performs the following operations:
acquiring an image before each movement, and calculating a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefB to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to a threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to a threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is less than a threshold FocusWholeDiffB3;
repeatedly executing the above operations, and recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the Z-axis position FocusWholeBestPosition4 at which it occurs;
when the difference between the maximum sharpness value FocusWholeMax and the current overall sharpness value FocusWhole is greater than or equal to a threshold FocusWholeDrop, or when the total movement length exceeds a threshold MoveDistanceMax, defocusing is completed;
moving the imaging sensor to the position FocusWholeBestPosition4 after defocusing is completed;
and outputting a defocusing curve with the Z-axis position or the movement distance during defocusing as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types.
9. An optical lens alignment system, characterized by comprising:
an image acquisition module, connected with the imaging sensor and used for selecting the central area and/or a plurality of peripheral areas of the image, in which targets of the same characteristic shape are photographed;
a motion control module, used for controlling an XY-axis mechanism to drive the sensitive lens of the lens to move horizontally and controlling a Z-axis mechanism to drive the imaging sensor to move vertically;
a lens center focusing module, used for processing the acquired image data to complete lens focusing and generate center focusing data;
a lens alignment module, used for processing the acquired image data to complete lens alignment and generate alignment data;
and a lens defocusing module, used for processing the acquired image data to complete lens defocusing and generate defocusing data.
10. The optical lens alignment system as claimed in claim 9, wherein:
the lens center focusing module comprises
a first sharpness calculation unit for calculating the sharpness value of the central area of the current image,
a first imaging-sensor vertical-movement calculation and control unit for generating, according to the comparison result of the current sharpness value, an instruction that drives the next displacement distance of the imaging sensor,
a first sharpness peak determination and positioning unit for determining the sharpness peak and generating an instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak, and
a focusing curve output unit for outputting the center focusing curve;
the lens alignment module comprises
a second sharpness calculation unit for calculating the sharpness values of the central and peripheral areas of the current image,
a lens plane-movement calculation and control unit for generating, according to the comparison result of the current overall sharpness value, an instruction that drives the next displacement distance of the sensitive lens,
a lens coarse adjustment unit for performing the coarse adjustment operation,
a lens fine adjustment unit for performing the fine adjustment operation,
a lens ultra-fine adjustment unit for performing the ultra-fine adjustment operation, and
an alignment curve output unit for outputting the alignment curves;
the lens defocusing module comprises
a third sharpness calculation unit for calculating the sharpness values of the central and peripheral areas of the current image,
a second imaging-sensor vertical-movement calculation and control unit for generating, according to the comparison result of the current overall sharpness value, an instruction that drives the next displacement distance of the imaging sensor,
a second sharpness peak determination and positioning unit for determining the sharpness peak and generating an instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak, and
a defocus curve output unit for outputting the defocus curve.
Background
An optical lens is composed of a plurality of lens elements, among which the sensitive lens defined in the optical design can be identified; a change in the position of this sensitive lens has a great influence on the overall resolution of the optical lens. In production, lens alignment equipment is therefore used to move the sensitive lens in the horizontal plane to the position giving the best resolution, thereby improving both the yield and the quality of the lens resolution.
Disclosure of Invention
The invention provides an optical lens alignment method and system applied to lens alignment equipment, which automatically perform the alignment operation on the sensitive lens and are realized by the following technical means:
the optical lens aligning method of the invention comprises the following steps:
enabling the Z-axis of the imaging sensor to approach the lens to a focusing initial position, and starting to move away from the lens from the focusing initial position to perform a central focusing step;
after the center focusing is finished, moving a sensitive lens of the lens on an XY plane to perform lens alignment;
after the lens alignment is completed, the Z-axis direction of the imaging sensor reaches the focusing initial position and the defocusing step is executed from the focusing initial position to the distance away from the lens.
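As an informal illustration only (not part of the claimed method), the following Python sketch strings the three stages together; every helper name here (move_sensor_z, center_focus, align_lens_xy, defocus) is a hypothetical stand-in for the equipment routines described below, and the returned values are dummies:

```python
"""Minimal sketch of the three-stage flow: center focusing, XY alignment of the
sensitive lens, then defocusing from the same starting position."""

def move_sensor_z(position):                 # stand-in for the Z-axis mechanism
    print(f"Z axis -> {position}")

def center_focus(start):                     # stand-in for the center focusing step
    return start + 0.5                       # pretend the best Z is 0.5 units from the start

def align_lens_xy():                         # stand-in for coarse/fine/ultra-fine XY alignment
    return (0.01, -0.02)                     # pretend best XY offset of the sensitive lens

def defocus(start):                          # stand-in for the defocusing step
    return start + 0.4

def run_alignment(focus_start_position=10.0):
    move_sensor_z(focus_start_position)      # approach the lens to the focusing initial position
    best_z = center_focus(focus_start_position)
    best_xy = align_lens_xy()                # move the sensitive lens in the XY plane
    move_sensor_z(focus_start_position)      # return to the focusing initial position
    best_defocus_z = defocus(focus_start_position)
    return best_z, best_xy, best_defocus_z

if __name__ == "__main__":
    print(run_alignment())
```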
Further, the imaging sensor is moved continuously toward the lens along the Z axis from its initial position, so that the image of the central area passes from blurred to sharp and back to blurred; the current position at the critical point where the central area turns from sharp to blurred is defined as the focusing initial position.
Further, in the center focusing step, a plurality of groups of image sharpness values and corresponding Z-axis positions are collected, the Z-axis position FocusBestPosition corresponding to the maximum image sharpness value is recorded, and the imaging sensor is moved to FocusBestPosition after focusing is completed.
Further, the center focusing step performs the following operations:
calculating a sharpness value FocusVal of the central area of the current image before each movement;
comparing the sharpness value FocusVal with a preset sharpness maximum value FocusRef to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to a threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to a threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset sharpness maximum value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is less than a threshold FocusDiff3;
repeatedly executing the above operations, and recording the maximum sharpness value FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which FocusMax occurs;
when the difference between the maximum sharpness value FocusMax and the current sharpness value FocusVal is greater than or equal to a threshold FocusDrop, focusing is completed; the imaging sensor is then moved to the position FocusBestPosition;
and outputting a center focusing curve with the Z-axis position or the movement distance during focusing as the abscissa and the central-area sharpness value FocusVal as the ordinate.
Further, the lens alignment step comprises a coarse adjustment step, a fine adjustment step and an ultra-fine adjustment step which are executed in sequence; wherein:
in the coarse adjustment step, with the position FocusBestPosition as the center, the sensitive lens is moved within a set movement area to obtain the image sharpness values corresponding to the coordinate points, and the position FocusWholeBestPosition1 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition1 after the coarse adjustment is completed;
in the fine adjustment step, with the position FocusWholeBestPosition1 as the center, the movement area of the sensitive lens is reduced to obtain the image sharpness values corresponding to the coordinate points in the area, and the position FocusWholeBestPosition2 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition2 after the fine adjustment is completed;
in the ultra-fine adjustment step, with the position FocusWholeBestPosition2 as the center, the movement area of the sensitive lens is further reduced to obtain the image sharpness values corresponding to the coordinate points in the area, and the position FocusWholeBestPosition3 corresponding to the maximum image sharpness is recorded; the sensitive lens is moved to the position FocusWholeBestPosition3 after the ultra-fine adjustment is completed.
Further, the coarse adjustment step performs the following operations:
with the current position FocusBestPosition as the plane center, setting the movement area of the X axis and the Y axis as [XRange1, YRange1] and the single-step movement amount of each axis as MoveStep1;
dividing the movement area [XRange1, YRange1] into a grid with the movement amount MoveStep1 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1] (a code sketch of this weighted sum is given after this list);
comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to a threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to a threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than a threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition1; after all the stop points have been traversed, the coarse adjustment is completed and the sensitive lens is moved to the position FocusWholeBestPosition1;
outputting a coarse adjustment curve with the traversal sequence number of each stop point in the coarse adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
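The weighted sum above can be captured in a few lines; the following Python sketch assumes the per-area sharpness values have already been measured, and the function and variable names are illustrative rather than taken from the disclosure:

```python
def focus_whole(focus_val_cen, focus_val_around, cen_power, around_power):
    """FocusWhole = FocusValCen*CenPower + sum_n FocusValAround(n)*AroundPower(n)."""
    if len(focus_val_around) != len(around_power):
        raise ValueError("each peripheral area needs its own weighting coefficient")
    return focus_val_cen * cen_power + sum(
        v * w for v, w in zip(focus_val_around, around_power))

# Example: one central area and four peripheral areas, all weights within [0, 1].
print(focus_whole(82.0, [55.0, 57.0, 53.0, 56.0],
                  cen_power=0.4, around_power=[0.15, 0.15, 0.15, 0.15]))
```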
the fine adjustment step performs the following operations:
with the current position FocusWholeBestPosition1 as the plane center, setting the movement area of the X axis and the Y axis as [XRange2, YRange2] and the single-step movement amount of each axis as MoveStep2, where XRange2 is smaller than XRange1, YRange2 is smaller than YRange1, and MoveStep2 is smaller than MoveStep1;
dividing the movement area [XRange2, YRange2] into a grid with the movement amount MoveStep2 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition2; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition2;
outputting a fine adjustment curve with the traversal sequence number of each stop point in the fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
the ultra-fine adjustment step performs the following operations:
with the current position FocusWholeBestPosition2 as the plane center, setting the movement area of the X axis and the Y axis as [XRange3, YRange3] and the single-step movement amount of each axis as MoveStep3, where XRange3 is smaller than XRange2, YRange3 is smaller than YRange2, and MoveStep3 is smaller than MoveStep2;
dividing the movement area [XRange3, YRange3] into a grid with the movement amount MoveStep3 as the unit, each intersection point of the grid being a stop point of the movement;
traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition3; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition3;
outputting an ultra-fine adjustment curve with the traversal sequence number of each stop point in the ultra-fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types.
Further, in the defocusing step, a plurality of groups of image sharpness values and corresponding Z-axis positions are collected, and the Z-axis position FocusWholeBestPosition4 corresponding to the maximum image sharpness value is recorded; the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
Further, the defocusing step performs the following operations:
acquiring an image before each movement, and calculating a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefB to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to a threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to a threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is less than a threshold FocusWholeDiffB3;
repeatedly executing the above operations, and recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the Z-axis position FocusWholeBestPosition4 at which it occurs;
when the difference between the maximum sharpness value FocusWholeMax and the current overall sharpness value FocusWhole is greater than or equal to a threshold FocusWholeDrop, or when the total movement length exceeds a threshold MoveDistanceMax, defocusing is completed;
moving the imaging sensor to the position FocusWholeBestPosition4 after defocusing is completed;
and outputting a defocusing curve with the Z-axis position or the movement distance during defocusing as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types.
The optical lens alignment system of the invention comprises:
an image acquisition module, connected with the imaging sensor and used for selecting the central area and/or a plurality of peripheral areas of the image, in which targets of the same characteristic shape are photographed;
a motion control module, used for controlling an XY-axis mechanism to drive the sensitive lens of the lens to move horizontally and controlling a Z-axis mechanism to drive the imaging sensor to move vertically;
a lens center focusing module, used for processing the acquired image data to complete lens focusing and generate center focusing data;
a lens alignment module, used for processing the acquired image data to complete lens alignment and generate alignment data;
and a lens defocusing module, used for processing the acquired image data to complete lens defocusing and generate defocusing data.
Further, the lens center focusing module comprises
a first sharpness calculation unit for calculating the sharpness value of the central area of the current image,
a first imaging-sensor vertical-movement calculation and control unit for generating, according to the comparison result of the current sharpness value, an instruction that drives the next displacement distance of the imaging sensor,
a first sharpness peak determination and positioning unit for determining the sharpness peak and generating an instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak, and
a focusing curve output unit for outputting the center focusing curve;
the lens alignment module comprises
a second sharpness calculation unit for calculating the sharpness values of the central and peripheral areas of the current image,
a lens plane-movement calculation and control unit for generating, according to the comparison result of the current overall sharpness value, an instruction that drives the next displacement distance of the sensitive lens,
a lens coarse adjustment unit for performing the coarse adjustment operation,
a lens fine adjustment unit for performing the fine adjustment operation,
a lens ultra-fine adjustment unit for performing the ultra-fine adjustment operation, and
an alignment curve output unit for outputting the alignment curves;
the lens defocusing module comprises
a third sharpness calculation unit for calculating the sharpness values of the central and peripheral areas of the current image,
a second imaging-sensor vertical-movement calculation and control unit for generating, according to the comparison result of the current overall sharpness value, an instruction that drives the next displacement distance of the imaging sensor,
a second sharpness peak determination and positioning unit for determining the sharpness peak and generating an instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak, and
a defocus curve output unit for outputting the defocus curve.
The beneficial effects of the invention are as follows: a dedicated alignment algorithm is provided for the alignment equipment and the alignment process, so that the equipment automatically performs the lens alignment operation in three stages, namely lens center focusing, lens alignment and lens defocusing, the lens alignment itself being further divided into coarse, fine and ultra-fine adjustment steps; this effectively improves both the alignment efficiency and the lens resolution yield.
Drawings
Fig. 1 is a general flowchart of the optical lens alignment method of the present invention.
Fig. 2 is a lens center focusing flow chart of the optical lens alignment method of the present invention.
Fig. 3 is a coarse adjustment flow chart of the lens alignment in the optical lens alignment method of the present invention.
Fig. 4 is a fine adjustment flow chart of the lens alignment in the optical lens alignment method of the present invention.
Fig. 5 is an ultra-fine adjustment flow chart of the lens alignment in the optical lens alignment method of the present invention.
Fig. 6 is a lens defocusing flow chart of the optical lens alignment method of the present invention.
Fig. 7 is a structural diagram of the optical lens aligning system of the present invention.
Fig. 8 is an architecture diagram of a lens center focusing module, a lens alignment module and a lens defocusing module of the optical lens alignment system of the present invention.
Fig. 9 is a screenshot of an image captured by the image capture module of the present invention.
Fig. 10 is a defocus curve diagram of the present invention.
Detailed Description
The solution of the present application is further described below with reference to Figs. 1 to 10.
referring to fig. 1 to 6, the method for aligning an optical lens includes the steps of:
s1, the imaging sensor is made to approach the lens from the initial position continuously through the Z-axis mechanism, the imaging of the central area of the image goes through the process from blurring to clear to blurring, the temporary Z-axis mechanism from clear to fuzzy stops in the central area of the image, and the current position is defined as the focusing initial position FocusStartPosition;
s2, the imaging sensor is far away from the lens through the Z-axis mechanism, and the center focusing step is executed:
S21, calculating a sharpness value FocusVal of the central area of the current image before each movement;
S22, comparing the sharpness value FocusVal with a preset sharpness maximum value FocusRef (the maximum achievable sharpness value) to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is greater than or equal to the threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset sharpness maximum value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset sharpness maximum value FocusRef is less than the threshold FocusDiff3;
S23, repeatedly executing steps S21 and S22, and recording the maximum sharpness value FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which FocusMax occurs;
S24, when the difference between the maximum sharpness value FocusMax and the current sharpness value FocusVal is greater than or equal to the threshold FocusDrop, focusing is completed; the imaging sensor is moved to the position FocusBestPosition after focusing is completed;
S25, outputting a center focusing curve with the Z-axis position or the movement distance during focusing as the abscissa and the central-area sharpness value FocusVal as the ordinate (a minimal code sketch of this focusing loop is given below);
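A minimal sketch of the S21–S25 loop, under the stated threshold scheme, might look as follows; the sharpness source is a synthetic curve standing in for the image sensor, the numeric thresholds are invented for illustration, and the interval between FocusDiff2 and FocusDiff3 (left open by the description) is handled here with the smallest step:

```python
def center_focus(focus_ref=100.0, focus_diff1=40.0, focus_diff2=15.0, focus_diff3=5.0,
                 move1=0.20, move2=0.05, move3=0.01, focus_drop=8.0):
    z = 0.0                                          # Z position, increasing away from the lens
    focus_max, best_position = float("-inf"), z
    while True:
        focus_val = 100.0 - 80.0 * abs(z - 0.37)     # synthetic central sharpness, peak near z = 0.37
        if focus_val > focus_max:                    # track FocusMax and FocusBestPosition
            focus_max, best_position = focus_val, z
        if focus_max - focus_val >= focus_drop:      # sharpness has dropped enough past the peak: stop
            break
        diff = focus_ref - focus_val
        if diff >= focus_diff1:                      # far below FocusRef: large step
            z += move1
        elif diff >= focus_diff2:                    # closer: medium step
            z += move2
        elif diff < focus_diff3:                     # approaches or exceeds FocusRef: small step
            z += move3
        else:                                        # gap between FocusDiff2 and FocusDiff3: small step too
            z += move3
    return best_position, focus_max                  # the sensor would then be moved back to best_position

print(center_focus())
```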
S3, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the coarse adjustment step of the lens alignment:
S31, with the current position FocusBestPosition as the plane center, setting the movement area of the X axis and the Y axis as [XRange1, YRange1] and the single-step movement amount of each axis as MoveStep1;
S32, dividing the movement area [XRange1, YRange1] into a grid with the movement amount MoveStep1 as the unit, each intersection point of the grid being a stop point of the movement;
S33, traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
S34, calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S35, comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefA (the maximum achievable sharpness) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S36, traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition1; after all the stop points have been traversed, the coarse adjustment is completed and the sensitive lens is moved to the position FocusWholeBestPosition1;
S37, outputting a coarse adjustment curve with the traversal sequence number of each stop point in the coarse adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types (a code sketch of this grid traversal is given below);
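The grid construction and traversal of S31–S36 can be sketched as below; the sharpness is again a synthetic function of the stop-point coordinates, and the parameter values and helper names are illustrative assumptions:

```python
def grid_stop_points(center, x_range, y_range, step):
    """Intersections of a grid with pitch `step` covering [x_range, y_range] about `center`."""
    cx, cy = center
    nx, ny = int(round(x_range / step)), int(round(y_range / step))
    xs = [cx - x_range / 2 + i * step for i in range(nx + 1)]
    ys = [cy - y_range / 2 + j * step for j in range(ny + 1)]
    return [(x, y) for y in ys for x in xs]

def synthetic_focus_whole(x, y):                 # stand-in for the weighted FocusWhole measurement
    return 90.0 - 300.0 * ((x - 0.012) ** 2 + (y + 0.008) ** 2)

def coarse_adjust(focus_best_position=(0.0, 0.0), x_range=0.10, y_range=0.10, step=0.02):
    best_val, best_pos = float("-inf"), focus_best_position
    for point in grid_stop_points(focus_best_position, x_range, y_range, step):
        val = synthetic_focus_whole(*point)      # visit the stop point and compute FocusWhole there
        if val > best_val:                       # track FocusWholeMax / FocusWholeBestPosition1
            best_val, best_pos = val, point
    return best_pos, best_val                    # the sensitive lens is finally moved to best_pos

print(coarse_adjust())
```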
S4, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the fine adjustment step of the lens alignment:
S41, with the current position FocusWholeBestPosition1 as the plane center, setting the movement area of the X axis and the Y axis as [XRange2, YRange2] and the single-step movement amount of each axis as MoveStep2, where XRange2 is smaller than XRange1, YRange2 is smaller than YRange1, and MoveStep2 is smaller than MoveStep1;
S42, dividing the movement area [XRange2, YRange2] into a grid with the movement amount MoveStep2 as the unit, each intersection point of the grid being a stop point of the movement;
S43, traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
S44, calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S45, comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA (the maximum achievable sharpness) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S46, traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition2; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition2;
S47, outputting a fine adjustment curve with the traversal sequence number of each stop point in the fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
S5, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the ultra-fine adjustment step of the lens alignment:
S51, with the current position FocusWholeBestPosition2 as the plane center, setting the movement area of the X axis and the Y axis as [XRange3, YRange3] and the single-step movement amount of each axis as MoveStep3, where XRange3 is smaller than XRange2, YRange3 is smaller than YRange2, and MoveStep3 is smaller than MoveStep2;
S52, dividing the movement area [XRange3, YRange3] into a grid with the movement amount MoveStep3 as the unit, each intersection point of the grid being a stop point of the movement;
S53, traversing the stop points with the sensitive lens, and acquiring, at each stop point, a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
S54, calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S55, comparing the overall sharpness value FocusWhole with the preset sharpness maximum value FocusWholeRefA (the maximum achievable sharpness) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by the distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefA, the sensitive lens moves by the distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S56, traversing all the stop points while recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the corresponding position FocusWholeBestPosition3; after all the stop points have been traversed, the sensitive lens is moved to the position FocusWholeBestPosition3;
S57, outputting an ultra-fine adjustment curve with the traversal sequence number of each stop point in the ultra-fine adjustment step as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types;
after the coarse, fine and ultra-fine adjustment steps are performed in sequence, the sensitive lens of the lens is positioned with increasing accuracy (the three stages follow the same search pattern with progressively smaller parameters, as sketched below);
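Because S3, S4 and S5 differ only in their window sizes and step lengths, one way to organise them (purely a sketch; the staging values are invented and the grid-search interface is assumed, not specified by the text) is a single grid search called three times with shrinking parameters:

```python
def align_sensitive_lens(grid_search, start=(0.0, 0.0)):
    stages = [
        (0.10, 0.10, 0.020),    # coarse:     [XRange1, YRange1], MoveStep1
        (0.04, 0.04, 0.008),    # fine:       [XRange2, YRange2], MoveStep2
        (0.016, 0.016, 0.002),  # ultra-fine: [XRange3, YRange3], MoveStep3
    ]
    position = start            # FocusBestPosition from the center focusing step
    for x_range, y_range, step in stages:
        # each stage re-centres a smaller search window on the previous best point
        position = grid_search(position, x_range, y_range, step)
    return position             # FocusWholeBestPosition3

if __name__ == "__main__":
    # toy search that simply returns the window centre, to show the call pattern
    print(align_sensitive_lens(lambda center, xr, yr, step: center))
```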
S6, the imaging sensor is returned along the Z axis to the focusing initial position FocusStartPosition, and from this position it moves continuously away from the lens while the defocusing step is executed:
S61, acquiring an image before each movement, and calculating a sharpness value FocusValCen of the central area of the image and sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
S62, calculating the overall sharpness FocusWhole of the image:
FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S63, comparing the overall sharpness value FocusWhole with a preset sharpness maximum value FocusWholeRefB (the maximum achievable sharpness) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset sharpness maximum value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset sharpness maximum value FocusWholeRefB is less than the threshold FocusWholeDiffB3;
S64, repeatedly executing steps S61 to S63, and recording the maximum overall sharpness value FocusWholeMax reached by FocusWhole and the Z-axis position FocusWholeBestPosition4 at which it occurs;
S65, when the difference between the maximum sharpness value FocusWholeMax and the current overall sharpness value FocusWhole is greater than or equal to the threshold FocusWholeDrop, or when the total movement length exceeds the threshold MoveDistanceMax, defocusing is completed;
S66, moving the imaging sensor to the position FocusWholeBestPosition4 after defocusing is completed;
S67, outputting a defocusing curve with the Z-axis position or the movement distance during defocusing as the abscissa and the sharpness value as the ordinate, the curves of the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) being distinguished by different colors or line types (a code sketch of the defocusing loop and its stop conditions is given below).
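The defocusing loop S61–S65 adds a second stop condition (a cap on total travel) to the drop-from-peak condition; a minimal sketch with a synthetic sharpness curve and invented parameter values:

```python
def defocus(focus_start, focus_whole_ref_b=90.0, diff_b1=35.0, diff_b2=12.0,
            move_b1=0.20, move_b2=0.05, move_b3=0.01,
            focus_whole_drop=6.0, move_distance_max=2.0):
    z, travelled = focus_start, 0.0
    whole_max, best_position = float("-inf"), z
    while True:
        focus_whole = 90.0 - 60.0 * abs(z - focus_start - 0.30)  # synthetic overall sharpness
        if focus_whole > whole_max:                    # track FocusWholeMax / FocusWholeBestPosition4
            whole_max, best_position = focus_whole, z
        if whole_max - focus_whole >= focus_whole_drop:   # dropped far enough past the peak
            break
        if travelled >= move_distance_max:                # or the total travel limit was reached
            break
        diff = focus_whole_ref_b - focus_whole
        step = move_b1 if diff >= diff_b1 else move_b2 if diff >= diff_b2 else move_b3
        z += step                                         # move further away from the lens
        travelled += step
    return best_position, whole_max                       # the sensor is finally moved back to best_position

print(defocus(focus_start=10.0))
```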
Referring to Figs. 7 and 8, an optical lens alignment system for performing the above method includes:
an image acquisition module, connected with the imaging sensor and used for selecting the central area and/or a plurality of peripheral areas of the image, in which targets of the same characteristic shape are photographed;
a motion control module, used for controlling the XY-axis mechanism to drive the sensitive lens of the lens to move horizontally and controlling the Z-axis mechanism to drive the imaging sensor to move vertically;
a lens center focusing module, used for processing the acquired image data to complete lens focusing and generate center focusing data;
a lens alignment module, used for processing the acquired image data to complete lens alignment and generate alignment data;
and a lens defocusing module, used for processing the acquired image data to complete lens defocusing and generate defocusing data.
Specifically, the lens center focusing module comprises a first sharpness calculation unit, a first imaging-sensor vertical-movement calculation and control unit, a first sharpness peak determination and positioning unit, and a focusing curve output unit. The first sharpness calculation unit calculates the sharpness value of the central area of the current image; the first imaging-sensor vertical-movement calculation and control unit generates, from the comparison result of the current sharpness value, the instruction that drives the next displacement distance of the imaging sensor; the first sharpness peak determination and positioning unit determines the sharpness peak and generates the instruction that drives the imaging sensor to the Z-axis position corresponding to that peak; and the focusing curve output unit outputs the center focusing curve.
The lens alignment module comprises a second sharpness calculation unit, a lens plane-movement calculation and control unit, a lens coarse adjustment unit, a lens fine adjustment unit, a lens ultra-fine adjustment unit, and an alignment curve output unit. The second sharpness calculation unit calculates the sharpness values of the central and peripheral areas of the current image; the lens plane-movement calculation and control unit generates, from the comparison result of the current overall sharpness value, the instruction that drives the next displacement distance of the sensitive lens; the coarse, fine and ultra-fine adjustment units perform the corresponding adjustment operations; and the alignment curve output unit outputs the alignment curves.
The lens defocusing module comprises a third sharpness calculation unit, a second imaging-sensor vertical-movement calculation and control unit, a second sharpness peak determination and positioning unit, and a defocus curve output unit. The third sharpness calculation unit calculates the sharpness values of the central and peripheral areas of the current image; the second imaging-sensor vertical-movement calculation and control unit generates, from the comparison result of the current overall sharpness value, the instruction that drives the next displacement distance of the imaging sensor; the second sharpness peak determination and positioning unit determines the sharpness peak and generates the instruction that drives the imaging sensor to the Z-axis position corresponding to that peak; and the defocus curve output unit outputs the defocus curve. A possible software decomposition of these modules is sketched below.
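As a rough idea of how these modules and units might map onto software (all class, attribute and function names here are invented for illustration; the document only names the modules themselves, not an API):

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class MotionControl:                      # motion control module
    move_lens_xy: Callable[[Tuple[float, float]], None]   # XY mechanism moves the sensitive lens
    move_sensor_z: Callable[[float], None]                 # Z mechanism moves the imaging sensor

@dataclass
class AlignmentSystem:
    grab_image: Callable[[], object]      # image acquisition module (central and/or peripheral areas)
    motion: MotionControl
    center_focus: Callable[[], float]     # lens center focusing module -> center focusing data
    align: Callable[[], Tuple[float, float]]   # lens alignment module -> alignment data
    defocus: Callable[[], float]          # lens defocusing module -> defocusing data

    def run(self):
        best_z = self.center_focus()
        best_xy = self.align()
        best_defocus_z = self.defocus()
        return best_z, best_xy, best_defocus_z

if __name__ == "__main__":
    system = AlignmentSystem(
        grab_image=lambda: None,
        motion=MotionControl(move_lens_xy=lambda p: None, move_sensor_z=lambda z: None),
        center_focus=lambda: 0.37,
        align=lambda: (0.012, -0.008),
        defocus=lambda: 0.41,
    )
    print(system.run())
```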
The above preferred embodiments should be regarded as examples of the present application; technical deductions, substitutions, improvements and the like that are similar to or based on these embodiments shall fall within the protection scope of this patent.