
Optics and Laser Technology 119 (2019) 105648

Contents lists available at ScienceDirect

Optics and Laser Technology journal homepage: www.elsevier.com/locate/optlastec

Full length article

A 3D reconstruction method based on grid laser and gray scale photo for visual inspection of welds


Nana Jia, Zhiyong Li⁎, Jieliang Ren, Yongji Wang, Liuqing Yang
College of Materials Science and Engineering, North University of China, Taiyuan, Shanxi 030051, China

HIGHLIGHTS

• A grid laser is used to calculate the height value based on the triangulation method.
• A 3D reconstruction method is proposed by fusion analysis of height and gray value.
• A laser-photo based system is developed based on the 3D reconstruction method.
• The system can measure the 3D contour of the weld surface with satisfactory accuracy.

ARTICLE INFO

Keywords: 3D reconstruction; Grid laser; Gray photo; Weld; Visual inspection

ABSTRACT

Three-dimensional reconstruction of the weld surface is critical for online visual inspection of welds. A new three-dimensional reconstruction algorithm is proposed based on fusion analysis of images of the weld under grid laser illumination and without additional light sources. A grid laser was used to form a 9 × 9 orthogonal beam grid on the weld surface, and a single camera was used to ensure that the images were captured in the same coordinate system. For every single grid, the height value at the points crossed by the grid laser beam was calculated. After that, the height value was combined with the gray value of the image to reconstruct the three-dimensional contour of the weld by a nonlinear fitting algorithm. The developed laser-photo hybrid method acquires the 3D information of the weld surface from a single shot rather than the continuous collection used in line-laser sensors. The designed laser-photo based system can measure welds with satisfactory accuracy and reliability, and eliminates the errors caused by deflection between the measuring system and the workpiece.

1. Introduction

Welding beads with a poor profile can reduce the mechanical properties of welded joints, shorten the lifetime of products, and even cause the collapse of a structure [1]. Therefore, standards for the quality evaluation of welds regarding geometric imperfections and the presence of flaws have been established by different organizations [2–4]. Visual inspection is important for the detection of frequently occurring weld defects such as superficial imperfections and flaws [5]. Traditionally, visual inspection is implemented by qualified inspectors [6] and the process is time-consuming. Moreover, inaccuracy occurs due to the subjectivity of the inspectors and the precision of the tools. With the development of modern industry, advanced techniques such as image recognition methods [7,8] and 3D reconstruction methods [9–11] are used for the automatic visual inspection of welds. The sensors used for detection can be divided into two-dimensional methods [12,13]



and three-dimensional methods [14,15] according to the acquired information. The two-dimensional method acquires images without additional structured light illumination, and the feature information is extracted from the images after certain image processing. Li et al. [16] used a digital camera for the quality evaluation of weld seams of pipes with a back-propagation neural network algorithm. Leo et al. [17] designed an automatic system to monitor the welds of stainless steel kegs based on a vision approach without structured light. The two-dimensional method provides a simple and convenient solution for the visual inspection of welds. However, the method acquires the feature information of welds only through the gray value of the image, which is easily affected by the ambient light source. Furthermore, the two-dimensional method cannot measure the height of the welds. The three-dimensional method is performed by projecting an additional structured light source, such as laser light, onto the surface of an

⁎ Corresponding author. E-mail address: [email protected] (Z. Li).

https://doi.org/10.1016/j.optlastec.2019.105648 Received 7 August 2018; Received in revised form 3 June 2019; Accepted 11 June 2019 0030-3992/ © 2019 Elsevier Ltd. All rights reserved.


object to measure the convexity of the surface from the distorted light stripes. Shao et al. [18] proposed a sensor with three laser stripes to obtain the width, center position and normal vector of narrow butt-joint weld seams. He et al. [19] used images of the weld under laser illumination to detect the weld profile with a visual attention model. Yanbiao et al. [20] designed a system with structured laser light to detect welds based on morphological image processing and the CCOT algorithm, which resolved the interference of strong arc light and splash. Zhang et al. [21] proposed a cross-structured light sensor to capture three-dimensional (3D) information of the weld seam by scanning, which presented a root mean square (RMS) error within 0.407 mm. The above sensors use a line laser to detect the 3D information of welds and have the benefits of versatility and strong anti-interference capability. However, in order to acquire a full view of the weld, the line laser must scan the weld surface at a relatively low speed, which reduces the speed of detection and makes it sensitive to vibration caused by object movement [22].

The structured grating light method captures a full view of the object by projecting structured grating light with a coded pattern onto the object [23]. Wang et al. [24] obtained the 3D shape of an object by calculating the phase map of the distorted image obtained in a single shot. Le et al. [25] used structured illumination imaging and data fusion to reconstruct a 3D surface with accurate shape edges. The structured grating light method can therefore obtain the three-dimensional contour of the object from a single image, improving the accuracy and speed of measurement. However, the data processing is complex and time-consuming, which brings difficulty for industrial application. Therefore, this manuscript proposes a new 3D reconstruction method that obtains the 3D information of an object with acceptable accuracy and detection speed.
The 3D reconstruction method is based on fusion analysis of the images of the weld without additional light sources and orthogonal structured grid laser illumination using a single camera. With the orthogonal structured grid laser, the height value is calculated and is nonlinearly fitted with the gray value in every tiny grid, which can improve the speed and accuracy of the measurement. Experimental results indicate that the proposed method is effective in obtaining the 3D contour of the welds.

Table 1
Configuration of the designed sensor.

Device           Parameters
Camera           CMOS: MV-U300, 2/3 in.; resolution: 2048 × 1536 pixel; pixel size: 3.2 μm × 3.2 μm; focal length: 6–12 mm; frame rate: 12 fps
Laser projector  Size: ∅16 × 70 mm; wavelength: 650 nm; operating voltage: DC 5.2 V; output power: 60 mW; lens angle: 8°

2. Grid laser-gray photo 3D reconstruction system

2.1. Measurement system design

As shown in Fig. 1, the proposed grid laser-gray photo 3D system includes a high-resolution CMOS camera, a semiconductor grid laser and a front-end sensor fixing device. The laser was adopted as an additional auxiliary light source to obtain structured light because of its monochromaticity, coherence and directionality. The configuration of the designed sensor is shown in Table 1. The grid laser was employed because it can obtain the surface information of the weld in both the vertical and horizontal directions. The laser device projects a laser beam on the weld surface to form a 9 × 9 grid pattern, which is then captured by the CMOS camera and transmitted to the computer for image processing. In Fig. 1, the grid laser is distorted by the surface of the weld and the height of the weld is calculated with the triangulation method. After the height value is obtained, the gray value of the image without additional light sources is extracted and fitted with the height value within a small grid skeleton. After the height value of each small grid is calculated and integrated, the 3D shape of the weld is reconstructed.

2.2. Measurement system principle

Fig. 2a and b show the schematic of the grid laser triangulation measuring device and an actual picture of it, respectively. The grid laser is deformed by the surface of the weld and the deformed laser is captured in a single image by a CMOS camera. The captured image is then processed to obtain a skeleton image a single pixel in width. As shown in Fig. 2a, consider arbitrary points P and Q on the laser stripe of line 5. The coordinates of points P and Q in the single-pixel skeleton image are expressed as (u, v) and (u1, v1). Thus, the values v and v1 can be used to obtain the absolute heights of points P and Q. Finally, the absolute height of all points on the skeleton line can be acquired.

2.3. Measurement system calibration

As shown in Fig. 3a, the camera and the laser are installed on the same plane with an angle θ5. A 9 × 9 orthogonal grid laser beam is emitted from the laser generator and illuminates the weld. In order to calculate the height of the weld, calibration of the designed system is required. The fixed sensor (camera and laser) is perpendicular to the calibration board, which is located on an optical platform with an accuracy of 0.01 mm in three directions. The calibration process includes calibration of the CMOS camera and calibration of the laser generator.
Fig. 1. The grid laser-gray photo based 3D reconstruction system.


Fig. 2. Design and structure of the sensor. (a) Design of sensor, (b) structure of sensor.

2.3.1. Camera calibration

As shown in Fig. 5, the intrinsic parameters of the camera are calibrated using Zhang's method [26], with a chessboard used as the calibration target; the camera captures 10–15 images of the target checkerboard from different perspectives. In this paper, the camera is fixed and the target checkerboard can move freely. Fifteen images of the target checkerboard are captured from different perspectives, and the checkerboard calibration images are imported into MATLAB for calibration. The camera calibration parameters, such as the intrinsic parameters and distortion coefficients, are shown in Table 2. In order to make a quantitative analysis of the calibration results, the corner reprojection error of the checkerboard images is analyzed. Fig. 4 is the distribution map of the corner reprojection error. In Fig. 4, the reprojection error is compact and evenly distributed around the center point. The mean corner reprojection error is only 0.1022 pixels, which proves that the accuracy of the camera calibration is relatively high. As shown in Table 2, f/dx = 2.399 × 10^3, f/dy = 2.398 × 10^3 and dx = dy = 3.2 × 10^-3 mm. Therefore, f is 7.6752 mm.
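As a quick numerical check of the last step (simple arithmetic on the values reported in Tables 1 and 2, not code from the paper), the focal length in millimetres follows from the pixel focal lengths and the physical pixel size:

```python
# Pixel focal lengths f/dx, f/dy from the calibrated camera matrix (Table 2)
# and the physical pixel size from Table 1.
fx, fy = 2.399e3, 2.398e3       # f/dx, f/dy in pixels
dx = dy = 3.2e-3                # pixel size in mm

f_mm = (fx * dx + fy * dy) / 2.0   # average the two estimates of f
print(f_mm)                        # ≈ 7.6752 mm, as stated in the text
```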

Table 2
Camera calibration parameters.

Parameter              Value
Camera matrix          [2.399 × 10^3, 0, 0; 0, 2.398 × 10^3, 0; 1.038 × 10^3, 770.970, 1]
Radial Dist_coeff      [-0.1308, 0.1731]
Tangential Dist_coeff  [0, 0]

2.3.2. Laser calibration

As shown in Fig. 3b, laser line 5 is used for calibration of the laser generator. As indicated in Fig. 3a, H5 represents the working distance from the laser to the workpiece; θ5 is the angle between the laser beam and the optical axis of the camera lens; d is the distance from the lens to the camera imaging surface; L5 is the distance from the intersection of the laser beam optical axis and the lens optical axis to the lens; ΔH and Δn are the measuring ranges of the laser and the camera detection. As shown in Fig. 5, after obtaining the focal length f, the image distance d

Fig. 3. Schematic diagram of calibration. (a) Light-path of the sensor, (b) 9 × 9 grid laser.


Table 3
Calibrated basic parameters of the system.

Parameter   f        d        L5    θ5         H5
Value (mm)  7.6752   7.8500   345   11.6821°   317.7000

is calculated according to the proportion of the actual size (X) to the pixel size (x) of a unit length on the calibration board. Then, L5 can be calculated through the Gaussian imaging formula. Scattered reflection occurs when the laser lines are projected on the S plane, and the reflected light is focused by the lens on the CMOS camera. Therefore, the moving distance ΔH and the corresponding moving pixel difference Δn of the grid center O (Fig. 3b) are substituted into the triangulation formula to calculate the value of the angle θ5. Finally, H5 is calculated according to the ratio of the laser projection distance (500 mm) to the standard line length (70 mm). Table 3 shows the parameters obtained from the calibration process.

Fig. 4. The distribution of corner reprojection error.

3. Methodology for 3D reconstruction

As shown in Fig. 6, the 3D reconstruction process is fulfilled by image processing, height calculation and the 3D reconstruction algorithm. The calculation algorithm and the correction process are discussed in detail for each step.

3.1. Image processing

Image acquisition and processing are critical because they provide the input data for the 3D reconstruction. As shown in Fig. 6, the images of the weld under grid laser illumination and without additional light sources are captured respectively. The weld image without additional light sources is captured by turning off the laser generator, while the camera aperture is adjusted and the other conditions remain the same. Firstly, the collected images are filtered and de-noised to obtain weld images with improved clarity, which are then converted into gray images by grayscale processing. After that, the gray-scale image under grid laser

Fig. 5. Flow chart of the calibration process.
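The step that recovers L5 in Section 2.3.2 can be checked against the thin-lens (Gaussian imaging) formula; this is a sketch using the values of Table 3, not code from the paper:

```python
# Gaussian imaging formula 1/f = 1/d + 1/L5: recover the object distance L5
# from the focal length f and the image distance d (Table 3 values, in mm).
f, d = 7.6752, 7.8500
L5 = 1.0 / (1.0 / f - 1.0 / d)
print(round(L5))   # 345 mm, consistent with Table 3
```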

Optics and Laser Technology 119 (2019) 105648

N. Jia, et al.

Fig. 6. The flow diagram for reconstruction of the 3D profile.
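The pre-processing chain of Section 3.1 (denoising, binarization, burr removal) could be sketched as below. This is a minimal illustration with an assumed global threshold and minimum component size, using SciPy rather than the authors' implementation; single-pixel skeletonization (e.g. `skimage.morphology.skeletonize`) would follow as a final step.

```python
import numpy as np
from scipy import ndimage

def preprocess(img, thresh=128, min_size=20):
    """Sketch of Sec. 3.1: denoise, binarize, and drop small 'burr' components."""
    den = ndimage.median_filter(img, size=3)       # suppress speckle noise
    bw = (den > thresh).astype(np.uint8)           # black-and-white image (0/1)
    lab, n = ndimage.label(bw)                     # connected components
    sizes = ndimage.sum(bw, lab, range(1, n + 1))  # pixel count per component
    keep = np.isin(lab, 1 + np.flatnonzero(sizes >= min_size))
    # The kept mask would then be thinned to a single-pixel skeleton,
    # e.g. with skimage.morphology.skeletonize(keep).
    return keep.astype(np.uint8)
```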

illumination is binarized to obtain a black-and-white image with pixel values of only 0 and 1. After that, skeleton thinning and burr removal algorithms are developed and used to obtain a skeleton image a single pixel in width.

3.2. Height calculation

After image processing, the skeleton image is used to calculate the height of the weld based on the triangulation method.

3.2.1. Absolute height calculation

As shown in Fig. 7, the absolute height calculation method used for the 9 × 9 grid laser is based on the triangulation method, which is divided into a direct type and an oblique type. For the 9 × 9 grid laser, line 5 is projected perpendicularly to the weld and the direct type triangulation method is used, as described in Fig. 3a. The oblique type triangulation method is used for the other laser lines, as shown in Fig. 7. Fig. 7a shows the measurement principle of lines 1–4 and Fig. 7b shows the measurement principle of lines 6–9. According to the schematic diagrams in Figs. 3 and 7, the absolute height of each point on the skeleton lines can be obtained:

Lines 1–4: HA = Hi − Δn·b·Li·cosαi / (d·sinθi + Δn·b·cosθi)   (1)

Line 5: HA = H5 + Δn·b·L5 / (d·sinθ5 − Δn·b·cosθ5)   (2)

Lines 6–9: HA = Hi + Δn·b·Li·cosαi / (d·sinθi − Δn·b·cosθi)   (3)

where d is the distance from the lens to the camera image plane; θi is the fixed angle between the optical axis of the laser beam and the optical axis of the camera lens; f is the focal length; Δn is the pixel difference and b is the pixel size. αi is the angle between the main axis of laser line 5 and the remaining laser lines, which can be calculated by Eq. (4), where α0 = 1°.

αi = (n − 5)·α0   (4)

The angle θi between the incident laser line and the lens axis is calculated by:

θi = θ5 + (n − 5)·α0   (5)
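Assuming the forms of Eqs. (1)–(5) above (together with Eqs. (6) and (7) for Hi and Li of the oblique lines), the height computation can be sketched as follows; the function and the numeric values used in checking it are illustrative, not from the paper:

```python
import math

def absolute_height(line, dn, b, d, H5, L5, theta5, alpha0=math.radians(1.0)):
    """Sketch of Eqs. (1)-(5): absolute height H_A of one skeleton point.

    line: laser line index 1..9; dn: pixel difference of the point;
    b: pixel size (mm); d: lens-to-sensor distance (mm); angles in radians.
    """
    k = line - 5
    alpha = abs(k) * alpha0                   # Eq. (4), magnitude of alpha_i
    theta = theta5 + k * alpha0               # Eq. (5)
    if k == 0:                                # line 5, direct type, Eq. (2)
        return H5 + dn * b * L5 / (d * math.sin(theta5) - dn * b * math.cos(theta5))
    # Oblique type: Hi and Li of the remaining lines per Eqs. (6)-(7).
    Hi = H5 * math.sin(theta5) / (math.cos(alpha) * math.sin(theta))
    Li = L5 + (1 if k < 0 else -1) * H5 * math.sin(alpha) / math.sin(theta)
    if k < 0:                                 # lines 1-4, Eq. (1)
        return Hi - dn * b * Li * math.cos(alpha) / (d * math.sin(theta) + dn * b * math.cos(theta))
    # lines 6-9, Eq. (3)
    return Hi + dn * b * Li * math.cos(alpha) / (d * math.sin(theta) - dn * b * math.cos(theta))
```

A quick sanity check: with Δn = 0 the direct-type formula reduces to H5, i.e. an undeflected line-5 point sits at the working distance.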


Fig. 7. Triangulation measurement method. (a) Line 1–4, (b) line 6–9, (c) optical path geometry diagram.

Fig. 8. The left and right deflection correction. (a) Deflect to the right, (b) deflect to the left.

According to the geometric relationship in Fig. 7c, Li and Hi of the remaining lines can be calculated by Eqs. (6) and (7). Li is the distance from the intersection of the optical axis of the laser beam and the optical axis of the camera lens to the front surface of the receiving lens. Hi is the distance from the intersection of the optical axis of the laser beam and the optical axis of the camera to the laser.

Hi = H5·sinθ5 / (cosαi·sinθi)   (6)

Li = L5 ± H5·sinαi / sinθi   (7)

where the "+" is used for lines 1–4 and the "−" is used for lines 6–9.

3.2.2. Height correction

As is known, it is difficult to install the sensor in an exactly vertical position. The deflection between the sensor and the workpiece brings errors which cannot be corrected by the traditional laser scan method. However, with the grid laser method used in this manuscript, the position deflection can be corrected by an algorithm based on geometric relationships, which makes hand-held measurement of welds possible with the grid laser-gray photo 3D system. Fig. 8 shows the left and right deflection of the device. Assuming that the deflection angle is β, the value of β can be obtained according to the geometric relationship of ΔABC in Fig. 8. The absolute height from any point on each skeleton to the object surface can be corrected with Eq. (8).

Hn' = Hn·cos(β − αi) / cosαi   (8)

where Hn is the absolute height from the workpiece surface to the laser before the correction; Hn' is the absolute height after the correction; β is the tilt angle; αi is the angle between the main axis of laser line 5 and the remaining laser lines. Fig. 9 shows the inside and outside deflection caused by system deflection, where γ is the deflection angle. The height difference ΔH and the pixel difference n between the initial point and the end point of any skeleton can be calculated. After that, the height difference between individual points on one skeleton can be calculated as Δh = ΔH/(n + 1). The inside and outside correction is performed by Eq.


Fig. 9. The outside and inside deflection correction. (a) Deflect to the outside, (b) deflect to the inside.

Fig. 10. Fitting curves and residual values of different orders. (a) The different order fitting curve, (b) residual values of different order fitting.
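The two deflection corrections of Eqs. (8) and (9) are simple enough to express directly; a minimal sketch (function names and test values are illustrative, not from the paper):

```python
import math

def correct_lr(Hn, beta, alpha):
    """Left/right deflection correction, Eq. (8): Hn' = Hn*cos(beta - alpha)/cos(alpha)."""
    return Hn * math.cos(beta - alpha) / math.cos(alpha)

def correct_io(Hn_prime, ny, dH, n):
    """Inside/outside deflection correction, Eq. (9): add ny steps of dH/(n + 1)."""
    return Hn_prime + ny * dH / (n + 1)
```

With zero tilt (β = 0), Eq. (8) leaves the height unchanged for any αi, which gives a quick sanity check of the formula.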

(9):

Hn'' = Hn' + ny·Δh = Hn' + ny·ΔH/(n + 1)   (9)

where Hn'' is the absolute height from the workpiece surface to the laser after the correction; Hn' is the absolute height before the correction; ny is the pixel difference from any point to the initial point on one skeleton. With the absolute height correction algorithm, the errors caused by deflection can be eliminated, which improves the accuracy of the measurement.

3.3. 3D reconstruction algorithms

As discussed in Section 3.2, the height can only be calculated for points on the skeleton. Therefore, a 3D reconstruction algorithm is proposed to get the 3D contour of the weld at points without grid lines. Firstly, the height value and the gray value are fitted in every small grid of the 9 × 9 grid laser. Secondly, the gray value is transformed into a height value according to the fitting relationship, which is used to fill the gap

Fig. 11. The fitting result of height value and gray value.



Table 4
The estimated parameters of the polynomial curve fitting.

Function   a0        a1         a2       a3        a4               a5
yi         643.747   -12.4296   0.1848   -0.0014   4.9676 × 10^-6   -7.1627 × 10^-9

Fig. 12. Noise distribution of three-dimensional contour of welds. (a) Schematic diagram of the large-scale noise, (b) schematic diagram of small-scale noise after eliminating the large-scale noise, (c) schematic diagram of the removal of small-scale noise.
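The small-scale denoising illustrated in Fig. 12 (the bilateral filtering of Section 3.4, Eqs. (15)–(18)) can be sketched for a single point as follows. The sign convention for the offsets is chosen here so that the update pulls a noisy point toward its neighborhood (conventions vary); the function and test values are illustrative, not from the paper:

```python
import numpy as np

def bilateral_point(p, n, neighbors, sigma_c, sigma_s):
    """One bilateral filtering step, sketch of Eqs. (15)-(18).

    p: point (3,); n: unit normal at p; neighbors: (k, 3) array of the
    k-nearest neighbors of p.
    """
    offsets = neighbors - p                       # taken as (k - p), see note above
    dist = np.linalg.norm(offsets, axis=1)        # ||p - k||
    h = offsets @ n                               # normal component of each offset
    wc = np.exp(-dist**2 / (2 * sigma_c**2))      # Eq. (17), spatial-domain weight
    ws = np.exp(-h**2 / (2 * sigma_s**2))         # Eq. (18), feature-domain weight
    a = np.sum(wc * ws * h) / np.sum(wc * ws)     # Eq. (16), bilateral weight factor
    return p + a * n                              # Eq. (15)
```

A point already lying on a locally planar neighborhood has zero normal offsets, so a = 0 and the point is left unchanged, which is the feature-preserving behavior the section describes.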

of every small grid. Finally, integrating each small grid yields the 3D contour of the weld. The fitting algorithm and the height filling process are introduced in detail below.

3.3.1. Fitting algorithm

Through the previous calculation, the height value at the points under grid laser illumination is calculated and the gray value of the image without additional light sources at the same coordinates is extracted. Laser lines 5 and 6 are used to illustrate the fitting algorithm for 3D reconstruction.

(i) As shown in Fig. 3b, the small grid between lines 5, 6 and lines e, f is extracted, which is named grid 1. The coordinates of points in the single-pixel skeleton line of grid 1 are denoted (x, y) and stored in matrix R. The absolute height of these points can be calculated through the triangulation method.

R = [x1, y1; x2, y2; ⋯; xn, yn]

(ii) The gray image of the weld without additional light sources is acquired. Then, the gray values of the points with the same coordinates R are extracted from this image.

(iii) The least squares method is used to fit the height values obtained in step (i) with the gray values obtained in step (ii). The height value is set as the y coordinate and the gray value is set as the x coordinate. A polynomial curve fit is used, constructed by means of addition, multiplication and exponentiation to a non-negative power, as shown in Eq. (10).

yi = a0 + a1·xi + a2·xi^2 + ⋯ + an·xi^n   (10)

The matrix form of Eq. (10) is:

yi = A·xi'   (11)

where xi' = [1, xi, xi^2, ⋯, xi^n]^T and A = [a0, a1, a2, ⋯, an]. A is the polynomial parameter vector and n is the order of the polynomial function. The polynomial parameters in Eq. (10) can be estimated by weighted least squares, minimizing the residual according to Eq. (12).

A = argmin {(1/N) Σ_{i=0}^{N−1} wi·(yi − yi*)^2}   (12)

wi is calculated by the following equation:

wi = exp[−(yi − yi*)^2 / (2σ^2)]   (13)

where A is the parameter vector to be estimated; yi is the value predicted by the model in Eq. (10); yi* is the height value calculated based on the triangulation method; N is the total number of measured data points; wi is the weight that adjusts the contribution of each data point to the fit; σ is the coefficient that controls the relationship between the fitting error and the weight.

Firstly, the weighted least squares method is used to estimate the parameter vector A with an initial estimate of the weights to optimize Eq. (12). The weight wi is an array whose elements are initialized to 1. Once the parameter vector A is estimated, it is substituted into Eq.

Fig. 13. Schematic diagram of removing small-scale noise points.


Fig. 14. Calibration of the reconstruction algorithm. (a) Image under grid laser illumination, (b) image without additional light sources, (c) absolute height, (d) height measurement error, (e) reconstruction 3D contour, (f) 3D contour after denoising.

(10) to predict yi. Finally, the weight is updated by substituting yi into Eq. (13), and the updated weight is substituted into Eq. (10) to re-estimate the parameter vector A. The algorithm iterates until the weights are stable, and vector A is then substituted into Eq. (10) to obtain the polynomial fitting relationship between the height value and the gray value. Different values of the fitting order n are used for nonlinear fitting of the height and gray values, as shown in Fig. 10a. The residuals of the different fitting orders are shown in Fig. 10b. When the fitting order is 1, the polynomial fit becomes a linear fit and the residual is 0.219. When the fitting order n is 5, the residual has its smallest value of 0.198. When n is 6, the residual increases to 1.418. Therefore, considering the residuals and the fitting effect, n is selected to be 5. Fig. 11 shows the fitting result and Table 4 provides the estimated parameters when the fitting order n is 5.

3.3.2. Height filling

Through the above steps, the height value and the gray value in a single grid are fitted to get the fitting relationship shown in Eq. (14). The gray values of the points in the grid gap, N = [g1, g2, g3, ⋯, gn], are extracted from the gray image without additional light sources and substituted into Eq. (14) to get the height values M = [y1, y2, y3, ⋯, yn].

yi = a0 + a1·xi + a2·xi^2 + a3·xi^3 + a4·xi^4 + a5·xi^5   (14)

The height values M constitute the contour of a single grid area. After all the small grids are calculated, the data are integrated to obtain the 3D profile of the object.
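The iteratively re-weighted fit of Eqs. (10)–(13) and the height filling of Eq. (14) can be sketched with NumPy as below; σ, the iteration count and the test data are illustrative, not values from the paper:

```python
import numpy as np

def robust_polyfit(x, y, order=5, sigma=5.0, iters=10):
    """Sketch of Eqs. (10)-(13): weighted polynomial fit with iterative reweighting.

    Minimizes sum w_i (y_i - yhat_i)^2 with w_i = exp(-(y_i - yhat_i)^2 / (2 sigma^2)),
    weights initialized to 1. Returns the coefficient vector a0..an.
    """
    X = np.vander(x, order + 1, increasing=True)   # columns [1, x, x^2, ...]
    w = np.ones_like(y, dtype=float)               # initial weights, Eq. (12)
    for _ in range(iters):
        sw = np.sqrt(w)                            # weighted least squares solve
        a, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        r = y - X @ a                              # residuals against the model
        w = np.exp(-r**2 / (2 * sigma**2))         # update weights, Eq. (13)
    return a

def fill_heights(a, gray):
    """Sketch of Eq. (14): map gray values in a grid gap to height values."""
    return np.vander(np.asarray(gray, dtype=float), len(a), increasing=True) @ a
```

The reweighting step drives the weight of points with large residuals toward zero, so an outlier barely influences the final coefficients while inlier points are fitted almost exactly.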



Fig. 15. The collected weld images. (a) Weld image without additional light sources of weld 1, (b) weld image without additional light sources of weld 2, (c) and (d) the corresponding weld images under grid laser illumination.

3.4. Three-dimensional contour de-noising

During the reconstruction of the weld profile, some abnormal points appear in the three-dimensional fitting process. The noise points that deviate from the main body of the profile must be removed. As shown in Fig. 12a, a few noise points have large-scale errors and lie far away from the profile of the weld. A threshold value was set to remove this large-scale noise, and the result is shown in Fig. 12b. After the large-scale noise is removed, small-scale noise remains on the surface of the profile (Fig. 12b), which mainly appears in the filling process of the height values during the three-dimensional reconstruction. The bilateral filtering method was used to remove the small-scale noise while maintaining the characteristics of the weld and avoiding excessive smoothing of the weld surface. The bilateral filtering algorithm and smoothing steps are as follows:

pi' = pi + a·n   (15)

where pi is the original data point, pi' is the filtered data point, n is the normal vector direction of pi, and a is the bilateral weight factor, defined as

a = Σ_{kij∈N(pi)} Wc(‖pi − kij‖)·Ws(⟨ni, pi − kij⟩)·⟨ni, pi − kij⟩ / Σ_{kij∈N(pi)} Wc(‖pi − kij‖)·Ws(⟨ni, pi − kij⟩)   (16)

Wc(x) = exp(−x^2 / (2σc^2))   (17)

Ws(y) = exp(−y^2 / (2σs^2))   (18)

Wc is the bilateral filtering weight function, called the spatial domain weight. Ws is used to capture the normal change between adjacent points, called the feature domain weight. σc controls the smoothness and σs controls the degree of feature retention; they are the standard deviations of the Gaussian functions of the spatial domain and the feature domain, respectively. Firstly, the adjacent points of pi among its k-nearest neighbors are determined, as shown in Fig. 13. Then the geometric distance ‖pi − kij‖ from data point pi to the adjacent point kij is calculated, and the inner product ⟨ni, pi − kij⟩ is also obtained. After that, the values of Wc and Ws are calculated to obtain the bilateral weight factor a. Finally, the original data are replaced with the new data until all data points are processed. As shown in Fig. 12c, the small-scale noise is filtered out.

3.5. Calibration of reconstruction algorithm

In order to calibrate the proposed 3D reconstruction algorithm, a curved surface similar to the weld surface was used for testing. Fig. 14a and b show the images of the curved surface under grid laser illumination and without additional light sources, respectively. Fig. 14c shows the absolute height of the curved surface calculated by the triangulation


method. Fig. 14d shows the error between the true height and the measured height. Fig. 14e and f are the reconstructed three-dimensional contour before and after the denoising process, respectively. As shown in Fig. 14d, the error of the height calculation on the curved paper is ±0.06 mm. Comparing the actual curved surface topography with the reconstructed three-dimensional topography, the recovered results were in good agreement with the actual shape of the curved paper. Therefore, the proposed 3D reconstruction algorithm is reliable and the reconstruction result has sufficient accuracy.

4. Visual inspection of welds with the 3D method

4.1. Weld image acquisition and processing

The proposed method was used to perform the three-dimensional reconstruction of welds formed at preset welding parameters. In Fig. 15a, the weld seam is obtained with stable welding current and arc voltage, named weld 1. In Fig. 15b, the weld seam is obtained by reducing the welding current and arc voltage, named weld 2. As shown in Fig. 15c and d, when the grid laser illuminates the surface of the weld, specular reflection occurs, which brings strong noise affecting the image segmentation, recognition and extraction. Therefore, the noise must be removed in the binarization process by separating the target image from the background with a locally selected threshold value. Fig. 16a and b are the binarization and skeletonization results of weld 1. The deformation of each skeleton line perpendicular to the weld is nearly the same. Fig. 16c and d show the binarized image and the skeleton image of weld 2 after burr removal. The deformation of each skeleton line changes as the shape of the weld changes.

Fig. 16. Resulting images of weld after image processing. (a) Binary image of weld 1, (b) single-pixel skeletonized image of weld 1, (c) binary image of weld 2, (d) single-pixel skeletonized image of weld 2.

4.2. Absolute height calculation of weld

After obtaining the skeletons of the welds, the absolute height of the weld surface was calculated and corrected. Fig. 17a and b are the corresponding three-dimensional contours of welds 1 and 2, respectively. The true height was measured by a weld gauge with an accuracy of 0.01 mm and the measured height was obtained by the proposed system. The errors between the true height and the measured height are shown in Fig. 18. For weld 1, the error between the true height and the measured height is ±0.07 mm, as shown in Fig. 18a. Fig. 18b shows the error distribution of weld 2, where the error is ±0.09 mm. The accuracy of the measurement method is acceptable for industrial application.

4.3. Three-dimensional reconstruction results

The 3D reconstruction algorithm described in Section 3.3 is used to acquire the contour of the weld surface. The reconstructed three-dimensional contours of weld 1 and weld 2 are shown in Fig. 19a and b, respectively. The three-dimensional contours are then denoised and the results are shown in Fig. 19c and d, which correspond to the welds shown in Fig. 15a and b.

5. Conclusion

(1) A new 3D reconstruction method has been proposed based on fusion analysis of images of the object under orthogonal structured grid laser illumination and without additional light sources. The orthogonal structured grid laser used in the detection system enables a full view of the object in a single shot.
(2) The 3D reconstruction process was fulfilled by the image

Fig. 17. The results of the weld absolute height calculation. (a) The absolute height of weld 1, (b) the absolute height of weld 2.


Fig. 18. The height calculation and error distribution. (a) Weld height value calculation and error distribution of weld 1, (b) weld height value calculation and error distribution of weld 2.

Fig. 19. Three-dimensional reconstruction of welds. (a) Three-dimensional contour of weld 1, (b) three-dimensional contour of weld 2, (c) three-dimensional contour of weld 1 after denoising, (d) three-dimensional contour of weld 2 after denoising.

12

Optics and Laser Technology 119 (2019) 105648

N. Jia, et al.

processing, height calculation and 3D reconstruction algorithm. The bilateral filtering algorithm has been used for 3D reconstruction of weld and the obtained weld contour was consistent with the actual profile of weld. (3) The 3D reconstruction algorithm was performed on the curved surface and industrial welds. Based on system calibration, an error less than ± 0.06 mm was yielded for curved surface measurement and ± 0.09 mm for welds measurement. The low-cost measurement system can provide an acceptable measurement accuracy which makes it liable to be used in industry. In the future work, we will use a more precise point clouds device to achieve accurate ground truth for calibration of the system. (4) The height value obtained by the 3D reconstruction algorithm in this paper is important to detect defects such as lack of fusion, incompletely filled. However, a traditional visual inspection test is still required to detect defects such as surface cracks and surface breaking porosity.
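Conclusion (2) mentions that a bilateral filter was used to denoise the reconstructed contour. The following is a minimal NumPy sketch of such an edge-preserving filter applied to a height map; it is an illustrative implementation under assumed parameters, not the authors' code, and the toy ridge data stands in for a real weld bead.

```python
import numpy as np

# Hedged sketch of the denoising step: a simple bilateral filter on a
# reconstructed height map, assumed to be a 2D float array in millimetres.
def bilateral_filter(z, radius=2, sigma_s=1.5, sigma_r=0.1):
    """Edge-preserving smoothing of a height map `z` (2D ndarray)."""
    z = np.asarray(z, dtype=float)
    pad = np.pad(z, radius, mode='edge')
    out = np.zeros_like(z)
    # Precompute the spatial (domain) Gaussian weights once.
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_s = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_s**2))
    rows, cols = z.shape
    for i in range(rows):
        for j in range(cols):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights penalise large height differences, so sharp
            # features such as the weld toe are preserved while noise is
            # averaged out.
            w_r = np.exp(-(patch - z[i, j])**2 / (2.0 * sigma_r**2))
            w = w_s * w_r
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out

# Toy height map: a smooth ridge (weld-like bead) plus Gaussian noise.
rng = np.random.default_rng(1)
z = np.exp(-((np.arange(60) - 30) / 8.0)**2)[None, :].repeat(40, axis=0)
noisy = z + rng.normal(0.0, 0.03, z.shape)
smooth = bilateral_filter(noisy)
print(np.abs(smooth - z).mean() < np.abs(noisy - z).mean())
```

The range parameter `sigma_r` is the key design choice: it should be larger than the measurement noise but smaller than the height steps that must be preserved, which is what makes the bilateral filter better suited here than a plain Gaussian blur.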

Acknowledgement

The work described in this paper was financially supported by the Shanxi provincial key research and development plan (No. 201603D321122). The authors thank Liyu Fan, Juntao Yang and Hang Liu for their technical help with system design and programming.

Appendix A. Supplementary material

Supplementary data to this article can be found online at https://doi.org/10.1016/j.optlastec.2019.105648.
