
WD2-02

Proceedings of the 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics
Monterey, California, USA, 24-28 July 2005

Development of an Automatic Optical Measurement System for Automotive Part Surface Inspection

Quan Shi, Ning Xi, and Yifan Chen

The work described in this paper is supported under NSF Grants IIS-9796300, IIS-9796287, EIA-9911077, and DMI-0115355, and by a Ford Motor Company university research project. Q. Shi is with the Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48823, USA (shiquan@egr.msu.edu). N. Xi is with the Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48823, USA (xin@egr.msu.edu). Y. Chen is with the Scientific Research Lab, Ford Motor Company, Dearborn, MI, USA (ychen1@ford.com).

Abstract: This paper introduces an automated 3D optical measurement system. In industrial inspection, Coordinate Measurement Machines (CMMs) provide accurate measurements but are very time consuming because only one point can be acquired at a time. An area-sensor-based robotic 3D measurement system can acquire an automotive part's surface patch by patch, which reduces the inspection time significantly. This paper describes a pixel-to-pixel sensor calibration scheme and a bounding-box-based sensor planning method, developed for the automatic optical measurement of large and complex surfaces. Experimental results are presented and discussed.

I. INTRODUCTION

The manufacturing industry needs a rapid product dimensional inspection process [1]. Instead of measuring an object point by point using a Coordinate Measurement Machine (CMM), one would prefer to measure an object's surface patch by patch using an area sensor. Much research has been done on 3D shape measurement [2], but to bring a rapid inspection system into industrial application, five requirements need to be satisfied: speed, high accuracy, high precision, high point density, and low cost. Precision is defined here as the minimum surface height difference that can be measured by an area sensor.

To reduce the time of an inspection process, vision-based surface measurement methods have attracted much attention, and a variety of 3D sensors, line or area scanners using laser light or white light projection, have been developed. Usually, experienced technicians are required to control illumination conditions, set the 3D sensor at different viewpoints, and recalibrate the system. Therefore, an automated dimensional inspection system is essential to the manufacturing industry.

A robotic area sensing system has been developed for the dimensional measurement of industrial parts. The system is illustrated in Figure 1. We developed an area sensor prototype and an area sensor planner to implement this robotic area sensing system, which can generate 3D point clouds of a part automatically.
[Fig. 1. A robotic area sensing system for dimensional inspection. The area sensor model and the CAD model feed the area sensor planner; measurement produces 3D point clouds, which are integrated to generate an error map.]

The Gray Code and Line Shifting (GCLS) method can give high sensor precision and accuracy [3][4]. Precision specifies a limit that an area sensor can reach, while accuracy specifies the ability of an area sensor to report the truth of a part's 3D dimensions; an area sensor with high accuracy may not have high measurement precision.
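To make the Gray-code half of GCLS concrete, the sketch below generates binary Gray-code stripe patterns for a projector and decodes the bit sequence observed at one camera pixel back into a stripe index. This is a minimal sketch only: the line-shifting refinement is omitted, and the function names and pattern layout are our illustrative assumptions, not details taken from [3][4].

    import numpy as np

    def gray_encode(n: int) -> int:
        # Binary-reflected Gray code of integer n.
        return n ^ (n >> 1)

    def gray_decode(g: int) -> int:
        # Invert the Gray code by cumulative XOR over the shifted bits.
        n = 0
        while g:
            n ^= g
            g >>= 1
        return n

    def gray_patterns(num_stripes: int, width: int) -> np.ndarray:
        # One binary image per bit plane; a column is lit where the
        # Gray code of its stripe has that bit set.
        bits = max(1, int(np.ceil(np.log2(num_stripes))))
        codes = np.array([gray_encode(s) for s in range(num_stripes)])
        stripe_of_col = (np.arange(width) * num_stripes) // width
        col_codes = codes[stripe_of_col]
        return np.stack([(col_codes >> b) & 1 for b in range(bits - 1, -1, -1)])

    # Decode the thresholded bits seen at one camera pixel (MSB first):
    observed = [0, 1, 1, 0, 1]
    g = 0
    for bit in observed:
        g = (g << 1) | bit
    stripe_index = gray_decode(g)

A Gray code changes only one bit between adjacent stripes, so a threshold error at a stripe boundary displaces the decoded index by at most one stripe, which is one reason such codes give robust precision.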
Calibration of an area sensor has been discussed for years. Trobina [5] developed an error model for an active range sensor, in which the 3D sensor model was built on camera perspective geometry and a maximum likelihood estimation method. A multiple-plane method [6] was developed to calibrate the lens distortion of a projector for 3D range sensors. A relative error model [7] was proposed for calibration of the active triangulation sensing model. Recently, a self-calibration method [8] was developed for an active 3D sensing system; the distance between corresponding points was analyzed, and the rectification of stripe locations was formulated to calculate the distance between pairs of corresponding points.

The pinhole camera model has been widely used in area sensor calibration. In the resulting area sensor triangulation model, all incident light beams are emitted from one point, the pupil of the projector lens, and all recorded light beams are considered to pass through the pupil of the camera lens; these two pupils form the end points of the baseline of the active sensing triangle. After calibration, all points on the image plane are computed from this triangular geometry, so the triangulation model relies on a well-calibrated lens. Camera calibration requires a nonlinear least-squares estimation using Levenberg-Marquardt optimization, and without a good initial parameter estimate the optimization can settle in a local minimum and the calibration may fail [9][10]. Even when calibration succeeds, the calibration residue may not be tolerable for part inspection, so off-the-shelf lenses may not be usable in a commercial 3D sensor. A pixel-to-pixel strategy has therefore been developed for calibrating off-the-shelf lenses: instead of calibrating the frames of a camera and a projector using the pinhole model, each pair of corresponding points is calibrated. The area sensor parameters are then no longer a group of constants but a group of matrices indexed by pixel pairs.



To get a 3D point cloud, the area sensor needs to be calibrated in 3D space. Usually, the baseline between the camera and the projector has an offset angle to the direction of the stripes; simulation analysis shows that this angle significantly affects the measurement accuracy. Because of the camera's perspective geometry, a transformation from measured 2.5D height maps to 3D point clouds is also necessary for part shape measurement. An exploding vector method is developed in this paper for calibrating that transformation.

Sensor planning has been an active research area [11][12]. Because a prior CAD model embeds fidelity into the planning strategy, model-based sensor planning methods are more reliable for inspection systems than non-model-based methods. Knowledge-based methods [13][14] require expert information. The generate-and-test approach can reduce sensor planning to a search problem [15]. Cowan and Kovesi [16] formulated task constraints as search functions and solved the task as a constraint-satisfaction problem; one limitation is that the solution space is confined because few degrees of freedom in orientation are considered. Tarabanis et al. [17] developed the MVP system, which integrates robot vision task constraints with CAD models and optical sensor models. Sheng et al. [18] developed a CAD-based robot motion planning system by combining the generate-and-test and constraint-satisfaction approaches. One problem in those systems is that the sensor is treated as a single camera, or as equivalent to a single camera model.

In 3D sensing using the structured light method, the triangulation between the projector and the camera makes the area sensor model different from a single camera model. Therefore, placing an area sensor is a different problem from placing a camera: to estimate area sensor viewpoints, the task constraints of both the camera and the projector have to be satisfied. For planning 3D sensors, Prieto et al. [19] developed a 3D laser camera and a NURBS (Non-Uniform Rational B-Splines) based sensor planner, in which viewpoints are projected onto a 2D bitmap and occlusion and collision problems are solved using a 3D voxel model. Chen and Li [20] developed a model-based sensor planning system by mounting two video cameras on a PUMA robot; an equivalent viewing model couples the constraints of both cameras for sensor planning.

II. AUTOMATED OPTICAL 3D MEASUREMENT SYSTEM

A. Measurement using a Digital Area Sensor

The configuration of the area sensor and the measurement principle are shown in Figure 2, where h represents the distance from a surface point D to the reference plane, S is the standoff from the camera frame to the reference plane, and d represents the baseline distance from the camera to the projector. From Figure 2, the following equation can be obtained:

h = \frac{L_{AC}\, S}{d + L_{AC}} \qquad (1)

[Fig. 2. Measurement configuration of an area sensor using the structured light method]

Equation (1) shows how to calculate the height from a surface point to the measurement reference plane. The task of image analysis is to find the corresponding points A and C so that the distance L_{AC} can be obtained; the relative surface height h can then be calculated. As reported in [4], the GCLS method has better performance in accuracy, precision, and point density for 3D shape measurement.

B. Calibration of a Digital Area Sensor

The positioning error of the area sensor needs to be calibrated: the frames of the camera and the projector are not easy to align accurately, and the distortion of an off-the-shelf lens causes measurement errors. A pixel-to-pixel calibration method has been developed based on the corresponding points. As shown in Figure 3, for each pixel on the projector plane there is a corresponding virtual point on the camera side. With this pixel-to-pixel method, d and S are no longer constants but matrices. Equation (2) gives the relative height, in which each digitized surface point (x_i, y_i) has its own pair of sensor parameters d_{(x,y)} and S_{(x,y)}:

h_{(x,y)} = \frac{L_{(x,y)}\, S_{(x,y)}}{d_{(x,y)} + L_{(x,y)}} \qquad (2)

[Fig. 3. Calibration of sensor triangulation using the pixel-to-pixel strategy]
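As a small illustration of equations (1) and (2), the sketch below evaluates the triangulation per pixel, with d and S stored as matrices. The array names are ours; the paper does not prescribe an implementation.

    import numpy as np

    def height_map(L, d, S):
        # Equation (2): per-pixel relative height h = L*S / (d + L).
        # L: (H, W) measured correspondence distances; d, S: (H, W)
        # calibrated per-pixel baseline and standoff matrices.
        return L * S / (d + L)

    # With scalar d and S this reduces to the ideal triangulation of
    # equation (1): h = L_AC * S / (d + L_AC).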

Because calibration is done independently at each pixel, the distortion problem no longer affects the calibration of (d, S), and the calibration process is very simple yet accurate. A multiple-plane method and a linear least-squares method have been developed to calibrate the sensor parameter matrices d_{(x,y)} and S_{(x,y)}. Equation (3) shows the calibration formula: by shifting the reference board up multiple times, a set of heights h and lengths L can be measured, and (d, S) can then be obtained from this calibration equation. Since equation (2) rearranges to h d - L S = -h L at each pixel, the n board positions stack into the linear system

\begin{bmatrix} h_1(i,j) & -L_1(i,j) \\ \vdots & \vdots \\ h_n(i,j) & -L_n(i,j) \end{bmatrix} \begin{bmatrix} d(i,j) \\ S(i,j) \end{bmatrix} = \begin{bmatrix} -h_1(i,j)\,L_1(i,j) \\ \vdots \\ -h_n(i,j)\,L_n(i,j) \end{bmatrix} \qquad (3)
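A minimal sketch of the per-pixel least-squares solve behind equation (3): each lift of the reference board contributes one linear equation in the unknowns d(i,j) and S(i,j). The function name and the use of NumPy's solver are our choices.

    import numpy as np

    def calibrate_pixel(h_obs, L_obs):
        # Equation (2) rearranges to h*d - L*S = -h*L; one row per
        # board position, solved in the least-squares sense (eq. (3)).
        h = np.asarray(h_obs, dtype=float)
        L = np.asarray(L_obs, dtype=float)
        A = np.column_stack([h, -L])
        b = -h * L
        (d, S), *_ = np.linalg.lstsq(A, b, rcond=None)
        return d, S

Running this at every pixel (i, j) yields the parameter matrices d_{(x,y)} and S_{(x,y)} used in equation (2).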

Figure 3 only shows the offset of the area sensor in the plane perpendicular to the reference plane. An offset angle θ also exists in a plane parallel to the reference plane, as can be seen in Figure 4. Since the projected stripes carry the same code along one direction of the pattern, a corresponding point may fall at an arbitrary position along its stripe. To eliminate this arbitrariness, the camera and the projector would have to be aligned so that the sensor baseline is perpendicular to the projection stripes, which cannot be guaranteed in practice. Similar to the pixel-to-pixel calibration method, these offset angles need to be calibrated individually to eliminate the arbitrariness in finding corresponding points.

[Fig. 4. Area sensor offset angle θ in the XY plane]

In Figure 4, point M is the desired camera position, which makes the baseline perpendicular to the projection stripes. For a point on top of a gauge of uniform height, points A and C are the corresponding points for measuring the relative height from the part surface to the reference; they lie on the intersection line between the triangulation plane and the reference plane, separated by the length L_{(x,y)}. In reality, however, the camera is set up at point M', so the triangulation plane has a different intersection line, rotated by an angle θ from the desired one. The corresponding points A' and C' of a surface point at the same height above the reference plane are then separated by a different length L'_{(x,y)}. This difference in L_{(x,y)} causes an error Δh_{(x,y)} in the depth measured by equation (2). Equation (5) gives the relationship between the offset angle and the measurement error of L_{(x,y)}, and equation (6) gives the resulting height error:

L'_{(x,y)} = L_{(x,y)} \sec\theta \qquad (4)

\Delta L_{(x,y)} = L_{(x,y)} - L'_{(x,y)} = (1 - \sec\theta)\, L_{(x,y)} \qquad (5)

\Delta h_{(x,y)} = h'_{(x,y)} - h_{(x,y)} = \frac{L'_{(x,y)}\, S_{(x,y)}}{d_{(x,y)} + L'_{(x,y)}} - \frac{L_{(x,y)}\, S_{(x,y)}}{d_{(x,y)} + L_{(x,y)}} \qquad (6)

Tables I and II show simulated values of the measurement errors caused by different offset angles θ; the sensor parameters are S = 490.19 mm, d = 160.91 mm, and R = 0.165 mm/pixel. When h is constant, increasing θ increases the measurement error significantly; when θ is constant, the higher the surface is above the reference, the larger the error. The magnitude of these errors shows that it is necessary to calibrate the offset angle θ, which is done in two steps: 1) detect the group of corresponding points A' and C', and 2) calculate the offset angle θ with respect to the axis of the image plane.

TABLE I
MEASUREMENT ERROR OF PIXEL DISTANCE ΔL_(x,y) (pixel)

              θ1 = 2°    θ2 = 4°    θ3 = 6°    θ4 = 8°
h1 = 20 mm     0.025      0.101      0.223      0.408
h2 = 50 mm     0.068      0.270      0.610      1.089
h3 = 80 mm     0.116      0.464      1.048      1.870

TABLE II
MEASUREMENT ERROR OF DEPTH Δh (mm)

              θ1 = 2°    θ2 = 4°    θ3 = 6°    θ4 = 8°
h1 = 20 mm     0.0117     0.0468     0.1057     0.1885
h2 = 50 mm     0.0274     0.1102     0.2472     0.4408
h3 = 80 mm     0.0408     0.1634     0.3684     0.6568
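The entries of Tables I and II follow directly from equations (4)-(6); the sketch below reproduces them for the stated parameters. Only the conversion of L from millimetres to pixels through R is our reading of the setup.

    import numpy as np

    S, d, R = 490.19, 160.91, 0.165   # mm, mm, mm/pixel

    def offset_errors(h, theta_deg):
        # Errors caused by a baseline offset angle theta.
        L = h * d / (S - h)                     # invert eq. (1) for L
        Lp = L / np.cos(np.radians(theta_deg))  # eq. (4): L' = L sec(theta)
        dL_pix = (L - Lp) / R                   # eq. (5), in pixels
        hp = Lp * S / (d + Lp)                  # height from the wrong L'
        return abs(dL_pix), hp - h              # |dL| (pixel), dh (mm)

    print(offset_errors(20, 2))   # ~(0.025, 0.0117), matching the tables
    print(offset_errors(80, 8))   # ~(1.870, 0.6568)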
Equation (2) only measures a height map, also called a 2.5D map, of a 3D surface. The task of area sensing for dimensional inspection requires a 3D point cloud, which carries not only depth but also the length and width information of the 3D surface. An exploding vector method has been developed to detect the x, y coordinates of the measured 3D points.

[Fig. 5. Exploding vector method for transformation from a height map to a 3D point cloud]

Figure 5 illustrates a vector on the image plane which corresponds to the depth change of a point along the Z direction. In Figure 5, points B and C have the same image coordinate P_x2 but different x coordinates, x1 and x2 respectively. Without depth information, it is impossible to tell the difference between the x coordinates of points B and C.

In the other situation, points C and D have the same x coordinate x2 but different image coordinates P_x1 and P_x2. A zooming phenomenon appears on the image plane when an object moves along the viewing direction toward the camera: a 3D point flying toward the camera traces an expansion track, which we call an exploding vector V. We utilize exploding vectors to calibrate the x, y coordinates given the measured depth h_{(x,y)}: as in the pixel-to-pixel calibration, a group of exploding vectors is calibrated independently so that the x, y location of a 3D surface point can be projected back to the reference plane.

[Fig. 6. Calibration of X, Y coordinates using exploding vectors: (a) simulation result, (b) measured results]

Figure 6(a) shows the exploding phenomenon: three groups of calibration points are detected with sub-pixel accuracy, and the images are stacked together to show the exploding vectors. In the calibration, as many as 30 groups of calibration points are used to calibrate the vector K_{(x,y)}. Figure 6(b) shows the simulation result of using the calibrated exploding vector K_{(x,y)} to estimate the locations of corresponding points; the estimates are very close to the measured results.
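The paper does not spell out the algebraic form of the exploding-vector correction, so the following sketch assumes the simplest model consistent with Figure 5: each pixel stores a calibrated vector K(x,y) along which the image point slides in proportion to depth, and the measured height slides it back to the reference plane. The linearity assumption and all names are ours.

    import numpy as np

    def back_project(px, py, h, Kx, Ky):
        # Slide each measured image point back to the reference plane
        # along its calibrated exploding vector (displacement assumed
        # linear in the measured height h of equation (2)).
        # px, py, h, Kx, Ky: (H, W) arrays.
        return px - Kx * h, py - Ky * h

The corrected reference-plane coordinates are then mapped to physical x, y by the frame transformation of equation (7) below.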
A physical coordinate system can be built on the reference plane by using a checkerboard. The transformation from the image frame to this physical Cartesian frame can be obtained through a linear least-squares method, as shown in equation (7):

\begin{bmatrix} P_x \\ P_y \end{bmatrix} = T_S\, T_R \begin{bmatrix} I_x \\ I_y \end{bmatrix} + \begin{bmatrix} d_x \\ d_y \end{bmatrix} \qquad (7)

where (I_x, I_y) are image coordinates, (d_x, d_y) is the translation from the physical origin to the image origin, T_S represents the scaling transformation, and T_R represents the rotation transformation.
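Equation (7) is a similarity transform (uniform scale, rotation, translation) between the image and physical frames. Writing a = s cos(phi) and b = s sin(phi) keeps the fit linear, so it can be estimated from checkerboard correspondences by ordinary least squares, as sketched below; the function name and parameterization are ours.

    import numpy as np

    def fit_similarity(I, P):
        # Fit P = s*R(phi)*I + t from (n, 2) matched point arrays.
        # Each pair gives: Px = a*Ix - b*Iy + tx,  Py = b*Ix + a*Iy + ty.
        Ix, Iy = I[:, 0], I[:, 1]
        n = len(I)
        A = np.zeros((2 * n, 4))
        A[0::2] = np.column_stack([Ix, -Iy, np.ones(n), np.zeros(n)])
        A[1::2] = np.column_stack([Iy,  Ix, np.zeros(n), np.ones(n)])
        (a, b, tx, ty), *_ = np.linalg.lstsq(A, P.reshape(-1), rcond=None)
        return np.hypot(a, b), np.arctan2(b, a), np.array([tx, ty])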
C. Area Sensor Planning

In manufacturing, a dimensional inspection system requires the area sensor to be configured at multiple viewpoints automatically. An area sensor planner takes the CAD model and the area sensor model as inputs and outputs a group of sensor configurations that satisfy the constraints of 3D measurement. Currently, five constraints are satisfied in the developed area sensor planner:
1) Visibility demands that the entity be unoccluded from the area sensor along the lines of projection and collection.
2) Field of view determines the size of the maximum inspection area. It is usually a rectangular field; in the developed 3D sensor planner it is determined by the standoff and the camera viewing angles.
3) Resolution defines the minimal dimension of the entity to be mapped onto one camera pixel.
4) Point density is a constraint determined by the field of view and the resolution of the projector. It is a new constraint developed for the automated 3D sensor planning system and ensures that enough points can be measured for a given surface area.
5) Focus defines the farthest and the nearest measurement distances from a viewpoint.

A bounding box method has been developed to integrate all of these constraints when searching for potential viewpoints that satisfy a tolerated measurement error, as shown in Figure 7.

[Fig. 7. Integrating planning constraints using a bounding box method. The bounding box of the meshed CAD patch must lie inside the camera and projector field-of-view regions and focus regions.]

The viewpoint search algorithm can be described as a process of satisfying the camera constraints under the condition that the projector constraints are satisfied. An optimization process that minimizes the inspection error helps to obtain the potential viewpoints:

\min_{P} \sum_{i=1}^{N} F\left(C_{i1}, C_{i2}, C_{i3}, C_{i4}, C_{i5} \in R_c \,\middle|\, P_{i1}, P_{i2}, P_{i3}, P_{i4}, P_{i5} \in R_p\right) \qquad (8)

where P represents the potential viewpoints, N represents the number of meshed triangles in the inspected field, and C_{i1},...,C_{i5} and P_{i1},...,P_{i5} are the camera and projector constraints, respectively. If a solution P exists that satisfies equation (8) and the overall measurement error of the bounded patch can be tolerated, a viewpoint is configured and assigned to this bounding box. If no solution exists, the inspected patch and its bounding box are split, and each sub-patch is tested with the same search algorithm to assign a viewpoint. This process executes recursively until a solution is reached. Figure 8 shows this recursive search process for obtaining viewpoints that satisfy the constraints; a code sketch follows.
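The recursive search of equation (8) and Figure 8 can be sketched as follows. The constraint predicates, the viewpoint sampler, and the patch interface (split(), tolerance) are placeholders of ours; only the test-then-split control flow is taken from the paper.

    def plan_viewpoints(patch, sample_viewpoints, camera_ok, projector_ok, error_of):
        # Try to assign one viewpoint to the bounding-boxed patch;
        # split the patch and recurse when no tolerable viewpoint exists.
        feasible = [v for v in sample_viewpoints(patch)
                    if projector_ok(v, patch) and camera_ok(v, patch)]
        if feasible:
            best = min(feasible, key=lambda v: error_of(v, patch))
            if error_of(best, patch) <= patch.tolerance:
                return [(patch, best)]
        return [pair
                for sub in patch.split()
                for pair in plan_viewpoints(sub, sample_viewpoints,
                                            camera_ok, projector_ok, error_of)]

Termination relies on sub-patches eventually becoming small enough that some sampled viewpoint covers them within tolerance.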

[Fig. 8. A recursive viewpoint searching algorithm: tessellated CAD model; partition into patches based on the area sensor model; bounding box generation; calculation of the viewpoint region of the bounding box; search for the optimal viewpoint pose; output of the viewpoint.]

III. EXPERIMENTAL IMPLEMENTATION

Area sensor calibration was performed on a DYNA 2400C NC machine; a reference board was fixed on the platform of the NC machine and can be moved along three directions. The area sensor is built from a Sony XCD710 camera and a Plus V-1100 digital projector. A plano-convex lens is mounted in front of the projector lens, which enables the projected image to be focused at about 500 mm from the projector. The inspection field is 160 mm x 120 mm. A Matrox image adapter is installed in a PC for image collection, and the software for grabbing and analyzing images was developed in C++ and Matlab. Figure 9 shows the calibration setup.

[Fig. 9. Calibrating the area sensor on the DYNA 2400C NC machine]

Figure 10 shows the detected calibration checkerboard used for the transformation from 2.5D height maps to 3D point clouds. A measurement of a flat surface of known height was used to evaluate the calibration result. The statistics are given in Table III: as more calibration points are applied, the mean measurement error approaches 0 mm, with a standard deviation of 0.0162 mm.

[Fig. 10. Calibration of the transformation matrix from 2.5D height maps to 3D point clouds]

TABLE III
MEASUREMENT ACCURACY AFTER CALIBRATION

No. of calibration points    Mean (mm)    Std (mm)
4096                         0.0018       0.0261
16384                        0.0011       0.0185
65536                        0.0006       0.0162

A well-fabricated gauge was used to evaluate dimensional measurement after area sensor calibration. The gauge shown in Figure 11(a) has four slots; from left to right, their depths are 32 ± 1 µm, 24 ± 1 µm, 14 ± 2 µm, and 6 ± 2 µm. Figure 11(b) is a point cloud measured before calibration, and Figure 11(c) shows the result measured after calibration. It can be seen that our area sensor prototype is able to detect a depth change as small as 14 µm.

[Fig. 11. Measurement evaluation on a gauge: (a) a well-fabricated gauge for calibration evaluation; (b) point cloud measured before calibration; (c) point cloud measured after calibration]
Figure 12 shows an implementation of the robotic area sensing system for the optical dimensional measurement of a door panel in the automotive industry. Figure 13 and Figure 14 demonstrate the measured point clouds.

[Fig. 12. Dimensional measurement of a door panel part using the automated area sensing system]

[Fig. 13. 3D shape measurement of a surface with a small depth range]

[Fig. 14. 3D shape measurement of a surface with a big depth range]

IV. CONCLUSION

Methods for implementing an automated dimensional optical measurement system have been presented in this paper. A prototype area sensor has been developed, and a series of calibration methods for the robotic area sensing system have been described. The pixel-to-pixel calibration strategy effectively captures the sensor positioning errors, which exist not only in the plane perpendicular to the reference but also in the plane parallel to it. To transform a height map into a 3D point cloud, an exploding vector method has been developed that calibrates the transformation using the measured depth information. A modified bounding box method has been developed for planning the area sensor, integrating the constraints of both the camera and the projector. An automated 3D surface measurement system was then implemented on a PUMA robot, and experimental results show its successful testing on an industrial part. The developed methods are not constrained to measuring automotive parts and can be extended to many areas that require fast and accurate dimensional measurement of 3D surfaces.

REFERENCES

[1] F. Chen, G. M. Brown, and M. Song, "Overview of three-dimensional shape measurement using optical methods," Optical Engineering, Vol. 39, No. 1, pp. 10-22, 2000.
[2] R. Jarvis, "Range sensing for computer vision," in Advances in Image Communications, Elsevier Science Publishers, Amsterdam, pp. 17-56, 1993.
[3] J. Salvi, J. Pagès, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recognition, Vol. 37, No. 4, pp. 827-849, 2004.
[4] J. Gühring, "Dense 3-D surface acquisition by structured light using off-the-shelf components," Videometrics and Optical Methods for 3D Shape Measurement, Proc. SPIE 4309, pp. 220-231, 2001.
[5] M. Trobina, "Error model of a coded-light range sensor," Technical Report BIWI-TR-164, Sept. 21, 1995.
[6] T. S. Shen and C. H. Meng, "Digital projector calibration for 3-D active vision systems," Transactions of the ASME, Vol. 124, pp. 126-134, Feb. 2002.
[7] G. Sansoni, M. Carocci, and T. Rodella, "Calibration and performance evaluation of a 3-D imaging sensor based on the projection of structured light," IEEE Transactions on Instrumentation and Measurement, Vol. 49, No. 3, pp. 628-636, 2000.
[8] Y. F. Li and S. Y. Chen, "Automatic recalibration of an active structured light vision system," IEEE Transactions on Robotics and Automation, Vol. 19, No. 2, pp. 259-268, 2003.
[9] J. Weng, P. Cohen, and M. Herniou, "Camera calibration with distortion models and accuracy evaluation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 14, No. 10, pp. 965-980, 1992.
[10] J. Heikkilä and O. Silvén, "A four-step camera calibration procedure with implicit image correction," in Proc. IEEE Conference on Computer Vision and Pattern Recognition, pp. 1106-1112, 1997.
[11] T. S. Newman and A. K. Jain, "A survey of automated visual inspection," Computer Vision and Image Understanding, Vol. 61, No. 2, pp. 231-262, 1995.
[12] W. R. Scott, G. Roth, and J.-F. Rivest, "View planning for automated three-dimensional object reconstruction and inspection," ACM Computing Surveys, Vol. 35, No. 1, pp. 64-96, 2003.
[13] H. D. Park and O. R. Mitchell, "CAD based planning and execution of inspection," in Proc. 1988 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 858-863, 1988.
[14] M. Marefat and R. L. Kashyap, "Automatic construction of process plans from solid model representations," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 22, No. 5, pp. 1097-1115, 1992.
[15] S. Sakane, M. Ishii, and M. Kakikura, "Occlusion avoidance of visual sensors based on a hand-eye action simulator system: HEAVEN," Advanced Robotics, Vol. 2, No. 2, pp. 149-165, 1987.
[16] C. K. Cowan and P. D. Kovesi, "Automatic sensor placement from vision task requirements," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 10, No. 3, pp. 407-416, 1988.
[17] K. Tarabanis, R. Tsai, and P. Allen, "The MVP sensor planning system for robotic vision tasks," IEEE Transactions on Robotics and Automation, Vol. 11, No. 1, pp. 72-85, 1995.
[18] W. Sheng, N. Xi, M. Song, and Y. Chen, "CAD-guided robot motion planning," Industrial Robot: An International Journal, pp. 143-151, March 2001.
[19] F. Prieto, T. Redarce, P. Boulanger, and R. Lepage, "CAD-based range sensor placement for optimum 3D data acquisition," in Proc. 2nd International Conference on 3-D Digital Imaging and Modeling, pp. 128-137, 1999.
[20] S. Y. Chen and Y. F. Li, "Automatic sensor placement for model-based robot vision," IEEE Transactions on Systems, Man, and Cybernetics, Part B, Vol. 34, No. 1, pp. 393-408, 2004.
