Abstract: This paper introduces an automated 3D optical measurement system. In industrial inspection, Coordinate Measurement Machines (CMMs) provide accurate measurement but are very time consuming because only one point can be acquired each time. An area sensor based robotic 3D measurement system can acquire an automotive part's surface patch-by-patch, which reduces the inspection time significantly. This paper describes a pixel-to-pixel sensor calibration scheme and a bounding box based sensor planning method, which are developed for the automatic optical measurement of large and complex surfaces. Experimental results are presented and discussed.

The work described in this paper is supported under NSF Grants IIS-9796300, IIS-9796287, EIA-9911077, DMI-0115355 and a Ford Motor Company university research project.
Q. Shi is with the Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48823, USA. shiquan@egr.msu.edu
N. Xi is with the Faculty of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48823, USA. xin@egr.msu.edu
Y. Chen is with the Scientific Research Lab, Ford Motor Company, Dearborn, MI, USA. ychen1@ford.com

I. INTRODUCTION

The current manufacturing industry needs a rapid product dimensional inspection process [1]. Instead of measuring an object point-by-point using a Coordinate Measurement Machine (CMM), people expect to measure an object's surface patch-by-patch using an area sensor. A lot of research has been done on 3D shape measurement [2]. But, to implement a rapid inspection system in an industrial application, the following five requirements need to be satisfied: fast, high accuracy, high precision, high point density, and inexpensive. Precision is defined as the minimum surface height difference that can be measured by an area sensor.

To reduce the time of an inspection process, vision based surface measurement methods attract a lot of attention. Varieties of 3D sensors, line or area scanners, using laser light or white light projection, have been developed. Usually, experienced technicians are required for controlling illumination conditions, setting the 3D sensor at different viewpoints, and re-calibrating the system. Therefore, an automated dimensional inspection system is essential to the manufacturing industry.

A robotic area sensing system is developed for dimensional measurement of industrial parts. The system is illustrated in Figure 1. We developed an area sensor prototype and an area sensor planner to implement this robotic area sensing system, which can generate 3D point clouds of a part automatically.

Fig. 1. A robotic area sensing system for dimensional inspection (area sensor model, CAD model, area sensor planner, measurement, 3D point cloud integration, error map generation)

The Gray Code and Line Shifting (GCLS) method can give high sensor precision and accuracy [3][4]. Precision specifies a limit that an area sensor can reach, and sensor accuracy specifies the ability of an area sensor to tell the truth of a part's 3D dimensions. An area sensor with high accuracy may not have high measurement precision.

Calibration of an area sensor has been discussed for years. Trobina [5] developed an error model for an active range sensor, in which the 3D sensor's model was developed based on camera perspective geometry and a maximum likelihood estimation method. A multiple-plane method [6] was developed to calibrate lens distortion of a projector for 3D range sensors. A relative error model [7] was proposed for calibration of the active triangulation sensing model. Recently, a self-calibration method [8] has been developed for an active 3D sensing system. The distance between correspondent points was also analyzed, and the rectification of stripe location was well formulated to calculate the distance between a pair of correspondent points.

The pinhole camera model has been widely used in area sensor calibration. In the area sensor triangulation model, all incident light beams are emitted from one point, the pupil of the projector lens, and all recorded light beams are considered to pass through the pupil of the camera lens; these two pupils form the end points of the baseline of the active sensing triangle. After calibration, all points on the image plane are calculated based on this triangular geometry. This triangulation model relies on a well calibrated lens. It is well known that the camera calibration process needs a nonlinear least squares estimation using Levenberg-Marquardt optimization. But, without a good initial parameter estimate, the optimization algorithm may fall into a local minimum and the calibration may fail [9][10]. Even if it is calibrated successfully, the calibration residue may not be tolerable for part inspection. Therefore, off-the-shelf lenses may not be usable for commercial 3D sensors. A pixel-to-pixel strategy has been developed for calibrating off-the-shelf lenses. Instead of calibrating the frames of a camera and a projector using the pinhole model, each pair of corresponding points is calibrated. Therefore, the area sensor parameters are no longer a group of constants, but a group of matrices corresponding to pairs of pixels. To get a 3D point cloud, the area sensor needs to be …
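As a sketch of this per-pixel parameterization, the relative height can be computed elementwise from parameter matrices d and S rather than from two scalar constants. The triangulation relation h = L·S/(d + L) used here is inferred from the paper's later error formula; the array shapes and the uniform parameter values are illustrative assumptions, not the prototype's actual calibration data.

```python
import numpy as np

# Per-pixel sensor parameters: d and S are matrices with one entry per
# corresponding pixel pair, not single constants (shapes are hypothetical).
H, W = 480, 640
d = np.full((H, W), 160.91)   # per-pixel parameter d (mm), value from the paper's simulation
S = np.full((H, W), 490.19)   # per-pixel parameter S (mm), value from the paper's simulation

def height_map(L):
    """Relative height per pixel from the measured stripe distance L,
    assuming the triangulation relation h = L*S/(d + L) elementwise."""
    return L * S / (d + L)

L = np.full((H, W), 12.0)     # measured stripe distances (mm), illustrative
h = height_map(L)
```

In a real system L would vary per pixel; the point of the pixel-to-pixel scheme is that d and S vary per pixel as well, absorbing lens distortion into the lookup tables.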
Sensor planning has been an active research area [11][12]. Because a prior CAD model embeds fidelity into the planning strategy, model-based sensor planning methods are more reliable than non-model-based methods for use in an inspection system. Knowledge based methods [13][14] require expert information. The generate-and-test approach can simplify sensor planning into a searching problem [15]. Cowan and Kovesi [16] made task constraints into searching functions and solved the task as a constraint-satisfaction problem. One limitation is that the solution space is confined because they considered fewer degrees of freedom on orientations. Tarabanis et al. [17] developed the MVP system, which integrates robot vision task constraints with CAD models and optical sensor models. Sheng et al. [18] developed a CAD-based robot motion planning system by combining the generate-and-test and constraint-satisfaction approaches. One problem in those systems is that sensors are usually considered as a single camera, or equivalent to a single camera model.

In applications of 3D sensing using the structured light method, the triangulation of the projector and camera makes the area sensor model different from a single camera model. Therefore, placing an area sensor is a different problem from placing a camera. To estimate area sensor viewpoints, task constraints for both the camera and the projector have to be satisfied. For planning 3D sensors, Prieto et al. [19] developed a 3D laser camera and a NURBS (Non-Uniform Rational B-Splines) based sensor planner; viewpoints are projected onto a 2D bitmap to solve the occlusion and collision problems.

Fig. 2. Measurement configuration of an area sensor using the structured light method (corresponding points A and C, distance LAC, relative height h above the reference plane)

… analysis is to find the corresponding points A and C, so that the distance LAC can be obtained. The surface relative height h can then be calculated. As reported in [4], the GCLS method has better performance in accuracy, precision, and point density in 3D shape measurement.

B. Calibration of a Digital Area Sensor

The positioning error of the area sensor needs to be calibrated. The frames of the camera and projector are not easy to align accurately, and distortion of an off-the-shelf lens will cause measurement errors. A pixel-to-pixel calibration method has been developed based on the corresponding points. As shown in Figure 3, for each pixel on the projector plane, there is a virtual correspondent point on the camera side. With this pixel-to-pixel method, d and S are no longer constants but matrices. Equation (2) shows the calculation of the relative height, in which each digitized surface point (xi, yi) has a pair of corresponding sensor parameters, d(x,y) and S(x,y).

Fig. 3. Pixel-to-pixel correspondence between the projector plane and the camera plane, with per-pixel parameters d(x1,y1), d(x2,y2)
… heights h and lengths L can be measured; then (d, S) can be obtained by this calibration equation:

    [ h1(i,j)   -L1(i,j) ] [ d(i,j) ]   [ -h1(i,j)·L1(i,j) ]
    [   ...        ...   ] [        ] = [        ...        ]    (3)
    [ hn(i,j)   -Ln(i,j) ] [ S(i,j) ]   [ -hn(i,j)·Ln(i,j) ]

Substituting an offset-affected stripe distance L' into the height relation gives the resulting depth measurement error:

    Δh(x,y) = h'(x,y) - h(x,y) = L'(x,y)·S(x,y) / (d(x,y) + L'(x,y)) - L(x,y)·S(x,y) / (d(x,y) + L(x,y))    (6)
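The per-pixel calibration can be sketched as a small least-squares fit: each gauge measurement k contributes one row h_k·d − L_k·S = −h_k·L_k, which is the linear rearrangement of the height relation h = L·S/(d + L). The code below is a sketch under that assumption; the gauge heights and the round-trip check use the sensor parameter values quoted in the paper's simulation.

```python
import numpy as np

def calibrate_pixel(h, L):
    """Least-squares fit of (d, S) for one pixel pair from n gauge
    measurements, using the rows h_k*d - L_k*S = -h_k*L_k obtained by
    rearranging h = L*S/(d + L)."""
    h = np.asarray(h, dtype=float)
    L = np.asarray(L, dtype=float)
    A = np.column_stack([h, -L])          # coefficient matrix of (d, S)
    b = -h * L                            # right-hand side
    (d, S), *_ = np.linalg.lstsq(A, b, rcond=None)
    return d, S

# Synthetic round-trip check with the parameter values quoted in the paper
d0, S0 = 160.91, 490.19                   # mm
h = np.array([20.0, 50.0, 80.0])          # gauge heights (mm)
L = h * d0 / (S0 - h)                     # exact forward model, L from h
d_est, S_est = calibrate_pixel(h, L)      # should recover d0, S0
```

With noise-free data the fit is exact; in practice more gauge heights per pixel average out measurement noise, which is why the calibration equation stacks n rows.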
Figure 3 only shows the offset of the area sensor in the plane perpendicular to the reference plane. An offset angle β also exists in a plane parallel to the reference plane, as can be seen in Figure 4. Since the projected stripes have the same code along one direction of the pattern, the corresponding point may fall at an arbitrary position. To eliminate this arbitrariness, the camera and the projector have to be aligned so that the sensor baseline is perpendicular to the projection stripes, which cannot be guaranteed in practice. Similar to the pixel-to-pixel calibration method, calibration of those offset angles needs to be done individually to eliminate the arbitrariness in finding correspondent points.

Fig. 4. Area sensor offset angle β in the XY plane (projector, camera at M, and correspondent points A and C on a gauge)

In Figure 4, point M is the desired camera position, which makes the baseline MP perpendicular to the projection stripe. For a point on the top of a gauge with equal height, point A and point C are the correspondent points for measuring …

The following two tables show simulated values of the measurement errors caused by different offset angles β; the sensor parameters are S = 490.19 mm, d = 160.91 mm, R = 0.165 mm/pixel. When h is constant, increasing β increases the measurement error significantly; when β is constant, the higher the surface is above the reference, the bigger the error obtained. The magnitude of these errors shows that it is necessary to calibrate the offset angle β.

TABLE I
MEASUREMENT ERROR OF PIXEL DISTANCE ΔL(x,y) (pixel)

             β1 = 2°   β2 = 4°   β3 = 6°   β4 = 8°
h1 = 20 mm    0.025     0.101     0.223     0.408
h2 = 50 mm    0.068     0.270     0.610     1.089
h3 = 80 mm    0.116     0.464     1.048     1.870

TABLE II
MEASUREMENT ERROR OF DEPTH Δh (mm)

             β1 = 2°   β2 = 4°   β3 = 6°   β4 = 8°
h1 = 20 mm    0.0117    0.0468    0.1057    0.1885
h2 = 50 mm    0.0274    0.1102    0.2472    0.4408
h3 = 80 mm    0.0408    0.1634    0.3684    0.6568

Calibration of β(x,y) can be done in the following two steps: 1) detect the group of correspondent points A and C, and 2) calculate the offset angle to the axis of the image plane.

Equation (2) only measures a height map, or 2.5D map, of a 3D surface. The task of area sensing for dimensional inspection requires a 3D point cloud, which gives not only depth but also length and width information of 3D surfaces. An exploding vector method has been developed to detect …
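The tabulated offset-angle errors can be reproduced with a short simulation. The paper does not state how β inflates the measured stripe distance; the assumption below, L' = L/cos β, is mine, but it reproduces the tabulated values to within rounding.

```python
import numpy as np

S, d, R = 490.19, 160.91, 0.165   # mm, mm, mm/pixel (values from the paper)

def offset_errors(h, beta_deg):
    """Simulated stripe-distance error (pixels) and depth error (mm) for a
    surface height h and offset angle beta, assuming L' = L/cos(beta)."""
    beta = np.radians(beta_deg)
    L = h * d / (S - h)            # ideal stripe distance, from h = L*S/(d+L)
    Lp = L / np.cos(beta)          # offset-affected stripe distance (assumption)
    dL_pixels = (Lp - L) / R       # error in pixels, cf. Table I
    dh = Lp * S / (d + Lp) - h     # depth error via the height relation, cf. Table II
    return dL_pixels, dh

dL, dh = offset_errors(20.0, 8.0)  # h = 20 mm, beta = 8 degrees
```

Evaluating the function over h in {20, 50, 80} mm and β in {2°, 4°, 6°, 8°} regenerates both tables, which supports the assumed error model.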
… situation, point C and point D have the same coordinate x2 but different image coordinates Px1 and Px2. A zooming phenomenon appears on the image plane when the object moves along the viewing direction of the camera. A 3D point, when flying towards the camera, forms an expansion track, which is called an exploding vector V. We utilize exploding vectors to calibrate the x, y coordinates with a given measured depth h(x,y). For independence, a group of exploding vectors was calibrated so that the x, y location of a 3D surface point can then be projected back to the reference plane.

Fig. 6. Calibration for X, Y coordinates using exploding vectors: (a) simulation result, (b) measured result

2) Field of view determines the size of the maximum inspection area. It is usually a rectangular field. In the developed 3D sensor planner, it is determined by the standoff and the camera viewing angles.
3) Resolution defines the entity's minimal dimension to be mapped onto one camera pixel.
4) Point density is a constraint determined by the field of view and the resolution of the projector; it is a new constraint developed for the automated 3D sensor planning system. The point density constraint ensures that enough points can be measured for a certain area of surface.
5) The focus constraint defines the farthest and the nearest measurement distances from a viewpoint.

A bounding box method has been developed to integrate all of those constraints for searching potential viewpoints that satisfy a tolerable measurement error, as shown in Figure 7.

Fig. 7. Bounding box with meshed CAD model
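One possible reading of the exploding vector correction is sketched below, under the assumption that the correction is linear in the measured depth h; the paper does not give the exact form, and the per-pixel vectors Vx, Vy and all values here are hypothetical.

```python
import numpy as np

H, W = 4, 5                               # tiny hypothetical image grid
Vx = np.full((H, W), 0.01)                # calibrated exploding vector components
Vy = np.full((H, W), -0.02)               # (illustrative values, one pair per pixel)
ys, xs = np.mgrid[0:H, 0:W].astype(float) # pixel coordinate grids

def to_reference_plane(h):
    """Project each pixel's (x, y) back to the reference plane by moving
    along its exploding vector in proportion to the measured depth h.
    The linear-in-h form is an assumption, not the paper's formula."""
    return xs - Vx * h, ys - Vy * h

xr, yr = to_reference_plane(np.full((H, W), 10.0))
```

Whatever the exact parametrization, the design intent from the text is clear: the per-pixel vectors are calibrated once, then each (x, y, h) measurement is mapped to true surface coordinates, turning the 2.5D height map into a 3D point cloud.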
Fig. 8. A recursive viewpoint searching algorithm (from tessellated CAD model to output viewpoint)

Fig. 10. Calibration of the transformation matrix from 2.5D height maps to 3D point clouds

Fig. 11. Measurement evaluation on a gauge (four slots were made on the gauge; from left to right, the depths are 32±1 µm, 24±1 µm, 14±2 µm, and 6±2 µm)
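The viewpoint constraints described for the planner can be combined into a single feasibility check per candidate viewpoint. Everything in this sketch (parameter names, functional forms, and threshold values) is an illustrative assumption, not the paper's implementation.

```python
import math

def viewpoint_ok(standoff, cam_half_angle_deg, cam_pixels, proj_pixels,
                 part_size, min_feature, min_density, focus_near, focus_far):
    """Return True when a candidate viewpoint satisfies all constraints
    (hypothetical forms of the field-of-view, resolution, point-density,
    and focus constraints)."""
    # field of view: width of the inspection area at this standoff
    fov = 2.0 * standoff * math.tan(math.radians(cam_half_angle_deg))
    # resolution: one camera pixel's footprint must resolve the smallest feature
    footprint = fov / cam_pixels
    # point density: projector pixels spread over the field area (points/mm^2)
    density = proj_pixels ** 2 / fov ** 2
    # focus: standoff must lie within the in-focus range
    return (fov >= part_size and footprint <= min_feature and
            density >= min_density and focus_near <= standoff <= focus_far)

ok_near = viewpoint_ok(600.0, 15.0, 1024, 1024, 200.0, 0.5, 4.0, 400.0, 800.0)
ok_far = viewpoint_ok(1000.0, 15.0, 1024, 1024, 200.0, 0.5, 4.0, 400.0, 800.0)
```

A generate-and-test planner, as described in the text, would call such a predicate on candidate viewpoints sampled around the bounding box of the CAD model and keep the feasible ones.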
1561
Fig. 12. Dimensional measurement of a door panel part using the automated area sensing system

Fig. 13. 3D shape measurement of a surface with a small depth range

Fig. 14. 3D shape measurement of a surface with a big depth range

… for optical dimensional measurement of a door panel in the automotive industry. Figure 13 and Figure 14 demonstrate the measured point clouds.

IV. CONCLUSION

Methods for implementing an automated dimensional optical measurement system have been presented in this paper. A prototype of an area sensor has been developed, and a series of calibration methods for a robotic area sensing system have been described. A pixel-to-pixel calibration strategy can effectively obtain the sensor positioning errors, which exist not only in a plane perpendicular to the reference but also in a plane parallel to the reference. To transform a height map into a 3D point cloud, an exploding vector method has been developed to calibrate the transformation using the measured depth information. A modified bounding box method has been developed for planning the area sensor; the constraints of both the camera and the projector have been integrated using this bounding box method. An automated 3D surface measurement system was then implemented on a PUMA robot. Experimental results show the successful testing of such a system on measuring an industrial part. The developed methods are not limited to measuring automotive parts and can be extended to many areas which require fast and accurate dimensional measurement of 3D surfaces.

REFERENCES

[1] F. Chen, G. M. Brown, and M. Song, "Overview of three-dimensional shape measurement using optical methods," Optical Engineering, Vol. 39, No. 1, pp. 10-22, 2000.
[2] R. Jarvis, "Range sensing for computer vision," Advances in Image Communications, Elsevier Science Publishers, Amsterdam, pp. 17-56, 1993.
[3] J. Salvi, J. Pagès, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recognition, Vol. 37, No. 4, pp. 827-849, 2004.
[4] J. Gühring, "Dense 3-D surface acquisition by structured light using off-the-shelf components," Videometrics and Optical Methods for 3D Shape Measurement, SPIE (4309), pp. 220-231, 2001.
[5] M. Trobina, "Error model of a coded-light range sensor," BIWI-TR-164, Sept. 21, 1995.
[6] T. S. Shen and C. H. Meng, "Digital projector calibration for 3-D active vision systems," Transactions of the ASME, Vol. 124, pp. 126-134, Feb. 2002.
[7] G. Sansoni, M. Carocci, and T. Rodella, "Calibration and performance evaluation of a 3-D imaging sensor based on the projection of structured light," IEEE Transactions on Instrumentation and Measurement, Vol. 49, No. 3, pp. 628-636, 2000.
[8] Y. F. Li and S. Y. Chen, "Automatic recalibration of an active structured light vision system," IEEE Transactions on Robotics and Automation, Vol. 19, No. 2, pp. 259-268, 2003.
[9] J. Weng, P. Cohen, and M. Herniou, "Camera calibration with distortion models and accuracy evaluation," IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-14(10), pp. 965-980, 1992.
[10] J. Heikkilä and O. Silvén, "A four-step camera calibration procedure with implicit image correction," in Proc. of IEEE Computer Vision and Pattern Recognition, pp. 1106-1112, 1997.
[11] T. S. Newman and A. K. Jain, "A survey of automated visual inspection," Computer Vision and Image Understanding, Vol. 61, No. 2, pp. 231-262, 1995.
[12] W. R. Scott, G. Roth, and J. F. Rivest, "View planning for automated three-dimensional object reconstruction and inspection," ACM Computing Surveys, Vol. 35, No. 1, pp. 64-96, 2003.
[13] H. D. Park and O. R. Mitchell, "CAD based planning and execution of inspection," in Proceedings of the 1988 Computer Society Conference on Computer Vision and Pattern Recognition, pp. 858-863, 1988.
[14] M. Marefat and R. L. Kashyap, "Automatic construction of process plans from solid model representations," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 22, No. 5, pp. 1097-1115, 1992.
[15] S. Sakane, M. Ishii, and M. Kakikura, "Occlusion avoidance of visual sensors based on a hand-eye action simulator system: HEAVEN," Advanced Robotics, Vol. 2, No. 2, pp. 149-165, 1987.
[16] C. K. Cowan and P. D. Kovesi, "Automatic sensor placement from vision task requirements," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 10, No. 3, pp. 407-416, 1988.
[17] K. Tarabanis, R. Tsai, and P. Allen, "The MVP sensor planning system for robotic vision tasks," IEEE Transactions on Robotics and Automation, Vol. 11, No. 1, pp. 72-85, 1995.
[18] W. Sheng, N. Xi, M. Song, and Y. Chen, "CAD-guided robot motion planning," International Journal of Industrial Robot, pp. 143-151, March 2001.
[19] F. Prieto, T. Redarce, P. Boulanger, and R. Lepage, "CAD-based range sensor placement for optimum 3D data acquisition," in Proceedings of the 2nd International Conference on 3-D Digital Imaging and Modeling, pp. 128-137, 1999.
[20] S. Y. Chen and Y. F. Li, "Automatic sensor placement for model-based robot vision," IEEE Transactions on Systems, Man and Cybernetics, Part B, Vol. 34, No. 1, pp. 393-408, 2004.