
Automated Imaging Association

Robotic Industries Association

International Conference
for Vision Guided Robotics

Proceedings

September 30 – October 2, 2008


Sheraton Detroit Novi
Novi, Michigan USA
Welcome to the International Conference for Vision Guided Robotics!
In today’s economy, the need for vision guided robotics technologies is greater than ever.
The Robotic Industries Association (RIA) and the Automated Imaging Association (AIA)
bring you this joint conference to help you apply these technologies to boost
productivity, reduce costs, and increase quality.

We hope you will find great ideas from the presenters as well as your fellow attendees.
And, we encourage you to meet with the vendors in the tabletop exhibit area who can
offer you products that meet your specific needs.

In addition to this conference, RIA and AIA offer a host of valuable resources that can
help you when you return to your company. We recommend visiting our websites
(www.robotics.org and www.machinevisiononline.org) to find free technical papers, case
studies, and information on upcoming events such as The Vision Show (end of March
2009 in Phoenix, Arizona) and the International Robots, Vision & Motion Control Show
(June 2009 in Chicago, Illinois).

Your feedback is very important to us, so please take the time to complete your
evaluation form and submit it to us onsite (or send it in to our office after the conference).
If you prefer, you can always talk to our staff in person, either here this week or by
calling 734/994-6088 to share your ideas.

Thanks so much for coming, and enjoy the conference!

Sincerely,

Jeffrey A. Burnstein
Executive Vice President
Robotic Industries Association

Dana Whalls
Managing Director
Automated Imaging Association
© 2008 ALL RIGHTS RESERVED

The contents of this book may not be copied or further disseminated
without the written approval of the Robotic Industries Association, the
Automated Imaging Association, or the individual authors.

DISCLAIMER
The papers presented in this Proceedings book are the personal
expressions and positions of the respective authors and presenters. These
views are not those of the Association, nor are they necessarily endorsed
by the Association or its members. This conference was presented by the
Association to allow robot and vision topics to be openly discussed and
diverse views disseminated. Comments about the contents will be
forwarded to the authors by the Association.

900 Victors Way, Suite 140, Ann Arbor, Michigan 48108


Telephone: 1-734-994-6088 Fax: 1-734-994-3338
Table of Contents
2008 - 2009 Events ......................................................................................................... 3

About Robotic Industries Association (RIA)..................................................................... 4

About Automated Imaging Association (AIA) .................................................................. 4

Exhibitor Index ................................................................................................................ 5

Exhibitor Listings ............................................................................................................. 6

Conference Speakers.................................................................................................... 10

ICVGR Conference Agenda .......................................................................................... 13

Corporate Sponsors: ..................................................................................................... 16

Media Sponsors: ........................................................................................................... 17


Automation Technologies Council

2008 - 2009 Events


Mark Your Calendar!

National Robot Safety Conference October 6-9, 2008


Indianapolis Marriott East Indianapolis, Indiana USA

AIA Networking Reception November 5, 2008


VISION 2008 Stuttgart, Germany

16th Annual Robotics Industry Forum November 5-7, 2008


Portofino Bay Hotel Orlando, Florida USA

17th Annual AIA Business Conference February 4-6, 2009


Marriott Coronado Island Resort Coronado, California USA

AIA International Pavilion March 24 – 26, 2009


Shanghai Exhibition Center Shanghai, China

The Vision Show March 31 – April 2, 2009


Phoenix Convention Center Phoenix, Arizona USA

International Robots, Vision & Motion Control Show June 9 – 11, 2009
Donald E. Stephens Convention Center Rosemont (Chicago), Illinois USA

For full details on these or other events, visit


www.Robotics.org ~ or ~ www.MachineVisionOnline.org
or call 1-734-994-6088
About Robotic Industries Association (RIA)
Robotic Industries Association (RIA) is the only trade association in North America
organized specifically to serve the field of robotics. Founded in 1974, RIA is dedicated
to the exchange of technical and trade-related information between robot
manufacturers, distributors, corporate users, accessory equipment and systems
suppliers, consultants, research groups and international organizations. RIA is the
common ground where these groups can come together to discuss challenges and
solutions dealing with the implementation of robotic technology. Some 285 companies
are members of RIA. Members receive many benefits, including discounts on RIA
workshops, conferences and resources.

About Automated Imaging Association (AIA)


Founded in 1984, AIA was organized specifically to promote the global use of image
capture and analysis technology and now represents more than 300 machine vision
suppliers, system integrators, users, researchers, and consulting firms from 27 nations.
AIA sponsors many educational conferences and workshops, including the
International Robots, Vision, & Motion Control Show, The Vision Show and the annual
AIA Business Conference. AIA also produces an annual Machine Vision Market Study.
Be sure to visit Machine Vision Online (www.machinevisiononline.org), the world’s
leading resource for machine vision information on the internet.
Exhibitor Index
AIA/RIA Standards Activities........................................................................................... 5
Basler Vision Technologies ............................................................................................. 5
CCS America, Inc............................................................................................................ 6
Cognex Corporation ........................................................................................................ 6
Components Express, Inc. .............................................................................................. 6
DENSO Robotics............................................................................................................. 6
Dunkley International, Inc................................................................................................ 6
EPSON Robots ............................................................................................................... 6
FANUC Robotics America, Inc. ....................................................................................... 6
Hitachi Kokusai Electric America Ltd............................................................................... 6
HTE, Inc. ......................................................................................................................... 6
ISRA VISION SYSTEMS, Inc.......................................................................................... 6
Item North America ......................................................................................................... 6
KUKA Robotics ............................................................................................................... 7
LEONI Engineering Products & Services, Inc. ................................................................ 7
LMI Technologies Inc. ..................................................................................................... 7
Matrox Imaging ............................................................................................................... 7
Motoman, Inc. ................................................................................................................. 7
Multi-Contact USA........................................................................................................... 7
MVTec, LLC .................................................................................................................... 7
Nachi Robotic Systems Inc. ............................................................................................ 7
Northwire, Inc. ................................................................................................................. 7
Pilz Automation Safety L.P.............................................................................................. 7
Radix Controls Inc. .......................................................................................................... 8
Schneider Optics, Inc. ..................................................................................................... 8
SICK, Inc. ........................................................................................................................ 8
Stäubli Robotics .............................................................................................................. 8
StockerYale, Inc. ............................................................................................................. 8
Tectivity, Inc. ................................................................................................................... 8
United Sales & Services Inc. ........................................................................................... 8
Valentine Robotics, Inc.................................................................................................... 8
VMT-Pepperl + Fuchs, Inc............................................................................................... 8
WireCrafters LLC............................................................................................................. 9
Exhibitor Listings

AIA/RIA Standards Activities
Automated Imaging Association
Robotic Industries Association
900 Victors Way
Suite 140
Ann Arbor, Michigan 48108
Phone: 734-994-6088
Fax: 734-994-3338
Email: jfryman@robotics.org
Web: www.Robotics.org
www.MachineVisionOnline.org
Contact: Jeff Fryman
The Robotic Industries Association is an Accredited Standards Developer responsible for Industrial Robot Standards in the United States. Working with ISO and ANSI, the RIA sponsors the R15 series of standards, most notably the R15.06 Robot Safety standard and the National Adoption of ISO 10218-1. RIA hosts annual robot safety events throughout North America. The Automated Imaging Association is the world leader in sponsoring interoperability standards for digital machine vision applications. The Camera Link® and GigE Vision™ standards are renowned for their ability to allow integration of total vision systems using the best solution of components available from multiple suppliers.

Basler Vision Technologies
855 Springdale Drive
Suite 610
Exton, Pennsylvania 19341
Phone: 610-280-0171
Fax: 610-280-7608
Email: tim.coggins@baslerweb.com
Web: www.baslerweb.com
Contact: Tim Coggins
Basler Vision Technologies specializes in state-of-the-art digital camera solutions for a wide variety of demanding vision applications. Over 20 years of industry expertise and product development are evident in Basler's extensive application knowledge and broad product offering. Area scan and line scan cameras utilize both CCD & CMOS sensors, and FireWire, Gigabit Ethernet, and Camera Link interface technologies.

CCS America, Inc.
5 Burlington Woods
Burlington, Massachusetts 01803
Phone: 781-272-6900
Fax: 781-272-6902
Email: sales@ccsamerica.com
Web: www.ccsamerica.com
Contact: Barbara Gagnon
CCS is a global manufacturer of LED lighting for machine vision. Due to its quality and advanced lighting technologies, CCS Inc. is the #1 supplier for vision systems in the world and has the largest market share in Japan. CCS has more than 300 different types of standard products in stock and has designed over 2,000 custom lighting solutions to date. The latest products include our new high-power ring and dome lights.

Cognex Corporation
One Vision Drive
Natick, Massachusetts 01760
Phone: 508-650-3000
Fax: 508-650-3344
Email: mktg@cognex.com
Web: www.cognex.com
Contact: John Keating
Cognex Corporation is the world's leading provider of vision systems, vision software, and vision sensors used in manufacturing automation. Cognex is also a leader in industrial ID readers.

Components Express, Inc.
10330 Argonne Woods Drive
Suite 100
Woodridge, Illinois 60517
Phone: 630-257-0605
Fax: 630-257-0603
Email: rberst@componentsexpress.com
Web: www.componentsexpress.com
Contact: Ray Berst
Machine vision cables, Camera Link® cables, cables for GigE Vision™, FireWire cables, analog cables, SCSI cables, internal ribbon cables, camera enclosures, transformers.

DENSO Robotics
3900 Via Oro Avenue
Long Beach, California 90810
Phone: 888-476-2689
Fax: 310-952-7502
Email: info@densorobotics.com
Web: www.densorobotics.com
Contact: Greg Johnson
DENSO offers a wide range of compact, four-axis SCARA and five- and six-axis articulated robots, for payloads up to 20 kg and reaches from 350 to 1,300 mm. Repeatability is to within ±0.015 mm. Standard, dust- and mistproof, and cleanroom models are available. ANSI and CE compliance enables global deployment. UL-listed models are available for both the US and Canada. Easy-to-use programming and 3-D offline simulation software, controllers and teaching pendants are also offered.

Dunkley International, Inc.
1910 Lake Street
Kalamazoo, Michigan 49001
Phone: 269-343-5583
Fax: 269-343-5614
Email: pcallan@dunkleyintl.com
Web: www.dunkleymachinevision.com
Contact: Pat Callan
Dunkley International is a supplier of turnkey vision systems. Currently we have systems ranging from high-speed fruit and vegetable inspection to final inspection of heavy-duty truck transmissions. While much of our business is geared toward very high volume systems, we also manufacture many custom one-of-a-kind systems. Whether you need a complete vision and robotic cell or a vision system added to your existing line, we can help.

EPSON Robots
18300 Central Avenue
Carson, California 90746
Phone: 562-290-5958
Fax: 562-290-5999
Email: rick_brookshire@ea.epson.com
Web: www.robots.epson.com
Contact: Rick Brookshire
EPSON Robots is the global leader in PC-controlled precision factory automation, with a product line of hundreds of easy-to-use SCARA, Cartesian and 6-axis robots.

FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309
Phone: 800-IQ-ROBOT
Fax: 248-276-4227
Email: marketing@fanucrobotics.com
Web: www.fanucrobotics.com
Contact: Ed Roney
FANUC Robotics America, Inc. is the leading supplier of industrial robots and robotic systems. Over 200,000 robots are installed worldwide, and more than 200 robot variations are available to work in a wide range of applications. The combination of the world's most reliable robots, process expertise, support services, regional locations and a network of system integrators provides manufacturers in virtually every industry the tools they need to reduce costs, improve quality, maximize productivity, and increase their competitive position in the global market.
Hitachi Kokusai Electric America Ltd.
150 Crossways Park Drive
Woodbury, New York 11797
Phone: 817-490-5124
Fax: 817-490-6116
Email: phyllis.vela@hitachikokusai.com
Web: www.hitachikokusai.us
Contact: Phyllis Vela
Hitachi Kokusai Electric America Ltd. manufactures and sells miniaturized high-speed, high-resolution analog and digital cameras. Cameras are available as monochrome or color. Outputs include Camera Link®, FireWire 1394A & 1394B, & GigE Vision™ interfaces. Come see our new GigE camera line offerings.

HTE, Inc.
1100 Opdyke
Auburn Hills, Michigan 48326
Phone: 248-371-1918
Fax: 248-371-2185
Email: dreed@hte.net
Web: www.hte.net
Contact: Daniel Reed
HTE is an application engineering distributor providing hardware and software products specializing in track and trace, error proofing, direct part marking, machine vision and plant floor data collection. Now offering 2D and 3D vision guided robotic solutions based on Shafi Reliabot and Siemens vision.

ISRA VISION SYSTEMS, Inc.
3350 Pine Tree Road
Lansing, Michigan 48911
Phone: 517-887-8878
Fax: 517-887-8444
Email: info.usa@isravision.com
Web: www.isravision.com
Contact: Diane Rizer
ISRA VISION SYSTEMS develops flexible turnkey machine vision solutions for industrial applications. ISRA specializes in 2D and 3D robot guidance, web inspection, bead inspection, and assembly inspection. Years of experience in machine vision, robotic technology and industrial automation provide cost-effective, integrated solutions, fully installed and performance guaranteed.

item North America
925 Glaser Parkway
Akron, Ohio 44306
Contact: Rick Fascione
All manufacturing, assembly or automation processes require a robust sub-structure, base or platform as the starting point for design, development and implementation. item North America provides this sub-structure utilizing structural aluminum and modular components to replace welded steel with a more efficient, flexible and visually appealing alternative. Machine bases, sub-structures, frames, safety, laser and robotic enclosures are easily designed with the support of item's in-house engineering group.

KUKA Robotics
22500 Key Drive
Clinton Township, Michigan 48036
Phone: 866-USE-KUKA
Fax: 866-FAX-KUKA
Email: rebeccamarkel@kukarobotics.com
Web: www.kukarobotics.com
Contact: Rebecca Markel
KUKA Robotics offers a broad range of highly modular robots, covering all common payload categories from 3 kg to 1,000 kg. Over two-thirds of the 75,000 KUKA robots installed in the field use our open-architecture PC-based controller, making KUKA the number one PC-controlled robot manufacturer in the world. KUKA controllers are also available for integration with other components of your automation systems. Other products include SoftPLC, Remote Service, KUKA SIM simulation software, Networking Services and a variety of dress packages. In addition, our Systems Partners, experts in their respective industries, offer key technologies that transform the KUKA robot into an application-specific solution. Our advanced KUKA College enables fast learning through flexible training systems that simulate a variety of real-world applications. KUKA Robotics offers a 24-hour service hotline as well as engineering services.

LEONI Engineering Products & Services, Inc.
2505 Industrial Row Drive
Troy, Michigan 48084
Phone: 248-655-1900
Fax: 248-655-1905
Email: chris.miller@leoni.com
Web: www.leoni-robotic-solutions.com
Contact: Chris Miller
Tailor-made robotic cable and cable management solutions, providing global field service and project management.

LMI Technologies Inc.
1673 Clivedon Avenue
Delta, British Columbia V3M 6V5
Canada
Phone: 604-636-1011
Fax: 604-516-8368
Email: info@lmitechnologies.com
Web: www.lmitechnologies.com
Contact: Dan Howe
LMI Technologies Inc. is a research and manufacturing organization specializing in machine vision applied technologies. The LMI brands include FireSync, Sensors That See, HexSight, and maestro.

Matrox Imaging
1055 St. Regis Boulevard
Dorval, Quebec H9P 2T4
Canada
Phone: 514-822-6000, x2438
Fax: 514-822-6298
Email: bruno.parent@matrox.com
Web: www.matroximaging.com
Contact: Bruno Parent
Matrox Imaging is a leading provider of component-level solutions to OEMs and integrators involved in various manufacturing sectors. Products include cameras, interface boards and processing platforms, all designed to provide optimum price-performance within a common software environment. Matrox Imaging offers a comprehensive collection of software tools for calibrating (2D and 3D), enhancing and transforming images, locating objects, extracting and measuring features, reading character strings, and decoding and verifying identification marks.

Motoman, Inc.
805 Liberty Lane
West Carrollton, Ohio 45449
Phone: 937-847-6200
Fax: 937-847-6277
Email: info@motoman.com
Web: www.motoman.com
Contact: Greg Garmann
High-speed, high-performance Motoman robots feature payloads from 3 to 500 kg and are available with integrated vision capability to facilitate multi-processing in a wide range of applications, including arc welding, assembly, coating, dispensing, material cutting, material handling, material removal, and spot welding. Integrated vision is used for part finding, robot guidance, identification, and inspection.

Multi-Contact USA
5560 Skylane Boulevard
Santa Rosa, California 95403
Phone: 440-243-4929
Fax: 440-243-6628
Email: d.rababy@multi-contact.com
Web: www.multi-contact.usa.com
Contact: Dave Rababy
Multi-Contact is a world-class provider of industrial robotic cable connectors up to 250 amp capacity. Our Multilam technology has virtually unlimited applications due to our design flexibility. Our connectors offer both standard and custom-designed solutions for a wide and diverse spectrum of applications. Multi-Contact can provide reliable and cost-effective solutions for your interconnection requirements. The robotic line of connectors is small in size and high in performance.
MVTec, LLC
One Broadway, Fl 14
Cambridge, Massachusetts 02142
Phone: 617-401-2112
Fax: 617-401-3617
Email: eisele@mvtec.com
Web: www.mvtec.com
Contact: Heiko Eisele
MVTec provides standard software for machine vision applications, including algorithms for 2D and 3D vision-based robotics guidance.

Nachi Robotic Systems Inc.
22285 Roethel Drive
Novi, Michigan 48375
Phone: 248-305-6542
Fax: 248-605-6542
Email: marketing@nachirobotics.com
Web: www.nachirobotics.com
Contact: Karen Lewis
Nachi Robotic Systems Inc. provides successful robotic solutions for several applications, including spot welding, arc welding, sealing, dispensing, material handling, machine loading and unloading, buffing, palletizing, assembly, roller hemming, die-casting, deburring, and press-to-press handling. Nachi robots can handle load capacities from 5 to 700 kg. Nachi is a full-service supplier and certified to ISO 9001:2000.

Northwire, Inc.
110 Prospect Way
Osceola, Wisconsin 54020
Phone: 715-294-2121
Fax: 715-294-3727
Email: cableinfo@northwire.com
Web: www.northwire.com
Contact: Ken Anderson
Northwire Endurance™ Vision assemblies are the most rugged assemblies for vision system applications. Northwire has been producing high-quality industrial-grade cable for over 36 years, and that standard of quality has gone into our vision cable assemblies. The high-quality connectors and Northwire's advanced, industrial-grade cables provide ultra-reliable interconnectivity in motion and vision system applications, which include CCXC Analog Video, MVC-800 FireWire, GEV-1000™ GigE Vision™ and Camera Link® cable assemblies.

Pilz Automation Safety L.P.
7150 Commerce Boulevard
Canton, Michigan 48187
Phone: 734-354-0275
Fax: 734-354-3355
Email: info@pilzusa.com
Web: www.pilz.com
Contact: Customer Service
Pilz Automation Safety L.P. manufactures and offers a complete line of safe automation solutions and control products. The line includes safety relays for automation applications, safety and general-purpose PLCs, lockout/tagout systems utilizing safety controls, motion control systems, monitoring relays, touch-screen HMIs, e-stop pushbuttons, safety sensors, two-hand enabling devices and light curtains. Certified engineers and qualified consultants are available to design systems, manage projects, perform risk assessments, perform machine/plant reviews, install equipment and train personnel.

Radix Controls Inc.
2105 Fasan Drive
Oldcastle, Ontario N0R 1L0
Canada
Phone: 519-737-1012
Fax: 519-737-1810
Email: info@radixcontrols.com
Web: www.radixcontrols.com
Contact: Ross Rawlings
Radix Controls Inc. has been providing North American manufacturers with the high-tech tools they need to keep their production lines competitive for over 15 years. Our vision experts specialize in vision inspection design and integration in the automotive, food & beverage, pharmaceutical and packaging markets. We also recently won the Product Innovation Award from the Windsor Chamber of Commerce for one of our proprietary vision products, Tool Tracker.

Schneider Optics, Inc.
285 Oser Avenue
Hauppauge, New York 11788
Phone: 631-761-5000, x204
Fax: 631-761-5090
Email: industrial@schneideroptics.com
Web: www.schneideroptics.com
Contact: Stuart Singer
Schneider Optics designs, develops, and manufactures high-performance lenses for machine vision, robotics, document scanning, industrial inspection and metrology, gauging, military, surveillance, and other image processing applications. Standard products include compact C-mount lenses, bilateral telecentric lenses, a modular macro system, large-format lenses (area & line scan), 3-CCD lenses and industrial filters. Custom lens solutions are also available. Key markets include machine vision, robotics, document scanning, industrial inspection, 2D/3D metrology, surveillance, and hyperspectral imaging.

SICK, Inc.
6900 W 110 Street
Minneapolis, Minnesota 55438
Phone: 952-941-6780
Fax: 952-941-9287
Email: brian.mcmorris@sick.com
Web: www.sickusa.com
Contact: Brian McMorris
Whether safeguarding robot assembly areas or inspecting finished product, companies can count on SICK for innovative products and top-notch expertise to deliver a wide range of application solutions. Products from SICK initiate, inspect, confirm, monitor, and safeguard the movement of product in industries that use robotics for automation. With the customer as our focus and innovation as our guide, SICK is equipped to deliver unique and superior products to the robotics industry.

Stäubli Robotics
201 Parkway West
Duncan, South Carolina 29334
Phone: 864-486-1980
Fax: 864-486-5497
Email: d.arceneaux@staubli.com
Web: www.staubli.com
Contact: David Arceneaux
Stäubli is a mechatronics solution provider with three dedicated divisions: textile machinery, connectors and robotics. Founded in 1892, Stäubli is known worldwide for the quality of its methods and processes. Featuring high productivity and precision, Stäubli robots offer solutions for all industries. The comprehensive product range includes small 4-axis SCARA as well as 6-axis medium to heavy-duty robots with payloads ranging from 1 kg to 250 kg, featuring superior quality and performance.

StockerYale, Inc.
275 Kesmark
Montreal, Quebec H3M 1R2
Canada
Phone: 514-685-1005
Fax: 514-685-3307
Email: lasers@stockeryale.com
Web: www.stockeryale.com
Contact: Customer Service
StockerYale, Inc. is an independent designer and manufacturer of structured-light lasers, LED modules and fluorescent illumination products, as well as phase masks and specialty optical fibers, for use in a wide range of markets and industries including machine vision, industrial inspection, telecommunications, military, utilities, and medical.

Tectivity, Inc.
3099 Tall Timbers
Milford, Michigan 48380
Phone: 248-676-9797
Fax: 248-676-9796
Email: info@tectivity.com
Web: www.tectivity.com
Contact: Jon Heywood
Manufacturer of the VideoModule, LED-Module, and Laser-Module family of protective enclosures for robot applications. Also a distributor of lighting, lensing, CCD cameras, filters, cables, and related accessories.
United Sales & Services Inc.
32549 Schoolcraft Road
Livonia, Michigan 48150
Phone: 734-522-8100
Fax: 734-522-0818
Email: rweber@ussvision.com
Web: www.ussvision.com
Contact: Ron Weber
USS United Sales & Services Inc. was established in 1990 and has become the largest, most diverse total turnkey integrator in North America. USS is the largest integrator of DVT/Cognex products. Our success has enabled us to obtain a global blanket with General Motors, providing total turnkey vision error-proofing solutions. USS has developed with Cognex the USS Exact Scan, which is capable of reading an entire vehicle for errors at end of assembly. USS has also developed the USS Tracker in conjunction with Shafi technologies, which mounts on the end of any robot and reads the entire bead in real time. These are a few of the capabilities we provide to all of our clients in the most efficient and cost-effective manner.

WireCrafters LLC
6208 Strawberry Lane
Louisville, Kentucky 40214
Phone: 800-626-1816
Fax: 502-361-3857
Email: bsemones@wirecrafters.com
Web: www.wirecrafters.com
Contact: Butch Semones
Supplier of physical barriers for robotic work cells, along with value-adds such as weld curtains and interlocks.

Valentine Robotics, Inc.


36625 Metro Court
Sterling Heights, Michigan 48312
Phone: 586-979-9900
Fax: 586-979-9901
Email: andy@valentinerobotics.com
Web: www.valentinerobotics.com
Contact: Andrew Valentine
Valentine Robotics is the North American
distributor of Scorpion Vision robot guidance
and machine vision software. We deliver
turnkey robot and vision systems for all
application types. We offer machine vision
software, components, kits, studies and
integration. Visit www.valentinerobotics.com for free trial software and
integrator opportunities!

VMT-Pepperl + Fuchs, Inc.


3600 Green Court
Suite 490
Ann Arbor, Michigan 48105
Phone: 269-823-4650
Fax: 330-486-0288
Email: vmt-info@us.pepperl-fuchs.com
Web: www.vmt-gmbh.com
Contact: Todd Belt
The Pepperl+Fuchs VMT group has over 20 years
of success in applying complete turnkey systems
for industrial image processing applications.
System solutions are based on self-developed
software products adaptable to clients’ specific
needs. With an easy-to-use test and calibration
process along with multiple redundancies, we can
customize a solution to increase safety, improve
quality, speed up production, and reduce costs.
Conference Speakers
Mr. Robert Anderson
New Technology Manager
Advanced Manufacturing Engineering
Chrysler LLC
800 Chrysler Drive, CIMS 482-04-16
Auburn Hills, Michigan 48326
Phone: 248-944-6076
Fax: 248-841-6272
Email: ra2@chrysler.com

Mr. Babak Habibi
President & CTO
Braintech Inc.
102 - 930 West 1st Street
North Vancouver, British Columbia V7P 3N4
Canada
Phone: 604-988-6440
Fax: 604-986-6131
Email: bhabibi@braintech.com

Mr. David Arceneaux
Business Development – Marketing Manager
Stäubli Corporation – Robotics Division
201 Parkway West
PO Box 189
Duncan, South Carolina 29334
Phone: 864-486-5416
Fax: 864-486-5497
Email: d.arceneaux@staubli.com

Mr. Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2000
Fax: 248-409-2027
Email: ehershberger@appliedmfg.com
Mr. David Dechow
President
Aptúra Machine Vision Solutions
3130 Sovereign Drive
Suite 5A
Lansing, Michigan 48911
Phone: (517) 272-7820, x11
Fax: (866) 575-1583
Email: ddechow@apturavision.com

Mr. John Keating
Product Marketing Manager
Cognex Corporation
1 Vision Drive
Natick, Massachusetts 01760
Phone: 508-650-3000
Fax: 508-650-3338
Email: john.keating@cognex.com

Mr. René Dencker Eriksen
Chief Technology Officer
Scape Technologies
Kochsgade 31 C, 3. sal
DK-5000 Odense C
Denmark
Phone: +45 70 25 31 13
Fax: +45 70 25 31 14
Email: rde@scapetechnologies.com

Mr. Jens Kuehnle
Research Associate
Fraunhofer Institute for Manufacturing Engineering and Automation (IPA)
Nobelstrasse 12
70569 Stuttgart
Germany
Phone: +49 711 970 1861
Fax: +49 711 970 1004
Email: kuehnle@ipa.fraunhofer.de
Mr. Greg Garmann
Software & Controls Technology Leader
Motoman, Inc.
1050 Dorset Road
Troy, Ohio 45373
Phone: 937-440-2668
Fax: 937-440-2626
Email: greg.garmann@motoman.com

Mr. Jerry Lane
Great Lakes Office Director
Applied Research Associates
48320 Harbor Drive
Chesterfield Township, Michigan 48047
Phone: 586-242-7778
Fax: 802-728-9871
Email: glane@ara.com
Mr. Eric Lewis Mr. Bob Rochelle
President North American Sales Manager
Flexomation, LLC Kawasaki Robotics (USA), Inc.
586 Northland Boulevard 28140 Lakeview Drive
Cincinnati, Ohio 45240 Wixom, Michigan 48393
Phone: 513-825-0555 Phone: (248) 446-4211
Fax: 513-825-1870 Fax: (248) 446-4200
Email: info@flexomation.com Email: bob.rochelle@kri-us.com

Mr. Frank Maslar Mr. Adil Shafi


Technical Specialist President
Ford Motor Company SHAFI Innovation, Inc.
6100 Mercury Drive 8060 Kensington Court
Dearborn, Michigan 48239 Brighton, MI 48116-8520
Phone: 313-805-3904 Phone: (248) 446-8200
Email: fmaslar@ford.com Fax: (248) 446-8282
Email: adil.shafi@shafiinc.com
Mr. Michael Muldoon
Business Solutions Engineer Ms. Jane Shi
AV&R Vision & Robotics Inc. Senior Research Scientist
(Averna Vision & Robotics General Motors Corporation
269 Rue Prince 30500 Mound Road
Montreal, Quebec H3C 2N4 MC 480-106-359
Canada Warren, MI 48090-9040
Phone: 514-788-1420 Phone: 586-986-0353
Fax: 514-866-5830 Fax: 586-986-0574
Email: michael.muldoon@avr-vr.com Email: jane.shi@gm.com

Mr. Mark Noschang Mr. Kevin Taylor


Manager of Applications Engineering Vice President
for North America ISRA VISION SYSTEMS, Inc.
Adept Technology, Inc. 3350 Pine Tree Road
11133 Kenwood Road Lansing, Michigan 48911
Cincinnati, Ohio 45242 Phone: 517-887-8878
Phone: 513-792-0266, x106 Fax: 517-887-8444
Fax: 513-792-0274 Email: ktaylor@isravision.com
Email: mark.noschang@adept.com
Mr. James Wells
Mr. Steven Prehn Senior Staff Research Engineer
Senior Product Manager – Vision General Motors Corporation
FANUC Robotics, Inc. 30500 Mound Road
3900 W. Hamlin Road Warren, Michigan 48090
Rochester Hills, Michigan 48309 Phone: 810-602-9879
Phone: 248-276-4065 Fax: 856-856-0574
Email: steven.prehn@fanucrobotics.com Email: james.w.wells@gm.com
Mr. Steven West
Development Manager – Robotic Vision Technology
ABB, Inc.
1250 Brown Road
Auburn Hills, Michigan 48326
Phone: 248-393-7120
Fax: 248-391-8532
Email: steven.w.west@us.abb.com

Mr. Brian Windsor


Business Development Manager – Machine Vision
SICK, Inc.
6900 West 110th Street
Minneapolis, Minnesota 55438
Phone: 952-941-6780
Fax: 952-941-9287
Email: brian.windsor@sick.com

Mr. David Wyatt


Staff Engineer
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2073
Fax: 248-409-2027
Email: dwyatt@appliedmfg.com
ICVGR Conference Agenda
Tuesday, September 30, 2008
7:00 am to 8:00 am Registration and Continental Breakfast

8:00 am to 8:15 am Conference Overview and Introductory Remarks

8:15 am to 9:45 am The Basics of Robotics


Bob Rochelle, North American Sales Manager,
Kawasaki Robotics (USA), Inc.

9:45 am to 10:00 am Break

10:00 am to Noon The Basics of Machine Vision


David Dechow, President, Aptúra Machine Vision Solutions

Noon to 1:30 pm Group Luncheon

1:30 pm to 5:00 pm Successfully Integrating Vision Guided Robotics


David Dechow, President, Aptúra Machine Vision Solutions

Evening Optional Group Dinner

Wednesday, October 1, 2008


7:30 am to 8:30 am Registration and Continental Breakfast

8:30 am to 8:45 am Review of Day One and Preview of Day Two

Moderator: Frank Maslar, Advanced Manufacturing Technology


Development – Ford Motor Company

8:45 am to 9:30 am Technology Advances in 2D Vision Guided Robotics


John Keating, In-Sight Product Manager, Cognex Corporation

9:30 am to 10:15 am Top Lessons Learned in Vision Guidance Applications


Eric Hershberger, Senior Engineer & David Wyatt, Staff Engineer
Applied Manufacturing Technologies

10:15 am to 10:30 am Break

10:30 am to 11:00 am How Advancements in Vision Guidance Are Making
Flexible Feeding Applications Desirable
Eric Lewis, President, Flexomation

11:00 am to 11:30 am Vision Guided Robot Applications for Packaging & Flexible Feeding
Mark Noschang, Applications Engineer, Adept Technology

11:30 am to 1:30 pm Group Lunch and Tabletop Exhibits


See the offerings of leading vision and robotics companies from
around the world who can assist you with your specific needs.
1:30 pm to 2:15 pm High Accuracy Robot Calibration, Wireless Networking, and
Related Technical Issues
Eric Hershberger, Senior Engineer & David Wyatt, Staff Engineer,
Applied Manufacturing Technologies

2:15 pm to 2:45 pm Vision Based Line Tracking


Frank Maslar, Advanced Manufacturing Technology Development,
Ford Motor Company

2:45 pm to 3:00 pm Break

3:00 pm to 3:30 pm Case Study: Robots & Vision in the Automated Pharmacy
David Arceneaux, Business Development & Marketing, Stäubli Robotics

3:30 pm to 4:00 pm Unmanned Systems Intelligence, Vision and Automation Concepts


for Combat Engineer and Other Battlefield Missions
Jerry Lane, Director, Great Lakes Office, Applied Research Associates, Inc.

4:00 pm to 5:15 pm Tabletop Exhibit Viewing and Reception


Your chance to spend more time with the exhibitors while enjoying
refreshments and networking with your peers.

6:00 p.m. to 10:00 p.m. Exclusive Dinner/Comedy Event


To maximize networking, on October 1st attendees will be transported by bus for dinner at one of metro Detroit’s
finest Italian restaurants, Andiamo. This will be immediately followed by a comedy show at The Second City,
whose unique brand of social and political satire mixed with improvisation has delighted audiences for over 45
years. Sports comedy will be the theme – be prepared for raucous laughs!

Thursday, October 2, 2008


7:30 a.m. to 8:30 a.m. Continental Breakfast

Moderator: Frank Maslar, Advanced Manufacturing Technology


Development – Ford Motor Company
8:30 am to 9:15 am International Trends and Applications in 3D Vision Guided Robotics
Adil Shafi, President, SHAFI Innovation Inc.

9:15 am to 9:45 am Advances in 3D Vision Guided Robotics at Fraunhofer IPA


Jens Kuehnle, Research Associate, Fraunhofer IPA

9:45 am to 10:15 am Vision Guided Part Loading/Unloading from Racks for Automotive
Applications – Lessons Learned
Robert Anderson, New Technology Manager, Chrysler LLC

10:15 am to 10:30 am Break

10:30 am to 11:00 am Random Bin Picking Technical Challenges and Approach


Babak Habibi, CTO, Braintech Inc.
11:00 am to 11:30 am Random Bin Picking Applications/Solutions
Steven West, Business Development Manager, ABB Inc.

11:30 am to Noon The Need for Generic 3D Bin Picking


René Dencker Eriksen, CTO, Scape Technologies

Noon to 1:00 pm Group Luncheon

1:00 pm to 1:45 pm Robot Visual Servoing – Opportunities and Challenges Ahead


Jane Shi, Senior Research Scientist & James Wells, Senior Staff
Research Engineer, General Motors Corporation

1:45 pm to 2:15 pm 3D Robot Guidance for Cosmetic Sealer Applications


Kevin Taylor, Vice President, ISRA Vision Systems, Inc.

2:15 pm to 2:45 pm Combining Machine Vision and Robotics to Mimic Complex


Human Tasks
Michael Muldoon, Business Solutions Engineer, Averna Vision & Robotics

2:45 pm to 3:00 pm Break

3:00 pm to 3:30 pm Using 3D Laser Scanning for Robotic Guidance


Brian Windsor, Business Development Manager, SICK, Inc.

3:30 pm to 4:00 pm Vision Options for “Dual Arm” Robot Guidance


Greg Garmann, Software & Controls Technology Leader, Motoman Inc.

4:00 pm to 4:30 pm Distance, Pitch & Yaw from a 2D Image


Steve Prehn, Senior Product Manager – Vision, FANUC Robotics America

4:30 pm to 5:00 pm VGR Panel Discussion


Your opportunity to ask specific questions and get insight from this
experienced panel of VGR leaders.
Thank you to our
Corporate Sponsors:
Thank you to our
Media Sponsors:
The Basics of Robotics

Presented by:

Bob Rochelle
Kawasaki Robotics USA
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Bob Rochelle
North American Sales Manager
Kawasaki Robotics USA

Bob Rochelle
Kawasaki Robotics USA
28140 Lakeview Drive
Wixom, Michigan 48393
Phone: 248-446-4211
Fax: 248-446-4200
Email: bob.rochelle@kri-us.com

Bob Rochelle has a Bachelor’s and Master’s degree in Engineering from Virginia Tech
and holds numerous US and International patents in the automation and food packaging
fields. He has been in the Automation Industry for over 25 years and has held positions
as Design Engineer, Project Manager, R & D Engineer, Engineering Manager, Sales
Engineer and Sales Manager.

He is currently the North American Sales Manager at Kawasaki Robotics with


responsibility for robot and system sales through a direct sales staff or via an Integrator
Network located throughout North, South and Central America.

Bob is a veteran seminar speaker and has taught General Engineering, Project
Management and Robotics for Baker College in Southeast Michigan. He is also the
Chair for the RIA’s New Markets Committee.
The Basics of Robotics

Bob Rochelle
North American Sales Manager
Kawasaki Robotics
References

‹ References:
‹ Robotic Industries Association www.robotics.org
‹ Kawasaki Robotics (USA) Inc. www.kawasakirobotics.com
‹ Denso Robotics www.densorobotics.com
‹ Advance Products Corp www.advanceproductscorp.com
‹ Practical Robotics Services www.prsrobots.com
‹ TDI Covers www.tdicovers.com
‹ Adept Technology www.adepttechnology.com
‹ PAR Systems, Inc www.par.com
‹ Conveying Industries Inc www.conveyind.com
‹ ANSI / RIA Standard R15.06 - 1999
‹ Handbook of Industrial Robotics Edited by Shimon Y. Nof
‹ The Top 10 Application Mistakes Article by George Martin
Outline
‹ Flexible Automation
‹ The Robot Industry
‹ Yesterday, Today, Tomorrow

‹ Terms and Types of Robots

‹ Basic Robot Technology


‹ Mechanical, Controls, Programming

‹ Tooling

‹ Robots in Systems
‹ Robot Based Systems

‹ Vision

‹ Examples - Case Studies

‹ Final Thoughts
Robotics = Flexible Automation
‹ Manual
‹ Quick product change
‹ Breaks
‹ Monotonous tasks
‹ Health Claims
‹ Dedicated Automation
‹ High Volume
‹ Requires Set-up time
‹ More maintenance
‹ Air Cylinders / actuators
‹ Rigid conveyors / fixtures
‹ Repeatability by fixtures
‹ Flexible Automation
‹ Quick product change
‹ Programmable systems
‹ Higher initial Cost
‹ Changeable Cell configuration
‹ Responds to Part Changes
The Robot Industry

‹ History
‹ Today
‹ Tomorrow
‹ General Terms
‹ Types of Industrial Robots
First “Robots”
‹ Steam Man and Electric Man
‹ Robota
‹ Czech word for “forced labor” or “serf”
‹ Karel Capek - Rossum’s Universal Robots
‹ Written in 1920, Premiered in Prague in
1921
‹ Translated into English and performed in
New York in 1923
‹ Isaac Asimov
‹ Coined the word Robotics
‹ 1950’s wrote the Robot Series - part of the
Foundation Series
‹ Drafted the Three Laws of Robotics.
Today’s Industrial Robots

‹ People
‹ George Devol
‹ Joseph Engelberger – “Father of Robotics”
‹ History
‹ 1956
‹ George Devol & Joseph Engelberger met
‹ Began development work of first commercial robot
‹ First Working Model late 1956
‹ 1961 - First Installation
‹ GM - Die Cast Part Extractor
‹ Patented in 1961
‹ Formed Unimation
Early Industrial Robots

‹ Unimation
‹ Universal Automation
‹ Unimate Robot
‹ 4000# Arm
‹ Step by Step Commands
stored on a magnetic drum
‹ Hydraulic Actuators
‹ $100,000 Plus Price

Puma

Programmable
Universal
Machine for
Assembly
Robot Industry - Today

‹ Over 850,000 at work today


‹ Over 100,000 sold per year
‹ Revenue
‹ $5,000,000,000 - robots
‹ $15,000,000,000 - systems
‹ Growth rate greater than 18% yearly
‹ Largest Users
‹ Automotive - 47%
‹ Electronic -15%
‹ Major Applications
‹ Material Handling - 39%
‹ Welding - 30%
‹ Assembly - 8%
Robots Today
90% of Industries that could use robotic automation have yet to consider purchasing their first one.
Applications
‹ Spot Welding
‹ Arc Welding
‹ Coating & Dispensing
‹ Less than 10 pounds
‹ Greater than 10 pounds
‹ Assembly
‹ Less than 10 pounds
‹ Greater than 10 pounds
‹ Material Handling
‹ Packaging / Palletizing
‹ Machine Tending
‹ Body Shop
‹ Other Material Handling
‹ Material Removal
‹ Inspection
Defined by Robotic Industries Association
New Markets and Applications
‹ Service Industry
‹ RoboBar, Food Service
‹ Care for the Elderly
‹ Humanoids
‹ Medical and Pharmaceutical Industries
‹ Prescription Dispensers
‹ Lab Automation
‹ Surgery System – Doctor Guidance
‹ Prosthetics Research and Design
‹ Construction
‹ Manufactured Housing
‹ Machining
Flexible Manufacturing
Terms and Types of Robots

‹ Common Industry Terms and Concepts


‹ Various Types of Industrial Robots
General Terminology
‹ Work Envelope, Work Space or Reach
‹ The set of points representing the
maximum extent or reach of the robot
hand or working tool in all directions.
Also referred to as the working envelope
or robot operating envelope.
‹ All encompassing range of motion
‹ Payload
‹ The maximum total weight that can be
applied to the end of the robot arm
without a sacrifice of any of the
applicable published specifications of
the robot.
‹ Weight carrying capacity
‹ Cycle Time or Speed
‹ Execution time for one task
The Axes – Degrees of Freedom

‹ Degrees of Freedom - Axes


‹ One of a limited number of ways in which a robot joint may move.
‹ Joint 1 - Base Rotation
‹ Joint 2 - Rotation of the lower arm
‹ Joint 3 - Rotation of the upper arm
‹ Joint 4 - Swivel of the upper arm
‹ Joint 5 - Bend of the wrist
‹ Joint 6 - Rotation of tool mounting plate
‹ Joint 7 - ??? - Traverse, Turntable, or other motions

‹ Coordinates
‹ Base or World - Origin is in the robot base
‹ Tool Coordinates - Origin is the Tool Center Point
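The idea behind base (world) versus tool coordinates — joint rotations determining where the tool center point lands in the base frame — can be illustrated with a minimal planar two-joint sketch. The link lengths, joint names, and angles below are invented for illustration and do not correspond to any particular robot:

```python
import math

def planar_tool_position(theta1_deg, theta2_deg, l1=0.5, l2=0.3):
    """Tool center point (x, y) in base coordinates for a 2-link planar arm.

    theta1_deg: rotation of the lower arm about the base (a Joint 2 analog)
    theta2_deg: rotation of the upper arm relative to the lower arm (a Joint 3 analog)
    l1, l2: link lengths in meters (illustrative values)
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta1_deg + theta2_deg)  # upper-arm angle accumulates
    x = l1 * math.cos(t1) + l2 * math.cos(t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t2)
    return x, y

# Fully extended along X: both joints at 0 degrees gives (0.8, 0.0)
print(planar_tool_position(0, 0))
```

A real 6-axis arm does the same accumulation in three dimensions, with the tool frame offset further by the mounted end effector.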
Multiple Axis System

Axis 7 - Turntable

Axis 1 to 6 - Robot

Axis 8 and 9 – Part Rotators


Axis 10 and 11 – Part Spinners
Common Industrial Robots

‹ Cartesian / Gantry
‹ SCARA
‹ Telescopic
‹ Parallel
‹ Articulated
‹ Modular
Cartesian / Gantry Robots

‹ Four Plus Axes


‹ Simple Motions
‹ Linear X, Y, Z
‹ Tool Rotation
‹ Components
‹ Base / Superstructure
‹ Arm / Runway
‹ Telescope / Carriage
‹ Controls

Packaging / Machining / Water Jet Cutting / Palletizing


SCARA Robots
‹ Four Degrees of Freedom / Advanced Control
‹ One Linear Axis and multiple rotary axes
‹ Motions
‹ Rotational
‹ Linear Z Axis
‹ Highly Accurate
‹ ± 0.015 mm
‹ Fast and Vibration Free
‹ Adept Cycle (25 mm up, 300 mm across, 25 mm down): 0.30 – 0.35 seconds

Packaging / Assembly / Insertion


Telescopic Robots

‹ Clean Room applications


‹ 3, 4 and 5 Axis designs
‹ Specific to Application
‹ Wafer Handling Systems
‹ Flat Panel Screens
Semiconductor Industry – Clean Room
Parallel Robots
‹ Tripod with three axes
‹ Hexapod with six axes
‹ Very Stiff
‹ Accurate
‹ High Speed

High Speed Pick and Place


Articulated Robots

‹ Most Common / Most Flexible


‹ 4, 5 or 6 Degrees of Freedom
‹ Rotational Motions
Modular Robots

‹ System with a combination of robot types


Beyond Industrial Robots?
Robot Technology
• Robot Mechanical Components
• Robot Controls
• Robot Programming
• Robot Communication

[Figure labels: Arm or Manipulator; Controller]

Arm or Manipulator

[Figure labels: Arms; Wrist; Joint 3 Motor (in rear); Joints 4, 5 & 6 Motors; Joint 2 Motor; Joint 1 Motor; Counter Balance; Fork Lift Pockets; Base]
Mounting and Environment
‹ Mounting
‹ Floor, Ceiling or Walls
‹ Proper Fasteners - no Casters
‹ Tracks or Traverse Units

‹ Typical Environmental Specifications


‹ IP65 / 67 Standard
‹ Ambient Temperature: 0 - 52°C
‹ Relative Humidity: 35% - 85% Non Condensing
‹ Optional: Clean Room / Wash down
‹ Hazardous Duty Units - Spray painting
Robot Controllers

‹ Two Components
‹ Controller
‹ Teach Pendant
‹ Design
‹ Microprocessor based
‹ Programmable
‹ Generally One Controller per Robot
‹ Multi Controllers available
Teach Pendant
‹ Design
‹ Hand Held
‹ Programmer's Interface to Robot Controller and Programs
‹ LCD Display
‹ Hard keys for Functions / Keyboard
‹ Functions
‹ Communicates with Controller
‹ Dead man Switches
‹ E - Stop
‹ Monitor
‹ Teaching / Programming
‹ User Interface to robot
‹ Operator’s System Interface Possibility
Communication and Networks
‹ Discrete I/O
‹ Photocoupler, relays, transistors
‹ Relay modules add on
‹ Remote I/O to PLC’s
‹ DeviceNet
‹ Master, Slave, Master & Slave
‹ Profibus
‹ Master, Slave, Master & Slave
‹ Interbus
‹ Ethernet
‹ TCP / IP, I/O adaptor
‹ RS232 / RS485

‹ Internet
‹ Intranet
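As a rough illustration of the Ethernet (TCP/IP) option above, the sketch below sends an ASCII trigger message over a local socket pair standing in for a controller link. The `RUN <n>` message format is hypothetical — every real controller defines its own protocol, documented by its manufacturer:

```python
import socket

def send_trigger(sock, program_number):
    """Send a simple ASCII trigger message over a connected socket.

    The message format here is made up for illustration; real robot
    controllers each define their own command protocol.
    """
    msg = f"RUN {program_number}\n".encode("ascii")
    sock.sendall(msg)

# A local socket pair stands in for the Ethernet link to a controller
a, b = socket.socketpair()
send_trigger(a, 42)
assert b.recv(64) == b"RUN 42\n"
a.close()
b.close()
```

The same `sendall`/`recv` pattern applies whether the peer is a PLC, a vision controller, or a robot controller listening on a TCP port.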
Programming

‹ Teach Pendant
‹ Programmer holds the teach pendant
‹ Manually teaches the robot
‹ Off Line Programming
‹ Program written remotely
‹ Higher level language
‹ Loaded into Robot Controller
‹ Touch up required
‹ No additional hardware is needed.
‹ Check Programs
‹ Slow speed operation
‹ Program Storage
‹ Flash RAM
‹ PC Hard Drive
‹ Other media

[Photos: Teach Pendant Programming; PC Programming]
Basic Robot Motion Teaching
‹ Motion Instruction
‹ Defines a target position
‹ Interpolation Instruction
‹ Defines how to get to the position
‹ Joint Move - Robot articulates any axis to accomplish the move
‹ Linear Move - Maintains the tool in the orientation specified
‹ Circular Move - Generated by defining three points and a radius to scribe a circle
‹ Speed
‹ Expressed in percent of full speed or a software settable maximum speed.
‹ Termination Instruction
‹ Expressed as a number [1 - 9] most to least accurate.
‹ Defines approach to the target position
‹ Additional Programming Activities
‹ Activities to be complete before moving to the next target position
‹ I / O switching
‹ Data acquisition
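The instruction elements above — target position, interpolation type, speed as a percentage, termination accuracy, and I/O activities — can be sketched as a data structure. The names and field layout below are illustrative only, not any vendor's actual teach-pendant language:

```python
from dataclasses import dataclass, field

@dataclass
class MoveInstruction:
    """One taught motion step (field names are invented for illustration)."""
    target: tuple            # taught target position, e.g. (x, y, z)
    interpolation: str       # "joint", "linear", or "circular"
    speed_pct: int           # percent of full (or software-limited maximum) speed
    termination: int = 1     # 1 (most accurate) .. 9 (least accurate)
    io_actions: list = field(default_factory=list)  # I/O switching before next move

def validate(step):
    """Check a step against the ranges described in the slide."""
    assert step.interpolation in ("joint", "linear", "circular")
    assert 0 < step.speed_pct <= 100
    assert 1 <= step.termination <= 9
    return True

# A two-step fragment: fast joint move to an approach point, then a slow,
# accurate linear move down to the part with an I/O action at the end.
program = [
    MoveInstruction((500, 0, 300), "joint", 100, termination=9),
    MoveInstruction((500, 0, 50), "linear", 25, termination=1,
                    io_actions=["close_gripper"]),
]
assert all(validate(s) for s in program)
```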
Repeatability

‹ Repeatability
‹ Ability of the robot to return to a preprogrammed position.
‹ Closeness of agreement of repeated position movements under
the same conditions to the same location.

Assume repeatability to be +/- 0.004”.
The robot can position anywhere within the 0.008” diameter circle and still fall within its repeatability specification.
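A minimal sketch of checking recorded positions against such a ±0.004” repeatability specification — every point must fall within the 0.008” diameter circle around the mean position. This is purely illustrative; real verification follows procedures such as ISO 9283:

```python
import math

def within_repeatability(points, spec=0.004):
    """True if every recorded (x, y) position lies within `spec` (radius)
    of the mean position, i.e. inside the 2*spec diameter circle."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    return all(math.hypot(x - cx, y - cy) <= spec for x, y in points)

# Positions clustered within +/- 0.004" of their mean pass the check
print(within_repeatability([(1.000, 2.000), (1.003, 2.001), (0.998, 1.999)]))  # True
```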


Robots In Systems
‹ Who’s Who in Robot System Industry
‹ Tooling
‹ Control Systems
‹ Systems
‹ Vision
‹ Safety
Who’s Who in the Robotics Industry
‹ Robot Manufacturers
‹ Manufactures the robot
‹ Provides robot training, maintenance and service
‹ System Integrator [System Builder]
‹ Integrate the robot into a system to perform a specified task
‹ Independent business, industry specific, some allegiance to robot manufacturer
‹ Has knowledge of End User’s business
‹ Designs and builds the robot based system
‹ Purchases robot and all peripheral equipment
‹ Designs and builds systems, writes and maintains programs
‹ Trained on entire cell / provides training on system
‹ Provides system components, installation, training, service and support
‹ End Users
‹ Uses the robotic-based system in production or processing
‹ Knows what is required to accomplish tasks
‹ Ultimate user - needs training, service, maintenance, spare parts
Tooling / End Effectors / E.O.A.T
‹ The tool attached to the robot manipulator or arm that actually performs the work.
‹ Examples
‹ Vacuum Cups
‹ Grippers
‹ Spatulas / Fingers
‹ Spray Nozzles
‹ Dispensers
‹ Buffing Wheels
‹ Machine Tools
‹ Water Jets
‹ Welding Torches / Resistance Welding Guns
‹ Saws
‹ Laser Cutters
‹ Ladles
‹ Adds to the Work Envelope
‹ Adds to the Payload / Torque / Inertia
Tooling Considerations
‹ Parts Fixtures
‹ Repeatable and Positive
‹ Sensors
‹ Part locators / verification of action / QC
‹ Tool Changers
‹ Quick change / machine set-up
‹ Environmental Considerations

‹ No Parts Fixture?
‹ Can Locate

‹ Do I move the part ?


‹ Do I work on a stationary part?
System Control Philosophy

‹ Philosophy 1
‹ Robot Controller does all
‹ System I/O, Tooling Control, Motion Control, Operator Interface

‹ Philosophy 2
‹ Robot Controller
‹ Tooling Control, Motion Control

‹ PLC or PC
‹ System I/O, Operator Interface

‹ Philosophy 3
‹ Robot Controller
‹ Motion Control only

‹ PLC or PC
‹ System I/O, Tooling Control, Operator Interface
Robot System Safety
‹ Responsibility
‹ Robot Manufacturer
‹ Integrator / System Builder / Installer
‹ User
‹ Refer to Resources
‹ ANSI / RIA R15.06-1999
‹ OSHA Standards
‹ CUL / UL [Underwriters Laboratories]
‹ Hazardous materials requirements
‹ Local Codes
‹ Good manufacturing practices
‹ Plant Standards
‹ Personnel training policies

ANSI/RIA R15.06-1999
Robotic Industries Association
Ann Arbor, Michigan 48106
(734) 994-6088
www.robotics.org

System Development Process
‹ Identify the System Specifications
‹ What do you want to do?
‹ Existing Process, Reach, Payload, Speed, Operator Involvement,

QC Issues, Interface with Production System, Technological


Capability of User
‹ Who is going to Integrate the system?
‹ End user, Integrator, Robot Manufacturer, Combination

‹ System Design and Build


‹ Preliminary Layouts and Design Proposal
‹ Space Required, Parts Movement, Tooling, Safety Concerns, I/O,

Interfaces and Communication, Operator Involvement


‹ Simulations / Cycle Time Study / Verification Tests
‹ Build and test the system prior to shipment
‹ System Start Up and Commissioning
‹ Installation, Start-up and Customer Acceptance
‹ Continuous Improvement
Industrial Robot Systems
‹ System Components
‹ Robot and Controller
‹ Arm Dressing and Risers
‹ End of Arm Tooling
‹ Parts Fixtures or Locators
‹ Interfaces
‹ Pneumatics
‹ Sensors
‹ Electrical Components
‹ Cables
‹ Peripheral Equipment
‹ Varies by application

‹ PLC or External Control


‹ Communication via Network or Discrete I/O

‹ Safety Components
‹ Fence, Gates, Interlocks, Light Curtains, Barriers, Awareness Beacons
Selecting a Systems Integrator
‹ Determine if the Integrator has experience in your industry
‹ Transferable knowledge
‹ Evaluate the Integrator’s background and capabilities
‹ Full Service
‹ Commercial Issues
‹ Check references
‹ The Integrator’s
‹ Robot Manufacturers
‹ Prepare for disaster
‹ What happens?
‹ After sale maintenance
‹ Integrator / Robot manufacturer
‹ Cost
‹ Is the lowest bid the best?
Vision Systems
‹ Peripheral Equipment
‹ Camera
‹ Camera Controller
‹ Light Source
‹ Calibration Check Means
‹ Robot Components
‹ Robot and Controller
‹ Interface to Camera Controller
‹ Software
‹ Applications
‹ Part Location
‹ Inspection
‹ Bin Picking
‹ Real Time Feedback
Bin Picking
Locating or Orientating Parts
[Figure labels: Cameras; Camera; Parts Rack]
Robot Guidance

‹ Real Time
‹ Welding
‹ Seam Sealing
‹ Dispensing
10 Reasons to Invest in Robotics

1. Reduce Operating Costs


2. Improve Product Quality and Consistency
3. Improve Quality of Work Environment for Employees
4. Increase Production Output Rates
5. Increase Product Manufacturing Flexibility
6. Reduce Material Waste and Increase Yield
7. Comply With Health and Safety Standards
8. Reduce Labor Turnover and Difficulty Recruiting Workers
9. Reduce Capital Costs like Inventory, WIP
10. Save Space in High Value Manufacturing Areas

Expect System Reliability > 99.5%


The Green Sand Casting Process
‹ Green Sand Casting Process
‹ Create the mold
‹ mixture of sand, clay and moisture

‹ simple materials

‹ materials can be reused or regenerated

‹ low cost materials

‹ Pour molten metal into the molds


‹ Remove the parts
‹ Machining or clean up is required
‹ Green Sand Cast Parts
‹ Require surface finish
‹ Lowest cost casting process
‹ Labor intensive process
‹ Automated mold creating
‹ Recently automated the pouring process
‹ Manual parts removal
Robotic Pouring
‹ Customer’s Results
‹ Four times the capacity
‹ impeded by peripheral equipment

‹ One part every 30 seconds


‹ Reduced labor by three per shift
‹ Energy reduction
‹ automatic furnace lid closure provides insulation

‹ Operator Safety is vastly improved


‹ Reduced material use
‹ same quantity for every part

‹ Parts consistency is 100% reliable


‹ repeatable process

‹ Increased Parts Quality


‹ metal heat more consistent

‹ pour efficiency
10+ Mistakes in Robot Integration
‹ Underestimating Payload and Inertia.
‹ Expecting the robot to do too much.
‹ Underestimating Cable Management Issues.
‹ Not considering all current and future application needs.
‹ Misunderstanding accuracy and repeatability.
‹ Focusing on the robot alone.
‹ Not planning for disaster.
‹ Overlooking the need for options or peripheral equipment for a system.
‹ Not fully utilizing the capabilities of a robot.
‹ Choosing a robot or system solution solely on price.
‹ Thinking that robots are too complicated.
‹ Failure to consider using robotic technology.

Expect System Reliability > 99.5%


Applications
‹ Welding
‹ Spot Welding
‹ Plasma Welding
‹ Coating & Dispensing
‹ Glue Dispensing
‹ Paint
‹ Packaging / Palletizing
‹ Bag Palletizing
‹ Box Palletizing
‹ Muffin Packaging
‹ Material Handling
‹ Press Tending
‹ Forging
‹ Investment Casting
‹ Machine Tending
Defined by Robotic Industries Association
‹ Die Casting
Contact Information

Bob Rochelle
North American Sales Manager
Kawasaki Robotics
28140 Lakeview Drive
Wixom, Michigan 48393
USA
Telephone: 248-446-4211
Email: bob.rochelle@kri-us.com
Web: www.kawasakirobotics.com
The Basics of Machine Vision

Presented by:

David Dechow
Aptúra Machine Vision Solutions
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Dechow
President
Aptúra Machine Vision Solutions

David Dechow
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
Phone: 517-272-7820, x11
Email: ddechow@aptura.com

David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr.
Dechow has worked in the field of machine vision for over 25 years as a programmer,
engineer, and manager. He served 14 years on the AIA board of directors, and was a
two term president of that board. Mr. Dechow is the 2007 recipient of the AIA
Automated Imaging Achievement Award honoring industry leaders for outstanding
contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at
conferences and seminars, and a frequent contributor to industry trade journals and
magazines and has served on the editorial boards of Vision Systems Design magazine
and Quality Magazine’s Vision and Sensors.
The Basics of Machine Vision

David Dechow
President
Aptúra Machine Vision Solutions
Session Outline

1 Overview/Introduction to Machine Vision

2 Imaging, Lighting and Optics

3 Machine Vision – Getting Data From Images

4 Application Analysis and Specification


The Basics of Machine Vision

MACHINE VISION INTRODUCTION


Overview

• Machine Vision
– Machine vision is the substitution of the human visual sense and
judgment capabilities with a video camera and computer to
perform an inspection task. It is the automatic acquisition and
analysis of images to obtain desired data for controlling or
evaluating a specific part or activity.
– Key Points:
• Automatic – self-acting
• Acquisition and analysis – machine vision uses both
• Non-contact
• Data acquisition – value of the technology
• Control – necessary for reasonable ROI
Overview

• Machine Vision Integration


– Machine vision systems integration is the process
where significant value is added to a machine vision
component by the incorporation of software,
peripheral hardware, mechanical devices, materials
and engineering.
Overview

• Prerequisite Integration Expertise:


– Application-based lighting and optics
– Understanding of imaging and input devices
– Electrical and mechanical engineering
– Industrial automation systems and components
– Machine vision algorithms
– Programming and/or system configuration
– Project management and customer support
Process Overview

• Process flow:
– Initiate Inspection – external event
– Acquire Image – hardware execution; camera and strobe trigger (if applicable)
– Analysis – software execution of inspection program
– Process Result – determine part status and communicate results; external event
• Data supporting the flow: part tracking, multiple results, multiple images/lights, recipe changeovers, other data
System Architectures

• Machine Vision Systems
– Camera (lens, imager, electronics) → power/control signal → frame grabber or other signal conversion → digital image → computer
System Architectures

• Machine Vision Systems, continued
– Camera (lens, imager, electronics) → frame grabber or other signal conversion → digital image → computer
– Optional computer for operator interface
System Architectures

• Machine Vision Systems, continued
– Camera (lens, imager, electronics) → frame grabber or other signal conversion → digital image → computer
– Connected monitor for operator interface
System Architectures

• Machine vision hardware is an “image delivery


system”!
– Differentiation of products at the hardware level is
limited
• Physical structure and system layout
• Available camera resolutions
• Input/output options
• Other hardware integration capability
System Architectures

• Machine vision software drives component


capability, reliability, and usability
– Available image processing and analysis tools
– Ability to manipulate imaging and system hardware
– Methodology for inspection task configuration
• Main component differentiation is the software
implementation
• Often, system software complexity increases
with system capability
The Basics of Machine Vision

IMAGING, LIGHTING AND OPTICS


Imaging, Lighting and Optics

• Key Issues
– Imaging
• Application requirements will dictate image space and camera resolutions
– Lighting
• The purpose of lighting for machine vision is to create the highest level of
contrast between features to be inspected relative to the background or
other features
• Competent lighting technique contributes over 80% to the success of an
application
– Optics
• Most machine vision applications use “off the shelf” optics
• Select proper machine vision quality lenses
Imaging Basics

• Image Acquisition
– Performed by a light-gathering silicon device
• CCD, CID, CMOS …
– The imaging chip comes in a variety of physical layouts
• Area

• Line
– Size of the chip varies widely as does the number of individual
picture elements (pixels)
• Typical area chip for machine vision: from .3 to 4+ Mpix
• Physical sizes from ¼" diag. up
• Typical line scan array: from 1K to 12K+
Imaging Basics

• Cameras
– Image sensor supported by electronic circuitry and
packaged for industrial use
– Final output may be analog or digital
• RS170, CCIR, NTSC, PAL, USB, FireWire (1394 a/b),
Camera Link, GigE

Camera
Lens
Electronics
Imager
Power/Control In
Signal Out
Imaging Basics

• Digital image representation


– Common thread is the internal representation of the
image as seen by most algorithms
– The image is stored as a single or multiple image
planes containing arrays of integer numbers
– Each number represents one small section of the
image – a pixel (picture element)
– Lower numbers are darker, higher numbers are
lighter
• Typical range is 0-63, 0-127, or 0-255
Imaging Basics

• Internal representation – gray scale image

255 255 255 105  51  41  43  49 101 255 255 255
255 255 255 116  62  44  42  57 120 255 255 255
255 255 255 112  68  41  46  58 117 255 255 255
105 110 111 109  60  42  48  61 115 112 114 108
 60  68  62  57  42  41  46  41  43  49  42  41
 44  42  41  46  46  42  48  44  42  42  46  42
 41  46  42  48  44  42  41  41  46  43  49  42
 59  54  60  59  41  46  42  46  46  42  48  46
100 120 120 115  51  41  43  49 110 116 118 105
255 255 255 118  62  44  42  57 115 255 255 255
255 255 255 121  68  41  46  58 120 255 255 255
255 255 255 100  60  42  48  61 105 255 255 255
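A small sketch of working with such an array directly — counting the pixels darker than a threshold, much as a feature/background separation would. The image values and the threshold of 100 are chosen arbitrarily for illustration:

```python
# A small gray-scale image as rows of 0-255 pixel values (0 = black)
image = [
    [255, 255, 116,  62,  44, 255],
    [105, 110,  60,  42,  48, 108],
    [ 44,  42,  41,  46,  42,  46],
    [255, 255, 118,  62,  44, 255],
]

def count_dark_pixels(img, threshold=100):
    """Count pixels darker than `threshold` (an arbitrary cut for this sketch)."""
    return sum(1 for row in img for px in row if px < threshold)

print(count_dark_pixels(image))  # 13 of the 24 pixels are darker than 100
```

Real machine vision tools apply the same idea with far more sophistication (adaptive thresholds, connectivity analysis), but the underlying data is still this grid of integers.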


Imaging Basics

• Color Images
– Color images commonly are acquired and internally
represented as three planes of digital data – one each
for Red, Green, and Blue
– Difference between 3-chip color and Bayer Filter
– Other representations such as HSI and LAB are derived
from the RGB data
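As a sketch of deriving another representation from the three RGB planes, the function below computes a gray intensity plane using the ITU-R BT.601 luma weights — one common weighting; specific vision packages may weight the channels differently:

```python
def rgb_to_intensity(r, g, b):
    """Derive a gray-value plane from R, G, B planes of equal size.

    Uses ITU-R BT.601 luma weights (0.299, 0.587, 0.114) -- one common
    choice; other packages may use different coefficients.
    """
    return [[round(0.299 * r[i][j] + 0.587 * g[i][j] + 0.114 * b[i][j])
             for j in range(len(r[0]))] for i in range(len(r))]

# One pure-red pixel and one white pixel
r = [[255, 255]]
g = [[0, 255]]
b = [[0, 255]]
print(rgb_to_intensity(r, g, b))  # [[76, 255]]
```

The heavier green weight reflects the eye's sensitivity; a pure-red pixel maps to a fairly dark gray even though its red plane is saturated.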
Lighting Basics

• Illumination for machine vision


must be designed for imaging, not
human viewing
– Selection must be made relative to light structure, position, color, diffusion
– We need to know how light works so our light selections are not “hit and miss” guesswork
– Light is both absorbed and reflected to some degree from all surfaces
• Angle of incidence = angle of reflection
• When an object is clear or translucent, light is also transmitted (if object is not completely opaque)

[Figure labels: Light Source; Diffuse Reflection; Specular Reflection; Refraction, Absorption; Transmitted Light]
Lighting Basics

• Dedicated lighting must be used for machine vision with


few exceptions.
• Where feasible, LED illumination is the best source
– Long life with minimal degradation of intensity
– Able to be structured into a variety of shapes
• May be directional or diffuse
– May be strobed at very high duty cycles and overdriven to many
times nominal current specifications
– Available in many visible and non-visible colors
• Other sources – fluorescent, fiber-optics
Lighting Basics

• Lighting Techniques
– The goal of lighting for machine vision applications
usually is to maximize the contrast (grayscale
difference) between features of interest and
surrounding background
– Techniques are categorized generally by the direction
of the illumination source
• Most may be achieved with different sources
Lighting Basics

• Direct bright-field
illumination
– Sources: high-angle ring
lights (shown), spot-lights,
bar-lights (shown); LED’s
or Fiber-optic guides
– Uses: general illumination
of relatively high-contrast
objects; light reflection to
camera is mostly specular

Images: CCS America; www.ccsamerica.com


Lighting Basics

• Diffuse bright-field
illumination
– Sources: high-angle
diffuse ring lights (shown),
diffuse bar-lights; LED’s or
fluorescent
– Uses: general illumination
of relatively high-contrast
objects; light reflection to
camera is mostly diffuse

Images: CCS America; www.ccsamerica.com


Lighting Basics

• Direct dark-field
illumination
– Sources: low-angle ring
lights (shown), spot-lights,
bar-lights; LED’s or Fiber-
optic guides
– Uses: illumination of
geometric surface features;
light reflection to camera is
mostly specular
– “dark field” is misleading –
the “field” or background
may be light relative to
Images: CCS America surface objects
Lighting Basics

• Diffuse dark-field
illumination
– Sources: diffuse, low-
angle ring lights (shown),
spot-lights, bar-lights;
LED’s or fluorescent
– Uses: non-specular
illumination of surfaces,
reducing glare; may hide
unwanted surface features

Images: CCS America


Lighting Basics

• Diffuse backlight
– Sources: highly diffused
LED or fluorescent area
lighting
– Uses: provide an accurate
silhouette of a part

Images: CCS America


Lighting Basics

• Structured light
– Sources: Focused LED
linear array, focused or
patterned lasers
– Uses: highlight geometric
shapes, create contrast
based upon shape, provide
3D information in 2D
images

Images: CCS America,


Stocker & Yale; www.stockeryale.com
Lighting Basics

• On-axis (coaxial)
illumination
– Sources: directed, diffused
LED or fiber optic area
– Uses: produce more even
illumination on specular
surfaces, may reduce low-
contrast surface features,
may highlight high-contrast
geometric surface features
depending on reflective
angle

Images: CCS America


Lighting Basics

• Collimated illumination
– Sources: LED or fiber-optic
point sources with collimating
optics, most often arranged as
a backlight
– Uses: parallel-ray illumination
producing sharp, accurate
silhouette edges; well suited to
precision gauging, particularly
with telecentric lenses

Images: Edmund Optics; www.edmundoptics.com,


CCS America
Lighting Basics

• Constant Diffuse
Illumination (CDI –
“cloudy day illumination”)
– Sources: specialty
integrated lighting
– Uses: provides completely
non-specular, non-
reflecting continuous
lighting from all reflective
angles; good for reflective
or specular surfaces

[Diagram: Camera above a beam splitter; an on-axis Light Source through the beam splitter and an off-axis Light Source illuminating the Object]

Images: Siemens; www.nerlite.com


Lighting Basics

• Other lighting considerations


– Color
• The effect of monochromatic light on colored features
• Camera response to different colors
• White light and color imaging
• Non-visible “colors”
– Light degradation over time; component life, heat
dissipation
– Light intensity and uniformity
– Strobing
– Elimination of ambient and other stray light
Optics Basics

• Application of optical components


– Machine vision requires fundamental understanding
of the physics of lens design and performance
– Task is to competently specify the correct lens
• Create a desired field of view (FOV)
• Achieve a specific or acceptable working distance (WD)
• Project the image on a selected sensor based on sensor size
– primary magnification (PMAG)
– Goal, as always, to create the highest level of contrast
between features of interest and the surrounding
background; with the greatest possible imaging
accuracy
Optics Basics

• Considerations for lens selection


– Magnification, focal length, depth of focus (DOF), f-
number, resolution, diffraction limits, aberrations
(chromatic, spherical, field curvature, distortion),
parallax, image size, etc.
– The physics of optical design is well known and can
be mathematically modeled and/or empirically tested
• Specification or control of most of the lens criteria is out of
our hands
Optics Basics

• Considerations for lens selection


– Practical specifications for machine vision: PMAG (as
dictated by focal length) and WD to achieve a desired
FOV
• Use a simple lens calculator and/or manufacturer lens
specifications
• Simple – state the required FOV, the sensor size based on
physical selection of camera and resolution, and a desired
working distance – calculate the lens focal length
• Test your results
– Always use a high-resolution machine vision lens
NOT a security lens
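The “simple lens calculator” step above can be sketched with the thin-lens approximation (an illustrative sketch only; as the slide says, verify against manufacturer lens specifications and test the result):

```python
def required_focal_length(fov_mm, sensor_mm, working_distance_mm):
    """Estimate the lens focal length needed for a given FOV, sensor
    size, and working distance.

    Uses PMAG = sensor / FOV and the thin-lens relation
    f = PMAG * WD / (PMAG + 1).  All dimensions along the same axis.
    """
    pmag = sensor_mm / fov_mm                      # primary magnification
    return pmag * working_distance_mm / (pmag + 1)

# Example: 100 mm FOV on a 6.4 mm-wide sensor at 400 mm working
# distance (values are hypothetical).
f = required_focal_length(fov_mm=100.0, sensor_mm=6.4,
                          working_distance_mm=400.0)
print(round(f, 1))  # 24.1 -> choose the nearest standard focal length
```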
Optics Basics

• Why use machine vision lenses only


– Light gathering capability and resolution

Images: Edmund Optics; www.edmundoptics.com


Optics Basics

• Specialty Lenses
– Telecentric
– Microscope stages
– Macro, long WD
The Basics of Machine Vision

MACHINE VISION – GETTING DATA


FROM IMAGES
Machine Vision – Getting Data out of Images

• Inspection Concepts
– What are the capabilities and limitations of machine
vision technology for the target application
• Requirement: specify a processing direction to take with
respect to system architecture, and the ability to specify
deliverables, performance, and acceptance criteria
– Analysis of the inspection concept can be subdivided
by general type of inspection
• Assembly Verification/Recognition
• Flaw Detection
• Gauging/Metrology
• Location/Guidance
• OCR/OCV
Machine Vision – Getting Data out of Images

• Assembly Verification/Object Recognition


– Feature presence/absence, identification,
differentiation of similar features
– Imaging Issues
• Must create adequate contrast between feature and
background
• Accommodate part and process variations
• May require flexible lighting/imaging for varying features
• For feature presence/absence, feature should cover approx.
1% of the field of view (med. resolution camera), more for
identification or differentiation
Machine Vision – Getting Data out of Images

• Defect/Flaw Detection
– A flaw is an object that is different from the normal
immediate background
– Imaging Issues
• Must have sufficient contrast and geometric features to be
differentiable from the background and other “good” objects
• Typically must be a minimum of 3x3 pixels in size and
possibly up to 50x50 pixels if contrast is low and defect
classification is required
• Reliable object classification may not be possible depending
upon geometric shape of the flaws
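A back-of-the-envelope sketch of what the 3×3 and 50×50 pixel rules above imply for a given setup (the FOV and resolution numbers are hypothetical):

```python
def min_flaw_size_mm(fov_mm, pixels, min_flaw_pixels=3):
    """Smallest flaw dimension resolvable, given the rule that a flaw
    should span at least `min_flaw_pixels` pixels across the image
    (3 for detection, up to 50 if low-contrast classification is
    required)."""
    mm_per_pixel = fov_mm / pixels
    return min_flaw_pixels * mm_per_pixel

# 200 mm field of view imaged onto a 1600-pixel-wide sensor:
print(min_flaw_size_mm(200.0, 1600))      # 0.375 mm (detection, 3 px)
print(min_flaw_size_mm(200.0, 1600, 50))  # 6.25 mm (classification, 50 px)
```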
Machine Vision – Getting Data out of Images

• Gauging/Metrology
– Measurement of features
– There are physical differences between gauging
features in an image produced by a camera, and the
use of a gauge that contacts a part. These
differences usually cannot be reconciled
Machine Vision – Getting Data out of Images

• Gauging/Metrology
– Gauging concepts
• Resolution, repeatability, accuracy
• Sub-pixel measurement
• Measurement tolerances
• Resolution must be approximately 1/10 of required accuracy
in order to achieve gauge reliability/repeatability
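The 1/10 rule above can be used to sanity-check a camera choice for a gauging task (an illustrative sketch under that rule of thumb; the FOV, pixel counts, and tolerance are hypothetical):

```python
def pixel_size_ok(fov_mm, pixels, accuracy_mm):
    """Check the gauging rule of thumb: one pixel should cover at
    most 1/10 of the required measurement accuracy."""
    mm_per_pixel = fov_mm / pixels
    return mm_per_pixel <= accuracy_mm / 10.0

# Gauging to +/-0.1 mm over a 60 mm field of view:
print(pixel_size_ok(60.0, 1600, 0.1))  # False: 0.0375 mm/px is too coarse
print(pixel_size_ok(60.0, 8192, 0.1))  # True: ~0.0073 mm/px is fine
```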
Machine Vision – Getting Data out of Images

• Gauging/Metrology
– Imaging Issues
• Lighting to get a repeatable edge
– Backlighting, collimated light
• Telecentric lenses
• Calibration
– Correction for image perspective/plane
– Calibration error stack-up
Machine Vision – Getting Data out of Images

• Location/Guidance
– Identification and location of an object in 2D or 3D
space
– May be in a confusing field of view
– Imaging Issues
• Measurement tolerances and accuracies as described for
gauging/metrology applications
• Sub-pixel resolutions may be better than discrete gauging
results
• For guidance applications, the stack-up error in robot motion
may be significant
Machine Vision – Getting Data out of Images

• OCR/OCV
– Optical Character Recognition/Verification – reading
or verifying printed characters
– Can be fooled by print variations
– Verification is difficult depending upon the application
– Imaging Issues
• Consistent presentation of the character string
• May require extensive pre-processing
The Basics of Machine Vision

APPLICATION ANALYSIS AND


SPECIFICATION
Application Analysis and Specification

• Define the target application and inspection


criteria
– Describe the desired inspection
• Avoid discussion of machine vision technique
– Clearly define good part criteria and bad part criteria
– What is the reason for the inspection
– What will happen to a bad part
Application Analysis and Specification

• Define the part(s) to be inspected


– Include physical detail about geometric structure,
features
– Identify all possible part variations; color, size,
structure
– Describe the materials and surface finish of the part
– Will the part change over time
– Get photos, samples
Application Analysis and Specification

• Production Process Analysis


– Background information about how the part is
manufactured and moved
– Production rates, number of shifts
– What factors in the process cause the bad parts
– Benefits of implementing inspection
• What happens if a bad part gets through
• Will costs, yield, quality be improved
– What is the cost of a falsely rejected part
• Can rejects be recovered/repaired
Application Analysis and Specification

• Part Handling and Presentation


– Existing automation
• Physical description of the mechanism/conveyor including
background objects, surfaces, and colors at the target
inspection station
• Physical envelope available for inspection components
• Mounting surfaces available for inspection components
• Other processes taking place at the inspection station
• Other physical constraints or obstacles
• Reject method
• Interfacing method to existing controls system
– Inspection triggering, reject signal, alarms
– Operator interface requirements
Application Analysis and Specification

• Part Handling and Presentation


– Environment
• Type of environment: factory, lab, clean-room, wash-down,
hazardous, etc.
• Air quality in the vicinity of the inspection
– Oil, smoke, debris
• Dirt, oil, lubricant, water, other contaminants on parts
• Things damaging to cameras: weld spatter, reflected laser
light, radiation, etc.
• Ambient light
• Temperature and humidity
• Shock or vibration
Application Analysis and Specification

• Business Issues
– Scope of supply/deliverables; who is responsible for
what
• Engineering: design, integration, shipping, installation
• Hardware components
• Warranties
• Documentation and training
– Contractual items
• Performance guarantees
• Terms
• IP ownership
Application Analysis and Specification

• Once the constraints of the application are fully identified


the system performance can be quantified.
• The performance criteria of the system should include
– Actual inspection capability (measurement tolerance, feature
detection, etc.) with respect to the target application
– Throughput and speed of inspection
– Anticipated lighting and imaging methodology
– General overview of the operation of the inspection system
– Description of the automation and appropriate performance
related to a specific process, if applicable
Application Analysis and Specification

• Exceptions and limitations


– The project specification must identify all non-obvious
exceptions and limitations to the performance of the
system
– Include all possible unknowns
Application Analysis and Specification

• Acceptance Criteria
– How to prove the machine is functioning properly
– How to resolve differences in opinion regarding
machine function
– Clearly state acceptance criteria AND methodology in
quantifiable terms
– Acceptance will be based on stated performance
criteria
Application Analysis and Specification

• Acceptance Criteria
– Analysis of system performance must be done using a verifiable
sample or challenge set of parts
• Verifiable: All parties agree that each specific challenge part meets the
stated criteria, either reject criteria or feature size if a gauging application
– Static testing is done with challenge parts
– A gauge R&R is appropriate for gauging applications
– Production testing can be done with parallel visual inspection
• Rejected parts will be judged against the set of challenge parts
– The acceptance criteria will list false accept and false reject rates
Contact Information

David Dechow
President
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
USA
Telephone: 517-272-7820, x11
email: ddechow@aptura.com
www.aptura.com
Successfully Integrating
Vision Guided Robotics

Presented by:

David Dechow
Aptúra Machine Vision Solutions
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Dechow
President
Aptúra Machine Vision Solutions

David Dechow
Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A
Lansing, Michigan 48911
Phone: 517-272-7820, x11
Email: ddechow@aptura.com

David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr.
Dechow has worked in the field of machine vision for over 25 years as a programmer,
engineer, and manager. He served 14 years on the AIA board of directors, and was a
two term president of that board. Mr. Dechow is the 2007 recipient of the AIA
Automated Imaging Achievement Award honoring industry leaders for outstanding
contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at
conferences and seminars, and a frequent contributor to industry trade journals and
magazines and has served on the editorial boards of Vision Systems Design magazine
and Quality Magazine’s Vision and Sensors.
Presentation not available at time of production.
International Conference for
Vision Guided Robotics

A Special thanks to our Moderator:

Frank Maslar
Ford Motor Company
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Frank Maslar
Technical Specialist
Ford Motor Company

Frank Maslar
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
Phone: 313-805-3904
Email: fmaslar@ford.com

Key Responsibilities:
Work with universities and key suppliers to develop and implement advanced
manufacturing technology in the manufacturing of powertrain systems. Areas of focus
include vision systems and traceability.

Previous Positions Held:


Advanced Manufacturing and Quality Engineer at Ford Electronics
Research Scientist at Siemens Corporate Research

Degrees:
B.S.M.E. Penn State
Technology Advances in
2D Vision Guided Robotics

Presented by:

John Keating
Cognex Corporation
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

John Keating
Principal Product Marketing Manager
Cognex Corporation

John Keating
Cognex Corporation
1 Vision Drive
Natick, Massachusetts 01760
Phone: 508-650-3000
Fax: 508-650-3338
Email: john.keating@cognex.com

John Keating is a Principal Product Marketing Manager for In-Sight® vision systems at
Cognex Corporation. He holds a B.S. in Electrical Engineering from Boston University
and an MBA from Babson College. Since joining Cognex in 1994, he has held roles in
applications engineering management, as well as a variety of positions in industry and
product marketing.
Technology Advances in 2D Vision
Guided Robotics
John Keating
Principal Product Marketing Manager
Cognex Corporation
Types of Robotic-Vision Applications

• Vision Guided Robotics (VGR)


– Alignment & placement of parts
– Provides X, Y, Θ to robot

• VGR Plus Inspection


– Inspect parts while providing positional data
– Assembly verification, product identification, defect detection

• Robotic-assisted Inspection
– Robot presents part to vision system for inspection
– End-of-arm vision system maneuvered around part
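Handing X, Y, Θ to a robot implies a calibrated transform between the camera frame and the robot frame. A minimal 2D rigid-transform sketch (the rotation angle and offsets below are hypothetical calibration results, not from any specific system):

```python
import math

def camera_to_robot(x_cam, y_cam, theta_deg, rot_deg, tx, ty):
    """Map a part pose found by vision into robot coordinates using a
    2D rigid transform (rotation rot_deg, translation tx, ty) of the
    kind produced by a camera-to-robot calibration."""
    a = math.radians(rot_deg)
    x_rob = x_cam * math.cos(a) - y_cam * math.sin(a) + tx
    y_rob = x_cam * math.sin(a) + y_cam * math.cos(a) + ty
    return x_rob, y_rob, theta_deg + rot_deg

# Hypothetical calibration: camera frame rotated 90 degrees from the
# robot frame and offset by (500, 200) mm.
pose = camera_to_robot(10.0, 0.0, 45.0, 90.0, 500.0, 200.0)
print(tuple(round(v, 6) for v in pose))  # (500.0, 210.0, 135.0)
```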
Types of Robotic-Vision Applications

• End-of-Arm Mounted Vision


– Vision moves with robot
– Application Concerns:
• Cabling
• Weight
• Perspective distortion

• Fixed Mount Vision


– Vision separated from robot
– Application Concerns:
• Occlusion
• Multiple planes of inspection
Vision Guided Robotic Applications

• Palletizing/Depalletizing
– Place/Remove parts on pallets

• Conveyer Tracking
– Locate unfixtured parts on conveyer
and place them in package

• Component Assembly
– Locate unfixtured parts and assemble
to other components

• Machine Tending
– Locate unfixtured parts on conveyer
and place into CNC work cells

• Robotic Inspection
– Use robot to manipulate part or
camera to inspect critical features of
part
Types of Industries Using VGR

• Automotive & Aerospace

• Consumer Electronics

• Semiconductor & Solar Cell

• Consumer Packaged Goods

• Food & Beverage

• Pharmaceutical

• Medical Devices
Applications Examples for Vision Guided Robotics

• Wide variety of industries

• Wide range of robot manufacturers


– ABB, Adept, Denso, Epson, Fanuc,
Kawasaki, Kuka, Mitsubishi, Motoman,
Staubli, Yamaha, and others

• And they cover…


– Description of application and
challenges
– Enabling technology to overcome
challenges
– Benefit to the customer
De-Palletizing Application
in the Food Industry

• Application:
– Stacked pallets of juice boxes need to be de-
palletized for distribution

• Challenges:
– Various juice box sizes and configurations
– Parts move slightly when on pallet
– High speed of production line → must minimize
robot movement

• Solution:
– High resolution, robot-mounted vision system
with 6 mm lens and large (6 feet) field of view
– Non-linear calibration algorithm ensures
accurate placement
– 30% reduction in robot cycle time
Enabling Technology: Non-linear Calibration

[Images: Distorted (caused by lens) vs. Undistorted (after non-linear calibration)]

• Correct lens distortion


– Removing distortion from short focal length lens
• Correct perspective distortion in 2 planes
– Calibrate conveyor and pallet planes separately
→ Ensures positional accuracy of robot
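One common non-linear model behind this kind of calibration is radial lens distortion. A minimal sketch of undistorting a point with a single coefficient (the coefficient value and points are hypothetical; production calibrations fit several radial and tangential terms plus perspective):

```python
def undistort_point(x, y, k1, cx=0.0, cy=0.0):
    """First-order radial distortion correction about center (cx, cy):
    each point is scaled outward/inward by a factor (1 + k1 * r^2),
    where r is its distance from the distortion center."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2
    return cx + dx * scale, cy + dy * scale

# A point far from the image center moves much more than one near it,
# which is why short-focal-length lenses distort the pallet edges most.
x1, _ = undistort_point(100.0, 0.0, k1=1e-5)
x2, _ = undistort_point(10.0, 0.0, k1=1e-5)
print(round(x1, 6), round(x2, 6))  # 110.0 10.01
```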
Component Assembly Application
in the Automotive Industry

• Application:
– Locate holes and assemble rivnuts
into automotive frame

• Challenges:
– Part is not precision-fixtured
– Need .003” accuracy to ensure
proper assembly
– 27 different hole locations
• 5 different planes of view
• Varying surface finishes

[Image labels: Rivnut, Hole]

• Solution:
– Wrist-mounted, high resolution
vision system

Assembly Cross-Section
Enabling Technology: High Accuracy Gauging

• High Resolution Vision
Systems
– Provides greater
detail in image
• Non-Linear Calibration
– Removes lens and
perspective distortion
→ Highly Accurate
Measurements

[Images: Standard Resolution 640×480 vs. High Resolution 1024×768 and 1600×1200]
Conveyer Tracking Application
in the Food Industry

• Application:
– Bagged food packet
– Pick-and-Place from conveyor into
shipment boxes
– Packets vary in size and can
sometimes overlap – need flexible
solution to provide exact location

• Challenges:
– Patterned background
– Non-uniform lighting
– Overlapping parts
– Specular reflection from bags

• Solution:
– Fixed Mount Vision System with
geometric pattern finding
Enabling Technology:
Advanced Pattern Finding Algorithm

[Image grid: Trained Part, and matches found under Occlusion, Out of Focus, Confusing Background, 180° Rotation, Reversed Polarity, and Scale Change & Dim Lighting]

Accurate Part Location Under Challenging Conditions


De-Palletizing Application
in the Automotive Industry
• Application:
– Stacked pallets of automotive wheels are
placed at machining center
– Experiencing problems with incorrectly loaded
parts

• Challenges:
– Large variety of wheel types
– Part finish varies due to part processing
– Part cannot be shrouded → resulting in
variable lighting
– Part type changes
– Part is loosely placed in bin

• Solution:
– Fixed-mount vision system communicating to
6-axis robot
– 4 Month Project Payback
Robotic Inspection Application
in the Durable Goods Industry

• Application:
– Inspect washing machine
– Inspect controls, LEDs, and labels for
correct placement and surface finish

• Challenges:
– Large variety of panel colors and
configurations
– Large area to inspect for small defects

• Solution:
– Six-axis robot presents washer panels to
vision system for inspection
Conveyer Tracking Application
in the Pharmaceutical Industry

• Application:
– Pharmaceutical product tubes
need to be located on conveyer
and placed into package for
distribution

• Challenges:
– Tubes are loosely placed on
conveyer
– Range of product sizes

• Solution:
– Fixed-mount vision system
– Pick-and-Place Robot
Enabling Technology: Robot Communications

• Communication Flexibility
– Serial and Ethernet based
– Formatted strings, specific drivers,
and native mode commands

• Point-and-Click Configuration
– Build up formatted communication
strings quickly

• Code Samples
– Robot and vision system sample
code available
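A formatted-string result message of the kind described is typically just a delimited pose plus status. A minimal sketch of building and parsing one (the field order and delimiter here are hypothetical, not any specific robot vendor's protocol):

```python
def format_result(x, y, theta, pass_fail):
    """Build a comma-delimited result string of the kind sent to a
    robot controller over a serial or Ethernet link."""
    return "{:.3f},{:.3f},{:.3f},{}".format(x, y, theta, int(pass_fail))

def parse_result(message):
    """Parse the same string back on the receiving side."""
    x, y, theta, ok = message.split(",")
    return float(x), float(y), float(theta), bool(int(ok))

msg = format_result(12.5, -3.75, 91.2, True)
print(msg)                # 12.500,-3.750,91.200,1
print(parse_result(msg))  # (12.5, -3.75, 91.2, True)
```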
Robotic Inspection Application
in the Electronics Industry

• Application:
– Consumer electronics stereo
components are assembled in a flexible
automation cell

• Challenges:
– Verify that the correct components are
being assembled
– Match model number with database
description in computer system

• Solution:
– Robot presents part to fixed-mount
vision system
– OCV algorithm “reads” part number
– OPC communications to SCADA
system
Enabling Technology: OPC & ActiveX

• ActiveX Display Control embeds


images and graphics into third-
party HMIs

• Built-in OPC Servers make OPC


communications point-and-click
simple

• Software Development Kits allow


integrators to develop custom
operator interfaces
Machine Tending Application
in the Automotive Industry

• Application Description
– Robot locates parts on conveyer and
places them into machine press

• Challenges:
– Safety concerns prohibit manual
intervention
– Parts are in random orientation on
conveyer

• Solution:
– Fixed mount vision system suited to a
rugged, industrial environment
Enabling Technology: Rugged Hardware

• IP67 rated protection


– Eliminates need for enclosures and reduces
weight for arm-mounted applications

• High flex cables


– Lengthens cable life
– Minimizes downtime required to replace
cables

• Lightweight camera choices


– Minimize weight at the end of the robot

• Integrated Power and Communications


– Power over Ethernet technology reduces
cabling to a single cable
Component Assembly Application
in the Electronics Industry

• Application:
– Inserts need to be loaded
into an injection mold
housing

• Challenges:
– Need flexible solution to
accommodate a wide range
of parts
– Heavy Industrial
Environment

• Solution:
– Industrially-hardened fixed
mount vision system
Robotic Inspection Application
in the Automotive Industry

• Application:
– Need multiple inspections on variety
of parts

• Challenges:
– Need to achieve 70 inspections in
under 1 minute
– Small lot-size production and need
to minimize machine changeover

• Solution
– Robot-mounted vision system
– High speed & high accuracy
system for multi-point inspection
Enabling Technology: DSP Performance
Advancements
• Higher Performance
– Embedded image processing
– Ability to run more powerful
algorithms

• Smaller Footprint
– Reduction in component size due
to 90nm process design
– Smaller form factors and lighter
weights to fit into tight-spaced
production lines

• Lower Cost
– Driving overall vision system prices
down with greater performance

[Image: In-Sight 5600 – “World’s fastest vision sensor,” using a 1 GHz TI DSP]
[Chart: DSP Price Per Unit, declining from about 4600 to 3000 over 2002–2011]
Advantages of Vision Guided Robotics

• Elimination of Costly Fixtures


– Reduced capital investment costs

• Improved Robotic Accuracy


– Lower downtime and improve process efficiency

• Processing Multiple Part Types on the Same Production Line


– Improved equipment flexibility & reduced machine changeover time

• Improved Quality Resulting from Vision Inspection


– Command price premiums for a higher quality product
Key Vision Technologies

• Robust Location Algorithms
– Part finding under challenging
conditions
– Eliminate perspective distortion
– Translation to robot coordinate
system
[Callout: Tell your robot where to go]

• Communications
– Availability of Robot Protocols
– Making the application code work
together

• Hardware Performance
– Processing Throughput
– Size & Weight
– Environmental protection
[Callout: Simple communications to robots and lower end-of-arm tooling weight]
Add Vision to Robotic Applications to Improve…

• Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment

• Operational Equipment Effectiveness (OEE)


OEE = Availability x Performance x Quality

• Product Quality
Cost of Quality = Internal Failure Cost + External Failure Cost +
Inspection Cost + Prevention Cost
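The three formulas above are simple enough to compute directly. A quick sketch with hypothetical numbers:

```python
def roi(gain, cost):
    # ROI = (Gain from Investment - Cost of Investment) / Cost of Investment
    return (gain - cost) / cost

def oee(availability, performance, quality):
    # OEE = Availability x Performance x Quality (each a fraction 0..1)
    return availability * performance * quality

def cost_of_quality(internal, external, inspection, prevention):
    # Cost of Quality = Internal + External failure + Inspection + Prevention
    return internal + external + inspection + prevention

# Hypothetical vision retrofit: $150k gain on a $60k investment.
print(roi(150_000, 60_000))   # 1.5, i.e. 150% return
print(oee(0.90, 0.95, 0.99))  # ~0.846
print(cost_of_quality(40_000, 25_000, 10_000, 5_000))  # 80000
```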
Improve Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment
• Elimination of costly mechanical fixtures

Reduced Capital Investment Costs


Improve Operational Equipment Effectiveness

OEE = Availability x Performance x Quality


• Improve robotic placement accuracy
→ Increase production speeds and reduce downtime

• Process multiple part types on the same production line


→ Make equipment more flexible
→ Reduce machine change-over time

Improved Equipment Efficiency


Improve Product Quality

Cost of Quality = Internal Failure Cost + External Failure Cost +


Inspection Cost + Prevention Cost

Cost of Quality
– Cost of Poor Quality: Internal Failure Costs + External Failure Costs
– Cost of Good Quality: Appraisal Costs + Prevention Costs
Improve Product Quality

Cost of Quality = Internal Failure Cost + External Failure Cost +


Inspection Cost + Prevention Cost

• Direct Costs – Defects identified in manufacturing


→ Reduce scrap & rework costs
→ Reduce labor costs from manual inspection
→ Improve reason-for-reject data to drive process improvement

• Indirect Costs – Defects that escape the plant


→ Brand reputation, customer complaints, warranty claims, product recalls

Improved Product Quality


Contact Information

John Keating
Principal Product Marketing Manager
Cognex Corporation
1 Vision Drive
Natick, MA 01760
USA
Telephone: 1-508-650-3000
email: john.keating@cognex.com
www.cognex.com
Top Lessons Learned in
Vision Guidance Applications

Presented by:

Eric Hershberger
and
David Wyatt

Applied Manufacturing Technologies


International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies

Eric Hershberger
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2000
Fax: 248-409-2027
Email: ehershberger@appliedmfg.com

Eric has a degree in Computer Science from Michigan Tech. He enjoys working with
vision systems and loves robot calibration and performance testing.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Wyatt
Staff Engineer
Applied Manufacturing Technologies

David Wyatt
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2073
Fax: 248-409-2027
Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter
Member of the Machine Vision Association of SME, the founder of Midwest Integration
and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision
guidance at Delco Electronics. At Delco Electronics, David was involved in the writing
of many of the core vision algorithms in use today and served as the Machine
Intelligence Chairman for General Motors. At Midwest Integration, David performed
hundreds of automation projects earning the Vendor of the Year Award from Day and
Zimmerman Inc. as well as the US Small Business Administration’s Award for
Excellence.
Top Lessons Learned
in Vision Guidance Applications
David R. Wyatt
Staff Engineer
Applied Manufacturing Technologies
Fixture Repeatability
• Yes, vision relaxes fixture
requirements

• No, vision will not eliminate the need


for fixtures

• The capability of the vision system


must exceed the repeatability of the
fixture.

• Determine fixture repeatability before


doing anything else. (6 DOF)
Vision Repeatability
• If the part cannot be located with repeatability,
the application will never work.

• A representative supply of parts must be


maintained for testing and re-testing purposes
– A minimum bank of 30 parts per part type should
be identified and stored
– Parts bank must have every visible difference
expected (such as color or pattern differences)
– Do not use scrap parts unless you are sure the
differences will not matter
Lighting

• Lighting will make or break the


application
– Weeks of effort now avoid years of pain
later.
– Use long life LEDs and matching filters.
• Send parts to lighting companies or
distributors for free evaluations
– Get a gray scale distribution of foreground
and background and actual pictures as .bmp
files
– Must have more than 25 levels of gray
between foreground and background.
• Recommend 50 or 100 gray levels out of 255.
• Block all direct exposure to sunlight
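The 25-gray-level rule above can be checked directly against the gray-scale distributions a lighting vendor sends back. A minimal sketch (the sample values are hypothetical):

```python
def contrast_ok(foreground_levels, background_levels, min_separation=25):
    """Check that foreground and background gray-level samples are
    separated by at least `min_separation` levels out of 255
    (50-100 levels recommended)."""
    # Worst-case separation: the nearest edges of the two distributions.
    if min(foreground_levels) > max(background_levels):
        separation = min(foreground_levels) - max(background_levels)
    else:
        separation = min(background_levels) - max(foreground_levels)
    return separation, separation >= min_separation

# Hypothetical gray-scale samples read from vendor evaluation .bmp files:
feature = [180, 195, 170, 188]
background = [60, 75, 82, 55]
print(contrast_ok(feature, background))  # (88, True)
```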
Test Plan
• Test plan defines when the job is
done.
– The test plan is the specification
applied.

• The parts bank is used to execute


the test plan.
– New parts get added to the bank.

• The test plan is run after each


change to the software or tooling.
Vision Tools

• Invest in great vision tools


• Always buy as much resolution and as big a tool
box as your budget can afford
• Cognex and DVT are different platforms
– You get what you pay for
– Test all platforms, never assume
• Use grayscale tools
– Auto-thresholding is self-fulfilling
– Use Geometric Pattern Recognition (GPR)
– Recognize that GPR oftentimes has no reference
point
Internal Support

• Vision works better


when the plant wants
it to work.
• Find a local
champion.
– Teach him/her the
system and test plan.
• Get a support
agreement.
• Have an Exit plan.
Operators

• Involve Operators early


– They know what is going on
at the line level

– They know what part


problems exist

– They know what didn’t work


before and why

– They can help avoid mistakes


and delays
2D versus 3D

• A camera is a 2D sensor
• We can get 3D info from cameras
• Make sure we don’t make 3D decisions on
2D data
– Avoid using shadows to gauge height
– Avoid using reduced feature sizes as an
indication of distance
– Perspective can be valid in certain applications
Global vs. Local Calibration

• It is possible to improve
robot accuracy in a
smaller work envelope
– Accuracy will decrease
outside of that smaller
envelope
• It is always best to
calibrate the vision
system as close to the
work area as possible
Contact Information

David Wyatt
Staff Engineer
Applied Manufacturing
Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
USA
Telephone: 248-409-2000
email: dwyatt@appliedmfg.com
www.appliedmfg.com
How Advancements in Vision Guidance are
Making Flexible Feeding Applications Desirable

Presented by:

Eric Lewis
Flexomation
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Lewis
President
Flexomation, LLC

Eric Lewis
Flexomation, LLC
586 Northland Boulevard
Forest Park, Ohio 45240
Phone: 513-825-0555
Fax: 513-825-1870
Email: eric.lewis@flexomation.com

Bio not available at time of print.


Presentation not available at time of production.
Vision Guided Robot Applications for
Packaging & Flexible Feeding

Presented by:

Mark Noschang
Adept Technology
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Mark Noschang
Manager of Applications Engineering for North America
Adept Technology, Inc.

Mark Noschang
Adept Technology, Inc.
11133 Kenwood Road
Cincinnati, Ohio 45242
Phone: 513-792-0266, x106
Fax: 513-792-0274
Email: mark.noschang@adept.com

Mark Noschang was appointed Manager of Applications Engineering for North America
in July of 2008. He joined Adept in October 1997 as an applications engineer in the
company’s Cincinnati, Ohio office. During his tenure, he served as a senior applications
engineer and filled roles in the training department. Mr. Noschang holds a
Bachelor’s Degree in Electrical Engineering from the University of Cincinnati.
Vision Guided Robot Applications for
Packaging and Flexible Feeding
Mark Noschang
Manager of Applications Engineering
Adept Technology, Inc.
Agenda
• Introduction
• Motivation
• System Components
• Pitfalls for system implementation
Packaging and Flexible Feeding?

Flex Feeding???
Packaging vs. Feeding
• Packaging:
– A method by which products are enclosed to
provide containment and protection, allowing
for easier shipment, distribution, and sale
• Flexible Feeding:
– A method by which parts are taken from bulk
storage to a known orientation, typically for
assembly operations
What is Flexible Material Handling?
• “A method of taking parts from bulk to a
known orientation that can handle multiple
part sizes and styles.”

• Abilities:
– Handle a wide variety of part types
– Perform frequent model changeovers quickly / easily
– Process multiple parts and models simultaneously
– Respond quickly to part design changes
Market Business Drivers
1. Higher Throughput per Factory Space
– Factory space is at a premium
2. Increased Labor & Liability Costs
3. Localized Production
– Need to produce near customers
– Regulatory constraints
4. Increased Product Breadth
– Shorter time to volume
– Mass customization of products
5. Clean Product Handling
– Need for sterile packaging
Reasons for Flexible Handling
• Short product life
• Frequent part changeover
• Reduce engineering / startup time
• Allow software-only part changeovers
• Hard tooled solutions are often expensive
• Provide consistent cost for global production
Flexibility is Key to Success
• Dedicated and specialized equipment is limiting
– Products and packaging changes
– Consumers / retailers demand customized products
• Robots can easily be adapted to new products
– Minimal changes to tooling / software
– Robust and tightly integrated vision is critical
• Flexible automation provides an agile solution
– Respond quickly to changes in the market
– Minimize inventory levels
Cost Per Placement
[Chart: cost per placement for manual labor vs. robots, 1990–2006, ranging from $0 to $0.0140]
Source: US Department of Labor, International Federation of Robotics


Underlying Technology
• Method of presenting parts
• Method of locating parts
• Method of manipulating parts
→ Flexible Material Handling


System Layout
• Methods of presentation
– Continuous conveyor
– Indexing conveyor
– Part feeders
• Vision systems are used to locate product
• The robot picks and places the product
• The system software ties it all together
Enabling Technologies

1. Multiple Feeding Methods

2. Conveyor Tracking Support

3. Robust Vision Integration

4. High Speed Robotics

5. Integrated Compact Controls

6. Tight Connectivity to System Components


Expectation vs. Reality
“Flexible parts feeding is a lot like
education spending. Most folks want only
the best schools and the best teachers for
their children. But, when it comes time to
pay the piper, the enthusiasm quickly
evaporates.”
- John Sprovieri, Senior Editor, Assembly Magazine
Questions For Planning a System
• What is the overall cycle time required?
• Does the customer require average or absolute
cycle time?
• How is the machine flow controlled?
• How many stable states does the part have and
can vision differentiate them?
• Is the pick state the most stable state?
• Can the customer provide parts for testing?
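The distinction between average and absolute cycle time can be made concrete with a small simulation (all timing numbers below are hypothetical, not from the presentation): occasional empty images or re-feeds barely move the average, but they dominate the worst case a customer may be sizing the line against.

```python
import random

def cycle_times(n, base_s=1.2, refeed_s=2.5, p_refeed=0.08, seed=7):
    """Simulated pick cycles: most take the base time, but occasionally no
    part is found in a pickable state and a re-feed penalty is paid."""
    rng = random.Random(seed)
    return [base_s + (refeed_s if rng.random() < p_refeed else 0.0)
            for _ in range(n)]

t = cycle_times(10000)
avg, worst = sum(t) / len(t), max(t)
print(round(avg, 2), worst)  # average ~1.4 s; the absolute (worst) cycle is base + re-feed = 3.7 s
```

A system quoted on average cycle time can meet spec while still occasionally stalling a downstream machine, which is why the question matters at planning time.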
Key to Success in Part Presentation
• Goals for feeding systems
– Provide part singulation / separation
– Provide contrast between parts and
background
– Provide parts in stable state
– Present parts to system in desired stable state
Key to Success in Part Presentation
• Selecting the proper type of feed system
– Flexible feeders
• Belt type is best for soft parts
• Platen type is best for rigid parts
• Grass style is best for fragile or rolling parts
Key to Success in Part Presentation
• Selecting the proper type of feed system
– Conveyor systems
• Variable belt materials / colors
• Cleated / textured belts
• Backlit to provide greater contrast
• Variable speeds to provide part separation
Keys to Success for Vision Systems
• Ability to see what you need to see
– Must have proper lighting to highlight the parts
– Must have sufficient resolution to identify key features
– Must have proper lens / mounting
– Must have comprehensive set of vision tools

• MUST CONTROL THE ENVIRONMENT!


Keys to Success for Vision Systems
• Powerful / complete vision toolset
– Retrain models with a single click
– Allow for measurements and inspections
– Make any vision tool relative to any other tool
– Allow for complex “filtering” of good / bad
parts based on any vision result
Keys to Success for Vision Systems

• Automated calibration process


• Support for multiple camera mounting
configurations
Keys to Success in Robotic Systems
• Pick the correct robot for the job
Key to Success in Robot Systems
1. Table-Top SCARA Robots
High Speed robots for Assembly, Handling & Packaging

2. Cartesian & Linear Robots


Configurable, Precision robots for Assembly & Handling

3. 6-Axis Articulated Robots


High Dexterity Robots for 3-D Assembly & Handling

4. Floor Mount SCARA Robots


High Payload robots for Industrial Applications

5. Parallel Kinematics Robots


Ultra High Speed robots for Packaging
Key to Success in Robot Systems
• Selecting the right robot for the job
• Feasibility study for specific application
• Rigid mounting for all system hardware
• Tight integration of all components
• Attention to system layout
• Clear definition of cycle
• Attention to “details”
Benefits for Packaging and Feeding
• In-feed parts do not require fixturing
• Part changeover can be a simple software change
• Different parts can be handled with the same feed system
• Robots and feeders can be redeployed
• Vision provides flexibility

• Results:
– Saves engineering time
– Saves start-up time
– Allows equipment to be redeployed
– Saves money
Trends in Packaging and Feeding
• Food, consumer & household products and personal care products are becoming more specialized
• Companies desire to keep manufacturing close to consumers
• The use of contract packagers is increasing
• Labor & worker liability costs are increasing
• Handling randomly oriented product from conveyors is required for many companies
• Flexible automation enables companies to compete and flourish in a global economy
Contact Information

Mark Noschang
Manager of Applications Engineering
Adept Technology, Inc.
11133 Kenwood Road
Cincinnati, Ohio 45242
USA
Telephone: 513-792-0266, x106
email: mark.noschang@adept.com
www.adept.com
High Accuracy Robot Calibration, Wireless
Networking, and Related Technical Issues

Presented by:

Eric Hershberger
and
David Wyatt

Applied Manufacturing Technologies


International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Wyatt
Staff Engineer
Applied Manufacturing Technologies

David Wyatt
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2073
Fax: 248-409-2027
Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter
Member of the Machine Vision Association of SME, the founder of Midwest Integration
and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision
guidance at Delco Electronics. At Delco Electronics, David was involved in the writing
of many of the core vision algorithms in use today and served as the Machine
Intelligence Chairman for General Motors. At Midwest Integration, David performed
hundreds of automation projects earning the Vendor of the Year Award from Day and
Zimmerman Inc. as well as the US Small Business Administration’s Award for
Excellence.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies

Eric Hershberger
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
Phone: 248-409-2000
Fax: 248-409-2027
Email: ehershberger@appliedmfg.com

Eric has a degree in Computer Science from Michigan Tech. He enjoys working with
vision systems and loves robot calibration and performance testing.
Wireless Vision Systems and
High Accuracy Vision Guidance

Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
Wireless Vision – A Reality
• New Gigabit Ethernet (GigE Vision™) camera
standard
• New IEEE 802.11n draft protocol
• Routers available on the market
• Fewer wires, less expensive high flex cables
• Easy to integrate new cameras with older vision
systems
• Problems and issues
• Other options
GigE Vision™ Standard
• New GigE Vision™ camera standard
– 1000 megabits per second (Mbps)
• Lots of camera manufacturers
– DALSA, Basler, Prosilica
• Fast image transfer
• 100m cable runs
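A quick feasibility check for a GigE camera is to compare its data rate against the 1000 Mbps link. The sketch below is a rough back-of-the-envelope model (the resolution and the 10% protocol-overhead allowance are assumptions, not figures from the presentation):

```python
def max_fps(link_mbps, width, height, bytes_per_pixel, overhead=0.10):
    """Frames per second that fit on the link, reserving a fraction
    of the bandwidth for protocol overhead."""
    usable_bits = link_mbps * 1e6 * (1.0 - overhead)
    bits_per_frame = width * height * bytes_per_pixel * 8
    return usable_bits / bits_per_frame

# Hypothetical 1600x1200 8-bit mono camera on a 1000 Mbps GigE link
fps = max_fps(1000, 1600, 1200, 1)
print(round(fps, 1))  # ~58.6 frames per second
```

The same arithmetic, run against a wireless link's real sustained throughput rather than its advertised rate, shows why the draft-n numbers later in this talk matter.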
IEEE 802.11n draft protocol
• Not approved until ~December 2009
– Hardware already for sale as Draft-n compliant
– Backwards compatible, but not recommended
• MIMO – Multiple-Input Multiple-Output
– More antennas
– Theoretical 600 Mbps
• 5.0 GHz recommended
– Older protocols use the 2.4 GHz band
Routers Recommended
• Linksys WRT600N
• Best implementation of the 5.0 GHz draft-n
• 12 V power
• Easy to mount to the EOAT
• Built-in GigE switch
• Best to use
Fewer Wires, Less Expensive High-Flex Cables

• For a full wireless solution, only a 12 V power cable is required at the EOAT
• High-flex Gigabit Ethernet cables are less expensive than current analog / Camera Link® and FireWire cables
Easy to Integrate

• Visual Studio 2005 recommended
• All manufacturers have SDKs
– Lots of examples and good tech support
– Basic routines for image transfer are easy to implement
• Easily interfaced with older Vision Pro systems
– Specifically v3.4
• VB scripting can work
Problems and Issues

• Make sure plant IT is on board
• Draft-n is not an approved standard
– Future hardware incompatibility
– Mixed-vendor hardware might not implement the draft the same way
• The 5.0 GHz band is clear now, but in the future…?
• Custom hardware is available for analog systems, >$20k
Other Options

• Bluetooth 3.0
– Up to 480 Mbps
– Ultra-wideband (UWB)
• “Nanny cams,” or wireless-Ethernet-based cameras
– Typically a CMOS imager
– Lower resolution
– 2.4 GHz band
High Accuracy Vision Guidance

• What can we expect the robot accuracy to be?
• Robot calibration
• High accuracy tool center points (TCP)
• Calibration between robot and vision
• Calibration must occur near the work… global vs. local calibrations
Robot Accuracy

• Off-the-shelf robots are repeatable but not accurate
• Typical off-the-shelf robot accuracy will be 3–5 mm depending upon the robot arm
• The robot will be very repeatable: 0.02–0.04 mm
• Calibrated robots can have an accuracy of 0.3–0.5 mm
• Use of external equipment can increase the accuracy to less than 0.1 mm
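The repeatable-but-not-accurate distinction can be illustrated with a one-dimensional simulation (the bias and noise values mirror the typical figures above, but the model itself is a simplification): repeated moves cluster tightly around a point (repeatability) that is offset from the commanded target (accuracy).

```python
import random

def simulate_moves(n, bias_mm, noise_mm, seed=42):
    """Repeated moves to the same commanded point: a fixed kinematic bias
    (accuracy error) plus small random scatter (repeatability)."""
    rng = random.Random(seed)
    return [bias_mm + rng.gauss(0.0, noise_mm) for _ in range(n)]

hits = simulate_moves(1000, bias_mm=4.0, noise_mm=0.03)  # typical uncalibrated arm
mean_err = sum(hits) / len(hits)                          # accuracy error ~4 mm
spread = (sum((h - mean_err) ** 2 for h in hits) / len(hits)) ** 0.5  # ~0.03 mm
print(round(mean_err, 2), round(spread, 3))
```

Calibration attacks the bias term; the scatter term is a property of the arm and changes little.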
Robot Calibration

• Metris K600 robot calibration
• Dynalog Dynacal system
• Absolute Accuracy systems from Kuka and ABB – calibrated from the factory
• Laser trackers can be used
• Factory calibration becoming more prevalent
High Accuracy TCP’s

• Can use portable CMMs for best accuracy
• Metris and Dynalog can calculate a high-accuracy TCP
Robot to Vision Calibration

• 2D systems typically use a dot calibration grid – a manual process
• 3D systems: auto calibration is available
• Portable CMMs can be used for very high accuracy calibration
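For the 2D case, a dot-grid calibration ultimately reduces to fitting a map from pixel coordinates to robot coordinates using the touched-up dot positions. A minimal least-squares affine fit is sketched below (a simplified illustration: real systems also correct lens distortion, and the grid values here are hypothetical):

```python
def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 system (enough for an affine fit)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))  # partial pivoting
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine(pixels, robot):
    """Least-squares pixel->robot affine map from dot correspondences:
    x_r = a*u + b*v + c,  y_r = d*u + e*v + f  (needs >= 3 non-collinear dots)."""
    rows = [[u, v, 1.0] for u, v in pixels]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    coeffs = []
    for k in (0, 1):  # solve the normal equations for x, then for y
        Atb = [sum(r[i] * p[k] for r, p in zip(rows, robot)) for i in range(3)]
        coeffs.append(solve3(AtA, Atb))
    return coeffs

def pixel_to_robot(coeffs, u, v):
    (a, b, c), (d, e, f) = coeffs
    return (a * u + b * v + c, d * u + e * v + f)

# Hypothetical grid: 0.2 mm/px scale with a (100, 50) mm offset
px = [(0, 0), (500, 0), (0, 400), (500, 400)]
mm = [(100.0, 50.0), (200.0, 50.0), (100.0, 130.0), (200.0, 130.0)]
C = fit_affine(px, mm)
print(pixel_to_robot(C, 250, 200))  # -> approximately (150.0, 90.0)
```

Using more dots than the three-dot minimum averages out touch-up and localization error, which is one reason the manual grid procedure is worth the time.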
Global vs. Local Calibration

• It is possible to improve robot accuracy in a smaller work envelope
– Accuracy will be lost outside of that smaller work envelope
• Always best to calibrate the vision system as close to the work area as possible
High Accuracy Vision Guidance

• A combination of robot calibration, a high-accuracy TCP, and good vision optics can improve your VGR project
• Robot calibration alone can help increase the accuracy of offline programming downloads
Contact Information

Eric Hershberger
Senior Engineer
Applied Manufacturing Technologies
219 Kay Industrial Drive
Orion, Michigan 48359
USA
Telephone: 248-409-2000
email: ehershberger@appliedmfg.com

www.appliedmfg.com
Vision Based Line Tracking

Presented by:

Frank Maslar
Ford Motor Company
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Frank Maslar
Technical Specialist
Ford Motor Company

Frank Maslar
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
Phone: 313-805-3904
Email: fmaslar@ford.com

Key Responsibilities:
Work with universities and key suppliers to develop and implement advanced
manufacturing technology in the manufacturing of powertrain systems. Areas of focus
include vision systems and traceability.

Previous Positions Held:


Advanced Manufacturing and Quality Engineer at Ford Electronics
Research Scientist at Siemens Corporate Research

Degrees:
B.S.M.E. Penn State
Vision Based Line Tracking

Frank Maslar
Technical Specialist
Ford Motor Company
Background
• Interactive directed research project
between Ford Advanced Manufacturing
Technology Development and Purdue
University Robot and Vision Lab
• Team Leaders
– Ford – Frank Maslar
– Purdue University – Professor Avinash Kak
Current Robotic Applications
[Pie chart]
Assembly 3%
Arc Welding 13%
Painting 18%
Dispensing 3%
Inspection 1%
Material Handling 27%
Spot Welding 31%
Material Removing 4%
Source: Robotic Industries Association
Opportunities
1910’s

1940’s

1990’s
Moving Line Assembly

• Eliminate wasted time during transfer of parts into and out of assembly stations
• Eliminate cost associated with stop stations
Line Tracking Status Overview

• Enhanced Accuracy
–3 mm
• Multi-loop control
–Enhanced robustness
• Visual tracking systems
–Geometric model-based approach
–Appearance-model-based approach
Control Architecture

[Diagram: Coarse Control, Fine Control #1, and Fine Control #2 feed a Control Arbitrator, which commands the Robot Controller]
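One plausible reading of the arbitrator in the diagram is a priority scheme: use the finest controller that currently has a valid estimate, and fall back to coarser ones otherwise. The sketch below is an assumption about the policy, not the published architecture; the controller names and corrections are hypothetical.

```python
def arbitrate(estimates):
    """Pick the correction from the highest-priority controller that is
    currently tracking. `estimates` maps controller name -> (valid, correction).
    The priority order here is an assumption: fine #2, then fine #1, then coarse."""
    for name in ("fine2", "fine1", "coarse"):
        valid, correction = estimates[name]
        if valid:
            return name, correction
    return "hold", 0.0  # nothing tracking: hold the last commanded position

# Fine control #2 lost the target this cycle, so fine #1 wins
source, dx = arbitrate({
    "fine2": (False, 0.0),
    "fine1": (True, 1.8),    # mm correction from the fine visual tracker
    "coarse": (True, 6.5),   # mm correction from the coarse (stereo) stage
})
print(source, dx)  # fine1 1.8
```

Running several trackers and arbitrating among them is what gives the multi-loop scheme its robustness: losing one loop degrades, rather than breaks, the tracking.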
Coarse Control

[Figures: tracking without vs. with specularity detection; detection and compensation of specular highlights on the target object; stereo space parameters]
Appearance-based 3D Rigid Object Tracking
• General Idea
[Figure: an appearance model (template) derived from the 3D model is matched against the current image under perspective projection to estimate the 3D pose]

• Extension in the presence of occlusions
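The full system estimates a 3D pose from the appearance model; as a much-simplified 1D analogue of matching a template against the current image, normalized cross-correlation (NCC) scores candidate positions and keeps the best one. Everything below is an illustrative toy, not the presented algorithm.

```python
def ncc(patch, template):
    """Normalized cross-correlation of two equal-size grayscale patches."""
    n = len(patch)
    mp, mt = sum(patch) / n, sum(template) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    den = (sum((p - mp) ** 2 for p in patch)
           * sum((t - mt) ** 2 for t in template)) ** 0.5
    return num / den if den else 0.0

def best_match(row, template):
    """Slide the template along a scanline; return (best offset, score)."""
    scores = [(ncc(row[i:i + len(template)], template), i)
              for i in range(len(row) - len(template) + 1)]
    s, i = max(scores)
    return i, s

# Toy scanline with a bright-dark-bright pattern hidden at offset 3
row = [10, 12, 11, 50, 20, 55, 12, 10, 11]
template = [48, 18, 52]
offset, score = best_match(row, template)
print(offset, round(score, 3))  # 3, correlation close to 1
```

Because NCC normalizes out mean intensity and contrast, this style of matching tolerates lighting changes better than raw differencing, which is part of what makes appearance models usable on a moving line.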


Results
Research provided by:

Avinash C. Kak
Professor of Electrical and Computer Engineering
Electrical and Computer Engineering
Purdue University
EE Building
West Lafayette, Indiana 47907
765-494-3551
kak@purdue.edu
http://rvl4.ecn.purdue.edu/~kak/
Contact Information

Frank Maslar
Technical Specialist
Advanced Manufacturing
Technology Department
Ford Motor Company
36200 Plymouth Road
Livonia, Michigan 48150
USA
Telephone: 313-390-2132
email: fmaslar@ford.com
www.ford.com
Case Study:
Robots & Vision in Life Sciences
and Automated Pharmacy

Presented by:

David Arceneaux
Stäubli Robotics
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Arceneaux
Business Development & Marketing Manager
Stäubli Robotics

David Arceneaux
Stäubli Robotics
201 Parkway West
Duncan, South Carolina 29334
Phone: 864-486-5416
Fax: 864-486-5467
Email: d.arceneaux@staubli.com

Arceneaux is Business Development & Marketing Manager for Stäubli Robotics, a manufacturer of innovative 4- and 6-axis industrial and cleanroom robotic solutions. During the past 5 years, his key responsibilities have included business development, marketing, and strategic management.

Stäubli Robotics and its employees are corporate members of the Robotic Industries Association (RIA), the Association for Laboratory Automation (ALA), Semiconductor Equipment and Materials International (SEMI), the Society of the Plastics Industry (SPI), and the Society of Manufacturing Engineers (SME).
Robots & Vision in Life Sciences and
Automated Pharmacy

David Arceneaux
Business Development-Marketing Manager
Stäubli Robotics
Agenda

• Why Automate?
• Where is Automation being utilized?
• Why Use Vision in Robotic Automation?
• Selecting the Right Automated Solution
• Types of Robotic Automation
• Applications
• Case Study (RIVA)


Why Automate?
Industry Driving Forces
• High throughput (productivity and efficiency)
• Reproducibility, reliability, accuracy, quality, and traceability
• Automation of repetitive tasks
• No cross-contamination (human/product or product/human)
• Miniaturization
• Biocompatibility (cleanroom)
• Safe handling of sensitive and hazardous products

Photo Courtesy: High Resolution Engineering


Why Automate?
Top 5 Reasons to Automate

Reasons to Automate: 1=Low; 5=High

Increased productivity means that identification of new drugs and healthcare products can be done more quickly, thus reducing time to market.
Source: JALA (2006). 2006 ALA Survey on Industrial Laboratory Automation
Where is Automation being utilized?
The hottest area utilizing laboratory automation is high-throughput screening (HTS)
• Still many growth opportunities within most life science segments
Why Use Vision in Robotic Automation?

Key Benefits
• Elimination of Costly Fixtures
– Reduced capital investment costs
• Improved Robotic Accuracy
– Lower downtime and improved process efficiency
• Multiple Sample Processing on the Same Production Platform
– Improved equipment flexibility & reduced changeover times
• Improved Quality Resulting from Vision Inspection
– Command price premiums for a higher-quality product
Selecting the Right Automated Solution
Key Success Factors

• Reliability
– Automated systems need to work consistently for extended periods (24/7) with minimal human intervention.
• Scalability
– Consideration of current equipment and future growth should include automation that is modular and scalable.
• Flexibility
– Automation needs to be easily reconfigurable when the need arises.
• Data Storage and Scheduling
– With the increasing sophistication of automation equipment and the increasing volume of information created, software is an important link in the success of the system.
Types of Robotic Automation
Three of the most common geometries for laboratory robots are:

• Cartesian (three mutually perpendicular axes)
• Cylindrical (parallel-action arm pivoted about a central point)
• Anthropomorphic (multi-jointed, human-like configuration)

• Anthropomorphic (5–6 axis) robots generally provide more flexible, “human-like” automation that includes transfer, handling, weighing, extraction, and general manipulation of media.
Applications:
Compact Cell Automation
• Compact cells can include all the devices needed for plate storage, plate retrieval, plate replication, and high-throughput screening.
• Flexible robotic automation allows devices to be positioned closer to the robot for a compact solution with all of the capacity and functionality of the larger systems offered in the industry.
Applications:
Bench Top Automation

• Bench-top robotic systems are inexpensive, flexible robotic solutions capable of performing a wide array of applications.
• Cells can be configured with two or more devices utilizing very little floor space.
Applications:
Cell culture automation
• Cell culture is, and has historically been, an essential component of the drug discovery toolbox.
• Cell culture provides the proteins, membrane preparations, and other raw materials required for biological research.
• In recent years the demand for cells and new cell lines has increased dramatically with the emergence of high-throughput screening, reinforcing the need for robotic automation.
Applications:
Ultra High Throughput Automation
• Robots are capable of UHTS applications with a throughput of over 1 million assays per day.
• These systems are built for industrial use and are capable of running 24 hours a day, 7 days a week.
Case Study: Automated Pharmacy (RIVA)
ROBOTIC IV AUTOMATION
The new standard in IV admixture compounding.

Providing INTELLIGENT SOLUTIONS to deal with the issues of…

– Safety
– Efficiency and effectiveness
– The ever-changing standards of the regulatory environment
Background
St. Boniface Research Centre developed the technology

• The St. Boniface Hospital & Research Foundation was established in 1971 as part of the centennial celebrations of the Hospital.
• The Foundation acts as the primary fundraising body for the St. Boniface General Hospital Inc., and promotes excellence in health care research related to the prevention and treatment of disease, the promotion of good health, and improvements in patient care.
• RIVA is the first product of its kind.
The Product

RIVA
Robotic IV Automation
The Product Overview
RIVA is an integrated system designed to automate the process
of preparing IV admixtures in the hospital pharmacy.

• Designed with input from 300+ pharmacists
• Fully enclosed ISO Class 5 environment (cleaning procedure in the mini-environment; a UV sterilization process is used daily, as required by the FDA)
• Fully automated preparation of syringe and bag doses… in any combination… in many sizes
• Multiple independent verifications – visual and weight – for a complete electronic audit trail
• Up to 600 labeled, patient-specific or batch doses per 8-hour shift with one operator
• Designed to operate in the currently evolving regulatory environment
The Market

Target Market Outlook

– Market size: 1,750 hospitals in the main target profile
• 510 with 400+ beds; 1,240 with 200 to 400 beds
• (All use a manual process with high turnover of personnel)
– Based on feedback to date and IV volumes
• On average 2 RIVA units per hospital – 3,500 units
• (Two areas of need: antibiotics and chemotherapy, which must be kept separate)
Value Proposition to Hospitals
There are several categories of potential savings to the hospital; these vary depending on the institution and its goals:

– Resource savings (3 techs per shift – ROI in less than one year)
– Drug wastage savings (staggering losses – up to 40% of IV preparations are scrapped; 99% usage with RIVA)
– Reduction or elimination of pre-fills from third parties
– Cost avoidance of major construction for USP 797 compliance (the cost of a “cleanroom”)
– Inventory reduction (better control and loss prevention, plus limited resources)
Conclusion

• Just about any laboratory or hospital can take advantage of some of the advances in automation; the questions are what to automate and to what extent.

• The options cover the spectrum from islands of automation, which retain some manual processes, to fully automated integrated systems. The optimal degree of laboratory automation depends on the laboratory setting and considerations of cost, throughput, and flexibility.

• Other considerations include the time that will be required to complete the installation, the space available, the proportion of the tests that are routine, the availability of skilled technicians, safety, and reliability.

• The reasons for automation have become so compelling that it is no longer simply a competitive advantage for laboratories, but rather a competitive necessity!
Contact Information

David Arceneaux
Business Development-Marketing Manager
Staubli Robotics
201 Parkway West
Duncan, South Carolina 29334
USA
Telephone: 864-486-5416
email: d.arceneaux@staubli.com
www.staubli.com
Unmanned Systems Intelligence, Vision and
Automation Concepts for Combat Engineer and
Other Battlefield Missions

Presented by:

Jerry Lane
Applied Research Associates, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jerry Lane
Director Great Lakes Office
Applied Research Associates

Jerry Lane
Applied Research Associates
48320 Harbor Drive
Chesterfield Twp., Michigan 48047
Phone: 586-242-7778
Email: glane@ara.com

Jerry joined ARA in 2005 to work on multiple unmanned systems projects including the
Modular Robotic Control System for Route Clearance and Combat Engineer missions,
the man-packable Nighthawk mini-unmanned aerial vehicle and the obstacle/stair
climbing robot, and the Lightweight Reconnaissance Vehicle (LRV). Jerry’s prior
assignment for 28 ½ years was at TARDEC leading various advanced vehicle
technologies and robotic vehicle projects. Jerry is also on the Board of Directors of the
Michigan Chapter of NDIA and First Vice President of the Association for Unmanned
Vehicle Systems International (AUVSI). Jerry is the Co-Founder and Co-Chair of the
Intelligent Ground Vehicle Competition (IGVC).

Jerry holds a Bachelor of Mechanical Engineering and Masters of Business


Administration from the University of Detroit.
Unmanned Systems:
Intelligence, vision and automation concepts for
combat engineer and other battlefield missions
Jerry Lane
Director Great Lakes Office
Applied Research Associates
Unmanned Systems: Intelligence, vision and automation
concepts for combat engineer and other battlefield missions

• Combat engineer functions such as route clearance, Explosive Ordnance Disposal (EOD), countermine and other high-risk battlefield missions can be performed with intelligent and capable unmanned systems.
• Unmanned systems with advanced manipulators, machine vision, planners, and navigation can automate dangerous battlefield functions.
• Unmanned systems can save soldiers’ lives while increasing the efficiency of operations.
• The integration of advanced vision and manipulation will provide new levels of semi-autonomous performance on the battlefield by relieving the soldier from the dull and tedious moment-to-moment control of current tele-operated systems.
Robotic Route Clearance for Convoys
Robotic Route Clearance

Robotic Control System
+
Construction Equipment
– Armor option
– Supportable
– Low cost
– Optional equipment
– High power
– Hydraulics available
+
Multiple IED Disruptors
– Rollers & Chains
– Rakes & Cutters
– Jammers
– GPR/EM Detectors
Robotic Concept
with IED Disruptor Options

Rhino

IED Drag Chains


Cat 924G/H with Robotic Control System
USAF All-purpose Remote Transport System
(ARTS) - Tool Attachments

• Standard Tool Attachments For ARTS


– Forks
– Loader Bucket
– Brush Cutter
– Three-point Hitch
• Tool Attachments with ARTS
– Surface Clearance Windrow Assembly
– Backhoe
– Gripper
– Manipulator Arms
– Water Cannon
– High Energy Access And Disablement Device (HEADD)
– Small Munitions Disrupter (SMUD)
Robotic Skid Steer
RC50/RC60 for USMC

10 Delivered & 5 in Iraq


USMC Grappler dexterity
with simulated UXO
US Army
Humanitarian Demining

[Images: detection, neutralization, robotic platform]

• Humanitarian Demining System Development Effort
– Sponsored by the US Army Night Vision and Electronic Sensors Directorate (NVESD), Fort Belvoir, Virginia, USA
– Integrates detection and neutralization technologies on a robotic platform
ARA Robotic controls on Cat 247b
with backhoe
IED Command & Trip Wire Cutter
Options
Lightweight Reconnaissance Vehicle
(LRV) for under vehicle Inspection
LRV Videos & Specifications

[Videos: stair climbing; mountain terrain]

– Climbs and descends stairs, rocks and rubble up to 11” in height
– Low-cost surveillance robot (starting at under $25k)
– Self-righting; under-vehicle inspections (planes, trains and autos)
– 0.003 lux color cameras with IR illumination; simple tele-operation
– 15 lbs. (man-packable); carries small payloads and pulls large payloads
– Up to a 7-mile range with its lithium-polymer batteries
LRV Under Vehicle Inspection
LRV: High mobility, low profile, back-
packable reconnaissance UGV
Nighthawk
• Mini Unmanned Aerial Vehicle
• Weight = 1.5 lbs (w/ payloads)
• Rolls into a 6” x 20” tube
• No assembly required
• Quiet electric propulsion
• Advanced GPS navigation autopilot
• Forward- and side-looking cameras
• IR option
Nighthawk In-Flight
UAV Launch, Control, & Recovery
for Immediate Situational Awareness
• Launch Solutions: [launcher, side and front views]
• UAV/MAV
• Recovery Solutions: [deployable capture system on a State Police sedan (Ford Crown Victoria), side and front views]
• Systems Integration Solutions: [launcher and recovery, deployed and stowed, on an HMMWV-M1025 and a CIA SUV (Chevy Suburban), side views]
Enhanced Situation Awareness
On-Board Deployment of UAV
Robotic Checkpoint
Objective Capability
Robotic Checkpoint (Near Term 3-4 mos.)
Vehicle Stopped at Robotic Checkpoint

Inspection Robot w/grappler Blocking Robot w/remote M240

LRV under vehicle inspection robot


Inspection Robot removing cargo
Inspection robot removes hidden ordnance
Robotic Checkpoint Control
Suspect Car removed
by blocking robot with forks
Suspect Car threat neutralized by water cannon
Contact Information

Jerry Lane
Director Great Lakes Office
Applied Research Associates
Inc.
48320 Harbor Drive
Chesterfield Twp. Michigan 48047
USA
Telephone: 586-242-7778
Email: glane@ara.com
www.ara.com
International Trends and Applications in 3D
Vision Guided Robotics

Presented by:

Adil Shafi
SHAFI Innovation Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Adil Shafi
President
SHAFI Innovation, Inc.

Adil Shafi
SHAFI Innovation, Inc.
803 Lakeshore Drive
Houghton, Michigan 49931
Phone: 734-516-6761
Email: adil.shafi@shafi-inc.com

Adil Shafi is founder and president of SHAFI, Inc. and SHAFI Innovation, Inc.
(www.shafiinc.com) He has worked in the robotics and vision industry for the last 20
years and his companies have pioneered more than 100 Software Solutions in the area
of Vision Guided Robotics. His company's RELIABOT software runs on equipment
worth $500 million and is ranked #1 in the world for AutoRacking and Bin Picking. Adil
received three degrees in Computer Science and Electrical Engineering from Michigan
Technological University and worked in Chicago, Manhattan and the Silicon Valley prior
to founding his companies in Michigan. He works closely with many industry, academic
and government organizations and has travelled to more than 90 states and countries in
the world.
Presentation not available at time of production.
Advances in 3D Vision Guided
Robotics at Fraunhofer IPA

Presented by:

Jens Kuehnle
Fraunhofer IPA
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jens Kuehnle
Research Associate
Fraunhofer Institute Manufacturing Engineering and
Automation (IPA)

Jens Kuehnle
Fraunhofer Institute Manufacturing Engineering
and Automation (IPA)
Nobelstrasse 12
70569 Stuttgart
Germany
Phone: 49 711 970 1861
Fax: 49 711 970 1004
Email: kuehnle@ipa.fraunhofer.de

Jens Kuehnle studied pure and applied mathematics at Ulm University, Germany, and at
San Diego State University, California, USA. He completed his diploma in 2005 and his
Master of Science in 2006. Since 2007, he has been employed as a research associate in the
Information Processing department at the Fraunhofer Institute Manufacturing Engineering
and Automation (IPA), Stuttgart, Germany. His areas of interest include software
development for 3D data evaluation, 3D metrology and computed tomography, 3D
perception in robot vision, and dimensional inspection.
Advances in 3D Vision Guided
Robotics at Fraunhofer IPA
Jens Kuehnle
Research Associate
Fraunhofer IPA
Structure

• Motivation
• 3D Measurement Principles
• Applications:
– 3D Obstacle Avoidance for Service Robots
– 3D Object Recognition and Localization for
Automation and Handling Engineering –
Case Studies on Bin Picking
• Summary

© Fraunhofer IPA
Fraunhofer Society in Germany

58 institutes
at 40 locations

12,800 employees; budget of 1.2 billion euros

© Fraunhofer IPA
Fraunhofer IPA, Stuttgart

© Fraunhofer IPA
Focal Fields of Research
at Fraunhofer IPA
Corporate Management
• Life cycle management
• Product creation management
• Technical risk management
• Hazardous materials and recycling management
• Quality management
• Factory planning and design
• Work-flow and process planning
• Production system and plant management
• Production optimization
• Order processing management
• Inbound and outbound logistics
• Manufacturing logistics
• Supplier parks
• Network logistics and supply chain management

Automation
• Digital Factory
• Visualization and interaction
• Technical information systems
• New fields of application
• Human-machine interaction
• Industrial and service robots
• Assembly systems
• Robot assistance systems
• Sensor technology, mechatronics and microsystems engineering
• Signal and power transmission
• Measurement and testing technology
• Ultraclean technology
• Rapid product development

Surface Technology
• Coating processes and plant for paints and powder coating
• Deposition of metals and metal compounds (electroplating, PVD/CVD)
• Optimization / planning of processes and plants
• Generation of functional layers and nano layers
• Analysis and testing techniques for components, layers, surface engineering processes and sources of defects
• Simulation of coating processes and spatial flow

© Fraunhofer IPA
Department
Technical Information Processing

• 3D Object Recognition
• Robot Vision
• Industrial Image Processing
• Digital Signal Analysis
• Measurement and Testing Technology
• 3D Modeling and Reverse Engineering
• Rapid Prototyping, Rapid Product Development
• Industrial Computer Tomography

© Fraunhofer IPA
Motivation
Motivation
Challenges of Industrial Production → Effects on Machine/Robot Vision

• innovation / product life cycle ▼ → reconfigurability of systems ▲
• zero defect production ▲ → measurement in manufacturing cycle ▲
• part complexity ▲ → intuitive MM-communication; no expert knowledge ▲
• robot usage in human environments ▲ → 1st Law of Robotics: collision avoidance ▲
• robot accomplishes complex tasks (e.g., parts handling) ▲ → perception ▲

© Fraunhofer IPA
Motivation
Challenges of Industrial Production → Effects on Machine/Robot Vision

• innovation / product life cycle ▼ → reconfigurability of systems ▲
• zero defect production ▲ → measurement in manufacturing cycle ▲
• part complexity ▲ → intuitive MM-communication; no expert knowledge ▲
• robot usage in human environments ▲ → real-time modeling of 3D workspace with memory
• robot accomplishes complex tasks (e.g., parts handling) ▲ → object recognition and localization of industrial parts

© Fraunhofer IPA
3D Measurement Principles
Rough Classification

• (De)Modulation: Time-of-Flight, Interferometry
• Triangulation: Sheet of Light, Structured Light, Stereo
• Focus Determination: Depth-From-(De)Focus

• All yield a 3D point cloud or a 2.5D depth image of the object's 3D surface


© Fraunhofer IPA
Triangulation
Sheet of Light (laser line projection, 2D camera, linear movement)
• Accuracy < 1 mm
• One profile per measurement; movement necessary (linear)
• E.g. SICK IVP Ruler E600: 10,000 profiles/sec, 1,536 points each

Structured Light
• Accuracy < 1 mm
• Dependence on contrast and hence on ambient light
• E.g. GOM ATOS III: 4,000,000 points in 2 sec.
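The sheet-of-light numbers above follow from simple triangulation geometry. The sketch below converts an observed laser-line shift in pixels to a surface height, assuming the laser plane is normal to the surface and the camera views it at a known triangulation angle; the function name and the parameter values are illustrative, not taken from the Ruler E600 datasheet.

```python
import math

def profile_height(pixel_shift, pixel_pitch_mm, magnification, tri_angle_deg):
    """Height of a surface point from the lateral shift of the laser line.

    Assumes the laser sheet is normal to the surface and the camera views
    it at tri_angle_deg from the surface normal; a height step dz then
    shifts the line by dz * tan(angle) in object space.
    """
    shift_obj_mm = pixel_shift * pixel_pitch_mm / magnification  # image -> object space
    return shift_obj_mm / math.tan(math.radians(tri_angle_deg))

# Example: a 10 px shift with 10 um pixels, 0.5x optics and a 45 deg
# triangulation angle corresponds to a 0.2 mm height step.
```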
© Fraunhofer IPA
Time-of-Flight
Rotated 2D Laser Scanner
• Accuracy > 3 mm
• One profile per measurement; movement necessary (pivoting)
• E.g. SICK LMS400: 500 profiles/sec.

Time-of-Flight Camera
• Accuracy > 10 mm
• Provides depth/intensity image
• Still somewhat prototypical
• E.g. Mesa Imaging SwissRanger SR3000: 176 x 144 pixels at 25 Hz
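Cameras like the SR3000 recover range from the phase shift of an amplitude-modulated illumination signal. A minimal sketch of that relation; the 20 MHz figure in the example is an assumption about the modulation frequency, not a quoted specification:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range_m(phase_rad, mod_freq_hz):
    """Range from the measured phase shift of the modulated signal."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range_m(mod_freq_hz):
    """Beyond this distance the measured phase wraps around (aliasing)."""
    return C / (2 * mod_freq_hz)

# At an assumed 20 MHz modulation the unambiguous range is about 7.5 m.
```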
© Fraunhofer IPA
3D Obstacle Avoidance for
Service Robots
Obstacle Detection

• A robot completes its tasks in 3D space, so obstacle


detection should be done in 3D as well.
• State of the art (SOTA):
– various sensor systems available (e.g., ultrasound, radar,
laser rangefinder, rotated laser scanner, stereo rig)
– problem: no 3D data set (ultrasound, radar, laser
rangefinder) or generation of 3D data expensive (time:
rotated laser scanner; computation: stereo rig)
• Promising sensoric alternative:
– time-of-flight camera provides 2.5D depth image at video
frame rate

© Fraunhofer IPA
Moving Voxel Model with Memory

• 3D capture of robot environment in voxel model:


– axis aligned uniform grid
– parameters:
• dimension in x/y/z
• voxel size
– voxel states:
• unknown (white)
• free (green)
• occupied (red)
– voxel time stamp
– e.g. 6.4m x 6.4m x 3.2m with voxel size 0.05m yields 1,048,576 voxels
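The voxel model just described can be sketched directly; the class below (illustrative, not the IPA implementation) holds the three voxel states and a per-voxel time stamp, and reproduces the 6.4 m x 6.4 m x 3.2 m / 0.05 m example:

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = 0, 1, 2  # voxel states: white / green / red

class VoxelModel:
    def __init__(self, dims_m=(6.4, 6.4, 3.2), voxel_m=0.05):
        self.voxel_m = voxel_m
        self.shape = tuple(int(round(d / voxel_m)) for d in dims_m)
        self.state = np.full(self.shape, UNKNOWN, dtype=np.uint8)
        self.stamp = np.zeros(self.shape, dtype=np.int64)  # last update time

    def insert_points(self, points_m, t):
        """Mark voxels hit by measured 3D points as occupied at time t."""
        idx = np.floor(np.asarray(points_m) / self.voxel_m).astype(int)
        inside = np.all((idx >= 0) & (idx < np.array(self.shape)), axis=1)
        ix, iy, iz = idx[inside].T
        self.state[ix, iy, iz] = OCCUPIED
        self.stamp[ix, iy, iz] = t

# The example from the slide: a 128 x 128 x 64 grid = 1,048,576 voxels.
```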

© Fraunhofer IPA
Moving Voxel Model with Memory

• Typically, viewing frustum < model dimensions


– data registered within the model
• Model follows the robot‘s movement
– maintain information in overlap

[Figure: three snapshots of the model window following the robot]
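Moving the model window by a whole number of voxels while keeping the information in the overlap amounts to a plain array shift; newly exposed voxels revert to the unknown state. A simplified sketch of that bookkeeping (assumed, not the IPA code):

```python
import numpy as np

def shift_model(state, shift_vox, fill=0):
    """Translate a 3D voxel grid by an integer voxel offset.

    The overlap with the previous model volume is kept; voxels that
    enter the window for the first time are set to `fill` (unknown).
    """
    out = np.full_like(state, fill)
    src = tuple(slice(max(0, -s), state.shape[i] - max(0, s))
                for i, s in enumerate(shift_vox))
    dst = tuple(slice(max(0, s), state.shape[i] + min(0, s))
                for i, s in enumerate(shift_vox))
    out[dst] = state[src]  # copy the overlapping region
    return out
```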

© Fraunhofer IPA
Experimental Platform

• DESIRE platform (www.service-robotik-initiative.de)


stereo rig Marlin 145C2

3D sensor SR3000

Pan-Tilt Schunk PW70

• Voxel model used for


obstacle avoidance:
– 3D collision check of robot 7-DOF KUKA LBR3
with environment
robotic hand Schunk SAH
– complements 2D path planning
(projection of voxels into xy-plane)
laser scanner SICK S300

biaxial drive

© Fraunhofer IPA
Performance of Voxelization Algorithm

• Time-of-flight camera SR3000 used


– calibration + depth correction + bad pixel
removal + robot filtering
• Model update rate:
– CPU algorithm: about 10 Hz
– GPU algorithm: about 25 Hz
• Time stamp allows to keep data up-to-date
• Possible to use more than one sensor

© Fraunhofer IPA
3D Object Recognition and
Localization for Automation and
Handling Engineering
Case Studies on Bin Picking
Object Recognition and Localization
[Diagram: sorted parts / partly ordered (2D camera) / randomly stored (3D sensor)]

• Parts stored in special carriers
• Parts supplied totally ordered
• Vision system redundant
• However, carriers are specifically adapted to the parts stored and
thus, variations in the parts usually require the carriers to be
changed as well
• Carriers are space-consuming

© Fraunhofer IPA
Object Recognition and Localization
[Diagram: sorted parts / partly ordered (2D camera) / randomly stored (3D sensor)]

• Parts get separated by special machinery
• Vision system for parts lying separated from each other
• However, machinery is specifically adapted to the parts and thus,
variations in the parts usually require the machinery to be changed
as well
• Machinery is space-consuming

© Fraunhofer IPA
Object Recognition and Localization
[Diagram: sorted parts / partly ordered (2D camera) / randomly stored (3D sensor)]

• Parts in known plane (e.g., belt, ...) with only 3 DOF
(translation x/y, rotation z):
– one 2D camera
– image processing
• Parts in arbitrary position (6 DOF) with identifiable features
(e.g. corners, holes, ...):
– one (or more) 2D camera(s)
– photogrammetry
© Fraunhofer IPA
Object Recognition and Localization
[Diagram: sorted parts / partly ordered (2D camera) / randomly stored (3D sensor)]

• Parts "thrown in a box"
• Parts supplied unordered
• E.g., Bin Picking
• Parts in arbitrary position (6 DOF) with no identifiable features:
– 3D sensor

© Fraunhofer IPA
Problems like „Bin Picking“

• Applications
– handling of known parts with a robot
– e.g., supplying chaotically stored parts to the
manufacturing chain

• Vision system that recognizes and localizes the


specified object should meet certain criteria:
– speed: in time with fabrication process
– robustness: 100% performance even on partial data
– degree of automation: amateur vs. expert operability
– flexibility towards variations in parts handled
– variety of objects that can be handled

© Fraunhofer IPA
Different Approaches at Fraunhofer IPA
Approach 1: database-driven
• Bin Picking ≠ Bin Picking
– part features (e.g., geometry)
– form of supply
– etc.

• At Fraunhofer IPA, two different


Approach 2: fitting of geometric
approaches are studied:
primitives
– database-driven algorithm that uses
the CAD-model of the part
considered
– algorithm based on best-fitting
geometric primitives within the part

© Fraunhofer IPA
Approach 1:
database

• Offline: database contains certain orientations of the considered part
• Online: comparison of scan data with orientations in database
• Characteristics (excerpt):
– Depth histograms
– Normal vector histograms
– Significant bounding volumes on the part (positive) and next to the
part (negative)

[Figure: rotated CAD-model compared against scan data]
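The depth-histogram characteristic can be illustrated as follows: each database orientation stores a normalized depth histogram offline, and online the scan's histogram is scored against all of them. Histogram intersection is used here as an assumed stand-in similarity measure; the metric actually used in Approach 1 is not stated in the slides.

```python
import numpy as np

def depth_histogram(depth_values, bins=32, rng=(0.0, 1.0)):
    """Normalized histogram of depth values: a crude orientation signature."""
    h, _ = np.histogram(depth_values, bins=bins, range=rng)
    h = h.astype(float)
    return h / h.sum() if h.sum() else h

def best_orientation(scan_hist, db_hists):
    """Index of the database orientation most similar to the scan,
    scored by histogram-intersection similarity."""
    scores = [np.minimum(scan_hist, h).sum() for h in db_hists]
    return int(np.argmax(scores))
```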

© Fraunhofer IPA
Approach 1:
Overview
1. Generation of a point cloud, e.g. with a rotated laser scanner
2. Object localization of workpieces with offline-generated database
3. Gripping point calculation, movement planning and gripping of a workpiece

© Fraunhofer IPA
Approach 1:
Status
• Current state of development:
– Tool to generate database implemented
– Flexible object localization and
gripping-point calculation finished
– Prototype of bin picking system for
shafts implemented
• Currently under development:
– Integration of robot cell at Hirschvogel
Umformtechnik GmbH (Denklingen,
Germany) with cycle time 8 s
by end of 2008
– Tests with different types of
objects (e.g. ring, housing)

[Figures: gripper; found parts]

© Fraunhofer IPA
Approach 2 (basics):
Segmentation of Geometric Primitives
• Geometric Primitives are:
– Plane, Sphere, Cylinder, Cone, Torus
• Best-Fit Principle (non-linear least squares) minimizes the orthogonal
distances between the measurement points X_i and their base points X_i'
on the primitive surface F:

    d_i = || X_i - X_i' ||          (orthogonal distance)
    d = (d_1, ..., d_m)^T           (distance vector)

Objective function, with weighting matrix P^T P:

    sigma_0^2 = d^T P^T P d

Optimization problem over the primitive's shape parameters a:

    min_a  min_{ {X_i'} in F }  sigma_0^2( {X_i'(a)} )
© Fraunhofer IPA
Approach 2 (basics):
Best-Fitting of Geometric Primitives
• In case the point cloud represents more than one geometric
primitive, the iterative fitting can be complemented with an automatic
segmentation into inliers and outliers:
– error of fit = order of magnitude of measurement error

inlier outlier
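For one of the listed primitives, the sphere, the best fit even reduces to a linear least-squares problem (solve 2 c·x + k = |x|^2 for the center c and k = r^2 - |c|^2), and the inlier/outlier split then follows by thresholding the orthogonal residuals at the order of magnitude of the measurement error. A self-contained sketch, not the IPA implementation:

```python
import numpy as np

def fit_sphere(pts):
    """Algebraic least-squares sphere fit: solve 2*c.x + k = |x|^2
    for center c and k = r^2 - |c|^2."""
    pts = np.asarray(pts, dtype=float)
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, k = sol[:3], sol[3]
    radius = np.sqrt(k + center @ center)
    return center, radius

def segment_inliers(pts, center, radius, tol):
    """Boolean mask of points whose orthogonal distance
    d_i = | ||X_i - c|| - r | stays within tol."""
    d = np.abs(np.linalg.norm(np.asarray(pts, float) - center, axis=1) - radius)
    return d <= tol
```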

© Fraunhofer IPA
Approach 2:

• Parts are represented by means of the geometric primitives
contained within them (meta-model)
• Idea: geometric primitives carry enough information to enable
proper recognition and localization
• Algorithm combines segmentation and best-fit of geometric primitives

[Figures: cylinders representing a crankshaft (meta-model); model
registration; segmenting and best-fitting of cylinders in scan data]

© Fraunhofer IPA
Approach 2:
Overview
1. scanning unordered scene 2. recognition/localization
scan data

found cylinders

3. collision-free gripping 4. supplying ordered

© Fraunhofer IPA
Approach 2:
Status

• Current state of development:


– Prototype of bin picking system for
„simple“ parts implemented
• Currently under development:
– Treatment of more complex parts which
can also contain free-form surfaces
(e.g., crankshaft)
– Tool to automatically teach the system
to handle a yet unknown part
– Tool to automatically generate all
parameters needed

© Fraunhofer IPA
Performance of Bin Picking Algorithms

• Computing time:
– Approach 1: about 0.5 sec. with database < 40 MB.
– Approach 2: about 0.25 sec.
• Accuracy of localization adaptable:
– Approach 1: dependent on rotatory resolution of database (typically 2°)
– Approach 2: bounded by sensor inaccuracy (typically < 0.5 mm)
• Not dependent on 3D sensor used
• Successfully adapted to different parts
• Outstanding rate of recognition for parts tested
• Adaptable to different parts:
– Approach 1: definition of bounding volumes
– Approach 2: examination of geometric primitives (relation, symmetry, …)

© Fraunhofer IPA
Summary
Summary

• Situations occur where 2D does not suffice.
• 3D algorithms are needed.
• 3D data is typically more complex than 2D (depth map, point cloud).
• 3D algorithms are computationally more expensive than 2D.

• Fraunhofer IPA has many years of experience with 2D/3D.


• Examples of 3D vision guided robotics at Fraunhofer IPA:
– real-time modeling of 3D workspace with memory
– object recognition and localization of industrial parts

© Fraunhofer IPA
Acknowledgments

I would like to thank my colleagues at Fraunhofer IPA.


In particular, Martin Stotz, Thomas Ledermann and Dennis Fritsch.

This work was partly funded within the research project


DESIRE by the German Federal Ministry of Education and
Research (BMBF) under grant no. 01IME01A.
© Fraunhofer IPA
Thank you
for your Attention!
Contact Information

Jens Kuehnle
Research Associate
Fraunhofer Institute
Manufacturing Engineering and
Automation (IPA)
Nobelstrasse 12
D-70569 Stuttgart
Germany
Telephone: +49 711 970 1861
email: kuehnle@ipa.fraunhofer.de
www.ipa.fraunhofer.de
© Fraunhofer IPA
Vision Guided Part Loading/Unloading
from Racks for Automotive
Applications – Lessons Learned

Presented by:

Robert Anderson
Chrysler LLC
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Robert Anderson
New Technology Manager
Advance Manufacturing Engineering
Chrysler LLC

Robert Anderson
Chrysler LLC
800 Chrysler Drive, CIMS 482-04-16
Auburn Hills, Michigan 48326
Phone: 248-944-6076
Fax: 248-841-6272
Email: ra2@chrysler.com

Robert Anderson is currently the New Technology Manager in the Advance


Manufacturing Engineering Department, Chrysler LLC. Mr. Anderson is responsible for
coordination of technology development and deployment activities for North American
assembly plants. Mr. Anderson has developed and used robotic vision systems during
most of his career.

Mr. Anderson received Bachelor’s and Master’s of Science degrees from The Ohio State
University and a Master’s of Engineering degree from the University of Michigan. Before
coming to Chrysler, Mr. Anderson worked for General Electric and Automatix, Inc.
Mr. Anderson has four patents in powder metallurgy and laser processing and has
published several technical articles.
Vision Guided Part Loading & Unloading from Racks for
Automotive Applications – Lessons Learned

Robert Anderson
New Technology Manager
Advance Manufacturing Engineering
Chrysler, LLC
Automotive Part Loading / Unloading
Topics

• Robotic rack load / unload experience


• Chrysler flexibility strategy
• Why use vision ?
• Vision techniques / approaches
• Design, integration & operations lessons
• Plant considerations
• Summary
Scope

• Automotive body shop


• Loading & unloading part racks
• Practical lessons learned
• Specific material delivery, but lessons adaptable
to other delivery, conveyance methods
• Topics not covered
– Image processing techniques, inspection, gauging,
bin picking, best fit guidance, adhesive detection,
general assembly, . . .
U.S. Auto Market Fragmentation
Drives Lean Processes

[Chart: a 70% increase, from 181 in 2000 to 306 in 2008]
Flexible Processes
Robotic Vision Experience
• Early Experience
– First vision aided rack unload application in 1997 using stationary cameras.
– Robot mounted cameras for rack unloading used first in 2006.
• Current Applications
– Belvidere Assembly was the first Chrysler program to use comprehensive
vision rack unload strategy.
– All but one Chrysler assembly plant and all Stamping/Sub-Assembly
fabrication plants use rack load / unload vision systems.
– Approximately 75 robotic vision rack load / unload applications.
– Load and unload a wide range of body shop components: doors, floorpans,
body sides, rails, crossmembers. . . .
• Future
– Standardize systems, incorporate best practices, improve system integration
and robustness.
– Possibly reduce the number of assemblies loaded/unloaded with vision.
– Looking at bin picking but need solid business case.
Rack Load / Unload Applications
Why Use Vision ? – Reduce Cost

• Productivity, not technology


• Must consider:
– Direct & indirect labor
– Vision system cost
– Tooling & rack costs
– Training
– Impact on production rate, MTBF, MTTR
– Part transportation

Must reduce total life cost !


Why Use Vision ? – Flexibility & Floor Space
Why Use Vision ? – Productivity

• Cycle time improvement ?


• MTBF, MTTR impact ?
Why Use Vision ? – Logistics
Logistics Factors:
• One time rack build,
eng’g change cost
• Production day rate
• Rack density
• Rack fleet quantity
• Transportation cost
• Part movement &
damage
• Rack change
frequency

Even the loss or gain of one or two parts per rack


costs or saves thousands.
Why Use Vision ?

• Reduce costs definitely


• Flexibility, floor space sometimes
• Ergonomics sometimes
• Productivity sometimes
• Logistics no

Select application carefully !


Vision Approaches
• 2D
– X, Y, Rz
– Uses auxiliary sensors, palletizing routine to
compensate for Z.
– May use robot or stationary mounted camera.
– Object cannot tilt out of plane
– Often used for flat, stacked sheet metal and lower
accuracy applications

• 2½D
– Feature size, shape change interpreted as change in Z.
– Requires 3D object model
– Object tilt introduces measurement error
– Often used for parts which have little rotation in the rack
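The 2D result (X, Y, Rz) is typically applied as a rigid in-plane correction composed with the taught pick pose. Composition conventions differ between robot vendors, so the version below, with illustrative names, is only one common choice: rotate the taught position about the work-frame origin by the reported Rz, then add the reported translation.

```python
import math

def apply_2d_offset(taught_xy, taught_rz_deg, dx, dy, drz_deg):
    """Compose a vision offset (dx, dy, dRz) with a taught 2D pick pose.

    Assumed convention: rotate the taught position about the work-frame
    origin by dRz, then translate by (dx, dy); add dRz to the tool angle.
    """
    a = math.radians(drz_deg)
    x, y = taught_xy
    new_x = x * math.cos(a) - y * math.sin(a) + dx
    new_y = x * math.sin(a) + y * math.cos(a) + dy
    return (new_x, new_y), taught_rz_deg + drz_deg

# Example: a part rotated 90 deg about the origin moves a taught point
# at (100, 0) to (0, 100).
```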
Vision Approaches

• 3D with one robot mounted camera


– 6 degrees of freedom Z Y
– May use multiple robot mounted cameras for larger objects
– Requires 3D object model
– Lens size & lighting placement critical X

• 3D with multiple cameras


May use stationary cameras or combination
of robot and stationary cameras
– May not require object model
– Requires clear lines of sight for stationary mounted cameras
– Often used for larger objects
– Suited to applications which require vision for parts & racks
– Stationary mounted cameras less prone to operational problems
System Variations
• Architecture
– Sensor type: single monochrome CCD camera; one or multiple CCD cameras; laser (triangulation)
– Sensor mounting: robot; stationary; combination of robot & stationary
– Robot offsets: 2D & 3D part-model comparison; true 3D triangulation / stereo
– Vision system: vision PC / software; integrated with robot; smart cameras
– HMI: separate vision screen & keyboard; vision access via robot teach pendant; laptop for setup only
• Optics
– Lighting: LED; max. response; variable intensity control
– Camera: manufacturer; max. response; pixel size / total resolution
– Lens: required field of view, resolution
– Camera housing: none; manufacturer; industrial
– Focus / aperture / shutter speed: manual / automatic; adjustability
• Support
– Engineering / integration: integrator; vision system supplier; vision, controls, robot integrator; system integrator, line builder; turn-key
– Start-up: contractual to main OEM; direct
– Training: contractual to OEM
Robot Mounted Camera:
No Incident Angle
Robot Mounted Camera(s):
Low Incident Angle
[Images: measured brightness ranges across the object surface:
x 1.15, x 2.5, x 2.5, x 2.4]
Stationary Mounted Cameras
Vision Approach Lesson Learned
• One solution will not likely be best for all applications
– Drives increased cost and complexity
• Custom solutions lack advantages of standardization
– Need consistent vision components, integration and user
interface to optimize operation and troubleshooting procedures.
• Standardize by application
– Flat, stacked sheet metal such as floor panels
– Vertical, 3D objects such as rails
– Completed assembly loading such as doors, hoods
– Small components such as cross-members
– Installation applications such as roof and glass fitting
– Can also group by positioning accuracy required
Image Processing

• Lighting

• Feature recognition
Search windows, field of view,
accuracy, redundancy,
accommodate part movement
Robustness for Production Conditions

• 2½D vision due to lack of part features within field of view
• Use 3D vision: select part features to enable use of 3D object model
• Concurrent engineering, if required, to add product features
Required Accuracy

Hand-off table

• Hand-off tables require less accuracy and perhaps eliminate the
need for vision.
• Robot-to-robot hand-off and direct placement in geometry stations
require greater accuracy.
Robot Grippers, Tooling

• Interference & tolerances between parts, racks,


robots, grippers
• Camera, prox./sensor placement
– Careful design required for multiple models
• Robust design for camera stability
• Direct robot handoff and passing tables
• Access all parts including sides / back of rack.
• Use of auxiliary sensors
• Use of alternate back-up process ?
Robot Programming
Robot / System Programming
• More complex, innovative programming
• All rack conditions – first, last, front, back,
top, bottom, right, left.
• Cable movement, protection
• Standards for consistent programming
• Extensive calibration functionality
• System diagnostics
– Image processing & fault
messages in user language
– Prevent interference, crashes due
to calculated vision offsets
• Recovery
– Semi-automatic routines
– Well documented job aids
– Back-up palletizing operation

[Screenshot: RELIABOT PC, "3D AutoRacking: A Partnered Solution by
COGNEX and SHAFI" (www.shafiinc.com), showing vision system status,
robot coordinate offsets and the Train / Manual / AUTO menu]
Racks – The Biggest Problem

Vision systems capable of adapting to production


racks or racks designed for vision ?
Special Racks vs. Rack Validation
Production Rack Considerations

Rod for robot to open rack flippers

A problem with color variation


Typical Rack Problems
• Lack of design coordination
– Vision, non-vision application
– Clearance with robotic gripper, camera
– Robot programming limitations
– Lack of validation early in development
– Traditional latch mechanisms
• Inconsistent rack fabrication, reuse of old racks
• Improper rack loading
• Parts shifting during shipping
• Rack locating within the station
• Rack damage
Rack Loading, Part Movement

• Parts shift in rack during transport
• Manual loading non-repeatability
• How much movement is acceptable / excessive?
Rack Recommendations
• Analyze total cost of ownership
– Capital, labor, productivity, maintenance, logistics
• Use a concurrent, integrated process
– Multiple supplier interaction is key - part design, process design,
tooling design, rack design, robot simulation, rack build, validation. .
• Use design best practices
– Design racks for vision applications
– Careful consideration of tolerance stack-up
– Best practices include: paint, clearances, fabrication practices,
latches, rack locating, dunnage features, etc
• Implement a rack maintenance program
• Use common and / or flexible racks
Integrated Rack / Process Roadmap
Final Vehicle
Program Program Design Pre- Launch
Start Spec. Release Production 0 -10 -12
A MLD Z
NEW DESIGN

Conduct Preliminary Initiate Rack Design


Logistics Studies Build Pre-Production Racks
Conduct Design Reviews Build Production Racks
Conduct Rack
Studies Conduct Rack Validation
Prototype Complete

Develop Resource Prototype Rack Approved Continuous Improvements


Allocation Plan
Lessons Learned
Develop Mat’l Handling
Master Plan

Cross-Functional Make / Buy Prototype


Workshop Review Validation

Verify Workshop Consolidated


Design Validation
Assumptions Lessons Learned

Original Material Added Cross Functional


Handling Process Roadmap Milestones
Common, Flexible Racks
Integration
• Concept demonstration
• Establish roles & responsibilities
– Various suppliers: Vision equipment, vision integrator, robot
manufacturer, line builder/system integrator
– Turn key responsibility for lead supplier
• Design, engineering
– Product features and vision
– Multiple supplier design reviews, FMEA
– Seamless system integration
• Pre-production validation
– Validate in / out of tolerance conditions regarding part
positioning, angle, surface finish, ambient light, etc.
• Acceptance testing
• Start-up & early production support
Productivity, not technology; factory, not laboratory !
“Good” Applications
• Repeatable part features
• Repeatable part presentation
• Less precision required (passing tables, clamping, tooling)
• Robust image processing
– Insensitive to ambient light, part finish, part movement
– Redundant feature recognition
• Robust tooling and sensors
• Easy recovery
• Duplication of, or high similarity to, other
successful applications
Standardize across programs, plants
“Challenging” Applications

• Parts, tooling, racks & vision system not designed


concurrently
– Feature search area too small & image processing not
tolerant of production variations
– Rack & tooling engineering changes during start-up
• Part movement during transport and/or robot pickup
• Precision robot placement and processing required
• Cumbersome robot recovery impacts MTTR

Perform careful upfront engineering


Vision Not Required

Don’t use vision unless it is certainly the best alternative


Plant Considerations - Training

• Know your audience


– Limited technical information with extensive hands-on
– Identify specialists and generalists
• Don’t train too early (or too late)
– Build it into master schedule & account for personnel
availability
• Training is not a one time event
– Conduct “fire drills”
– Conduct refreshing training
Plant Considerations – Readiness
Job Aids:
• Calibration
• Vision failures
• Retry, Abort
• Comm. failures

VISION FAILURES
Problem | Probable Cause
Trigger is OK, response fails | Lost communications between vision computer & robot
Offsets exceed limits (Min. X, Max. Y, Max. Rx, etc.) | Camera moved, limits too narrow
High uncertainty value | Feature on edge of search window, feature found in wrong location
Repeatedly missing feature | Search window or lighting problem
Problem typically one style only | Part repeatability problem; tends to exclude camera failure
Fail with all features "C" | Camera unplugged or cable damage
Fail with all features "0" | No part; camera moved/failed; light changed/failed

Readiness Assessment:
• System start-up: connections, power up, software navigation, ...
• Operation: robot programming, lighting, reference images, diagnostics, ...
• Fault recovery, restore, backup
• System maintenance: alignment, replacement, calibration, cleaning, ...
• Eng'g changes
Plant Considerations – Optimization
Search Window
Search Window

Vision Picture with


lighting optimized
Vision Picture with
shadow influence

Build system availability and expertise into your process


Plant Considerations – Problem Solving
• 50% of all downtime and 55% of all occurrences were caused by two
clamps (C and D) and a vision error.
• The data revealed that C and D faults can be traced back to dunnage
point B being out of position.
• Two other clamps, PP2 and PP3, account for 46% of remaining downtime
and 55% of remaining occurrences and can be caused by dunnage point B
and/or proximity switch problems.

[Diagram: LH outer body side aperture, end-effector clamp points A–F,
part-present points PP1–PP3, dunnage points, proximity switches and
ISRA vision (6 points), annotated with FIS message number codes]

LH OUTER BODY SIDE APERTURE, ROBOT 1 | FOUR-WEEK PERIOD: 07/23 to 08/17/2007
FIS Message | Time | Occurrences | Message Description
247 | 1:09:23 | 51 | ISRA VISION GENERAL ERROR
272 | 1:15:50 | 63 | PP1 - PART PRESENT 1 FAULT
273 | 0:30:35 | 22 | PP2 - PART PRESENT 2 FAULT
274 | 1:02:22 | 85 | PP3 - PART PRESENT 3 FAULT
318 | 0:06:17 | 3 | A-MAT HNDLNG CYL A EXT FAULT
319 | 0:08:41 | 19 | B-MAT HNDLNG CYL B EXT FAULT
320 | 1:03:06 | 125 | C-MAT HNDLNG CYL C EXT FAULT
321 | 1:05:30 | 59 | D-MAT HNDLNG CYL D EXT FAULT
322 | 0:01:09 | 2 | E-MAT HNDLNG CYL E EXT FAULT
323 | 0:03:36 | 2 | F-MAT HNDLNG CYL F EXT FAULT
TOTALS | 6:26:29 | 429 |
Plant Considerations – Maintenance

• Establish comprehensive PM practices


• Ensure vision system & robot software back-up
• Stock required spare parts, especially high flex.
cables
• Establish process to identify and repair bad racks
– Identify responsibility, source, funding
• Identify long term corrective actions to reduce and
eliminate need for rack maintenance
– Both vision and rack improvements
Plant Considerations - Support

• Comprehensive robot programming required


– Integrated vision system operation & robot
programming
• Establish service agreements up front
– What service is provided with system ?
– Ensure on-going support is provided if plant support is
uncertain
• Material handling coordination
– Especially early in the program
Summary – Lessons for Suppliers

• One vision technique or approach is likely not the


optimum for all applications
• Know outside restraints – racks, part variations,
tooling, robot programming, etc.
– All suppliers & customer groups must work together.
• Robust solutions are a must
– Validate all production situations
• System operation must be easy for plant personnel
– Recovery to minimize MTTR is crucial
Summary – Lessons / Questions for Users

• Does the use of vision really save me money


considering all cost factors?
– Are there less complicated alternatives?
• Is this a proven application?
– Innovation is good, but make sure there is adequate
beta-production validation
• Does my plant have culture to accept, learn and
maintain complex systems?
– Do I (plan to) have vision technicians and well trained
maintenance personnel ?
Summary – Lessons / Questions for Users

• Do I have a detailed specification governing


design, performance, integration, support, etc. ?
• Do I have a trusted, turnkey supplier ?
• Does my start-up plan allow time and material to
fine tune process and work out bugs ?
• Has the system been designed with careful
attention to the racks ?
– Added up-front costs are likely required to meet
expected production requirements.

Integration is the key !


Contact Information

Robert Anderson
New Technology Manager
Advanced Manufacturing Engineering
Chrysler, LLC
Auburn Hills, Michigan

Telephone: 248 944 6076


email: ra2@chrysler.com

Acknowledgements to ISRA Vision and to SHAFI, Inc. a Braintech


company, for providing some presentation material.
Random Bin Picking
Technical Challenges and Approach

Presented by:

Babak Habibi
Braintech Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Babak Habibi
CTO
Braintech, Inc.

Babak Habibi
Braintech, Inc.
102 - 930 West 1st Street
North Vancouver, British Columbia V7P 3N4
Canada
Phone: 604-988-6440
Fax: 604-986-6131
Email: bhabibi@braintech.com

Bio Not Available at time of print.


Presentation not available at time of production.
Random Bin Picking Applications/Solutions

Presented by:

Steven West
ABB, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Steven West
Business Development Manager
ABB, Inc.

Steven West
ABB, Inc.
1250 Brown Road
Auburn Hills, Michigan 48326
Phone: 248-391-9000
Fax: 248-391-7390
Email: steven.w.west@us.abb.com
Bio Not Available at time of Print.
Random Bin Picking
Applications and Solutions
Steven W. West
Development Manager
ABB Inc.
Random Bin Picking
The Robotic Bin Picking
Challenge:
• Extreme part
overlap and
occlusion
• Significant lighting
variability and
shadowing
• Shortage of
distinct features
on parts
• Collision
avoidance with
other parts, tools
and bins
• Factory ready and
RELIABLE!
Random Bin Picking

Automotive OEM Pilot Cell


ABB IRB 2400 picks con rods and places them on a feeder conveyor
RBP – Software (examples)

Braintech eVision Factory™
• Multi Part Candidate Tracking, to offer a choice of identified
pick-able parts
• Pick Candidate Re-verification, to enable efficient re-use of
identified parts from previous pick cycles
• Multi Grasp Point Selection, to provide multiple grasping points
and scenarios for a given part, while also increasing speed and
accuracy when grasping each part
• Advanced 3D Range Data Analysis, to confidently move to and
grasp parts without collisions

ABB BinWare™
• Simplified User Interface provides easy set-up for the most common
operations required to develop a Robotic Bin Picking System
• Robust Motion Planner identifies the optimal path for the robot to
pick a part from the bin
• Position Reachability checks to see if the robot can reach the image
capture point and the pick point selected by the vision system
• Collision Avoidance detects if the robot tool and arm will collide
with the bin walls
RBP – System Features

Auto Recovery from Collision Detect / Motion Supervision faults. The robot is capable of automatically restarting should the robot gripper collide with a neighboring part.

Standardized Compliant Tooling handles minor collisions, making the gripper more likely to grasp the part.
• 6 DOF compliance
• compliance devices in series with variable amounts of compliance and degrees of freedom
• adjustable (pneumatic) compliance controlled by robot program
• robot measures tool compliance (air based: air pressure; spring based: proximity sensors)
• use one or more digital limits / signals, or use analog IO
• limits trigger robot to stop or change direction
RBP – Optimal Applications
Part Types
• 3D Geometry
• Dull surface finish
• Semi-planar with 2 – 3 resting
positions
• Examples: castings, forgings,
plastic molded parts, flat
stamped blanks

Optimal Application
Optimal Application
RBP – Optimal Applications
• Easily recognizable
2D features for initial
detection of possible
candidates
• Edges that can be
consistently used to
detect the part
footprint, even if they
are the silhouette –
very important
• Bins that have angled
side walls near the
bottom
RBP – Optimal Applications
Complexity Factors
• Parts that link, hook or
wedge together
• Multi-planar parts that rest
in excess of three
positions
• Small parts, less than 10 cm
• Limited pick points
• Parts with deep cavities
• Parts that appear to have
self-similar areas.
• Bottom of the bin / side of
the bin (deep bins with 90
degree angles – a “deep
box”)
RBP – Optimal Applications
Semi – Structured RBP
• 3D picks from exit
conveyors
• RBP module solves part
overlapping challenges
• Does not require parts to
be singulated
RBP – Gripper Design
Considerations
• End of Arm Tooling
– Narrow footprint for gripper fingers
– Rounded edges on gripper fingers and gripper body
– Hooks, suction cups and magnetic grippers
– Multi pick point grippers (e.g. ID and OD fingers)
– Cable and hose management
• Re-grip and flipper stands
• Part present switches, and sensors to detect multiple parts
picked, or parts that have slipped out of position prior to
placement
• Robot uses specialized tool to move parts away from bin walls
• EOAT design for RBP necessitates “out of the box” thinking
RBP – Gripper Design

“out of the box thinking”

Ishikawa Namiki Laboratory – high speed multi-finger hand


RBP – Cell Design
Considerations
• Bin location and orientation
– Bin location techniques:
• Mechanical
• Vision
• Robotic
• Bin logistics
• Bin geometry
– New programs
– Existing programs
• Safety system
• Robot reach and mounting
– Inverted, shelf-mount, standard
• Manual back-up
RBP – Application Development
Step 1: Define requirements and Design Concepts:

Outputs:
• Simulation Model
• Equipment specification including robot model, robot risers, bin / box locators, safety system, etc...
• Gripper concept drawn in 3D model
• Cycle time study
• Proposal and budget for Step 2

Step 2: Design and Build Test System:

Outputs:
• Design and build prototype gripper
• Perform 10,000 pick test run
• Measure pick rate quality, system availability, and cycle time
• Proposal and budget for Step 3

Step 3: Design and Build Production Pilot System

Outputs:
• 1st production ready unit
• Lessons learned from pilot installation
• Standardization of factory specific requirements
RBP – Business Case
The “Traditional” Business Case for Robot Automation
1. Reduce operating costs
2. Improve product quality & consistency
3. Improve quality of work for employees
4. Increase production output rates
5. Increase product manufacturing flexibility
6. Reduce material waste and increase yield
7. Comply with safety rules and improve workplace health &
safety
8. Reduce labour turnover and difficulty of recruiting workers
9. Reduce capital costs (inventory, work in progress)
10. Save space in high value manufacturing areas
Based on research carried out by the International Federation of Robotics (IFR)
Published in World Robotics 2005
RBP – Business Case
Random Bin Picking - Business Case “Adders”
1. Perfectly ergonomic
• Eliminate noise from vibratory feeding systems
• Eliminate potential for worker injury:
– Repetitive motion
– Workers hit by forklifts moving bins into position
– Injuries resulting from dropped parts, pinched fingers, etc.
2. Reduce MRO costs
• robotic bin picking cells are mechanically simple systems (robot and gripper)
• much less complex than vibratory feeding and/or mechanical positioning systems
• lift assists and hoists often require frequent maintenance and spare parts
3. Improve productivity
• robots don’t stop to talk about the “big game,” causing the line to starve for parts

RBP – OEM Pilot Project
Objectives

OEM Automotive Customer – Reduction of packaging and material handling costs. The pilot project offers this customer an opportunity to measure the production readiness of bin picking technology without disrupting their production process.

ABB – Prove that robotic bin picking technology is production ready and develop factory-based lessons learned to support this customer and other customers on future bin picking orders.
RBP - Pilot Project Performance

                      Test 1     Test 2
Total Parts Picked    1125       3005
System Failures       2          5
System Availability   99.8222%   99.8336%
Total Faults          42         147
Pick Rate Quality     96.2667%   95.1082%

Faults (do not stop the robot or require manual intervention):
• Dropped part
• No part present in gripper
• Two parts linked together
• Automatic recovery from collision detect

System Failures (require operator intervention or restart of system):
• Gripper stuck between parts
• Axis 4 joint out of range
• Vision system does not detect pick-able parts at the bottom or side of bin

On average 13 parts remain at bottom of bin when system times out.
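The availability and pick-rate percentages in the table follow directly from the pick, failure, and fault counts. A minimal sketch of the arithmetic, assuming availability = 1 − failures/picks and pick rate quality = 1 − faults/picks (which reproduces the reported figures):

```python
def availability(total_picks, system_failures):
    """Fraction of picks completed without a failure requiring operator intervention."""
    return 1 - system_failures / total_picks

def pick_rate_quality(total_picks, faults):
    """Fraction of picks completed without any fault (dropped part, re-grip, etc.)."""
    return 1 - faults / total_picks

# Test 1 from the table: 1125 parts picked, 2 system failures, 42 faults
print(f"{availability(1125, 2):.4%}")        # 99.8222%
print(f"{pick_rate_quality(1125, 42):.4%}")  # 96.2667%
```

Running the same formulas on Test 2 (3005 picks, 5 failures, 147 faults) yields the table's 99.8336% and 95.1082%.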
Contact Information

Steven W. West
Development Manager, Vision Guided
Robotics
ABB Inc.
1250 Brown Road
Auburn Hills, Michigan 48326
USA
Telephone: 248-393-7120
email: steven.w.west@us.abb.com
www.abb.com
The Need for Generic 3D Bin Picking

Presented by:

René Dencker Eriksen


Scape Technologies
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

René Dencker Eriksen


Chief Technology Officer
Scape Technologies

René Dencker Eriksen


Scape Technologies
Kochsgade 31 C, 3. sal
DK-5000 Odense C
Denmark
Phone: 45 65 25 66 03
Fax: 45 70 25 31 14
Email: rde@scapetechnologies.com

Ph.D. in Computer Systems Engineering, University of Southern Denmark. Subject: Efficient interpretation of digital images in a structured environment. Three years as Assistant Professor at the Maersk Mc-Kinney Moller Institute for Production Technology at the University of Southern Denmark. Co-founder of Scape Technologies A/S. Chief Technology Officer at Scape Technologies.
Generic 3D Bin Picking

René Dencker Eriksen


Chief Technology Officer
Scape Technologies
Overview
• About Scape Technologies
• True 3D Bin-Picker – Why and What?
• Demands from the Industry
• Challenges in Bin-Picking
• Case Studies
• Final Words
About Scape Technologies
• Established 1st of February, 2004

• Spin-off of University of Southern


Denmark

• Focus on industrial bin-picking based on a


patented vision-algorithm.
Why Use a Bin-Picker?
• Rationalization: Labor reduction, fewer
mechanical systems
• Productivity: Higher efficiency, 24 – 7
production
• Flexibility: Reconfigurable, handles many
different objects, takes up little space
• Workforce: Replaces unattractive jobs, avoid
repetitive tasks for your workforce
Demands from the Industry
• Fast cycle times
• Generic system to keep total cost low
• Easy to use (as easy as any 2D system)
• Handling of different part types
• Light conditions may change
• Small size of robot work cell
• Specific robot brand
• Constraints on placement
True 3D Bin-Picker

(Workcell diagram components:)
• SCAPE Workcell Manager
• SCAPE Workcell Studio
• SCAPE Communication Server
• SCAPE computer
• SCAPE Tool unit
• Standard fluorescent lamps
• FireWire camera
• Handling station
• Robot controller (EtherNet, RS232)
• Pallet box (up to 4 frames high)
Bin-Picker Cell Building
Challenges in Bin-Picking
• Recognition of parts
• Background is cluttered
• Robot path planning is complicated
• Collision avoidance is necessary
• Gripping strategy is needed
• Gripper design
• Re-grip, precision grip, and orientation
control may be needed.
Challenge 1: Recognition
• 6 degrees of
freedom makes it
much harder to
recognize the parts.

• Cannot be trained in
a generic way from
real images.
Challenge 2: How to grip a part
• Must have multiple grip positions on the
parts. Two reasons can prevent a grip:
1) The grip position on the part is facing
downwards.
2) The grip is impossible because of robot
collision with the bin frame.
Challenge 3: Gripper Design
We need grip positions from all sides of the
part. In most cases this requires two or
more grippers on the “Tool-Unit”.
Gripper examples:
Challenge 3: Gripper Design
Flexible Tool-Unit
Challenge 4: Path Planning
• Must avoid robot joint limits
• Must include collision avoidance
Challenge 5: Re-grip
• A “Handling Station” is in most cases
needed for:
• Re-grip so parts can be placed correctly

• Precision grip

• Orientation control to ensure correct grip


Challenge 5: Re-grip
Handling Station Examples
Case Study: Picking Disks
Requirements:
• Total cycle time: 10 seconds
• Handle 71 different disks (diameter and
thickness)
• Two pallet frames
• Robot has many other tasks, so bin-picking can only take about 5 seconds.
Case Study: Picking Disks
Case Study: Picking Rotor Cans
Requirements:
• Cycle time: 9.5 seconds
• 4 pallet frames
• Remove slip sheet between each layer
Case Study: Picking Rotor Cans
Case Study: Picking Pipes
Demo Cell Used by AH Automation

Example of:
• Completely randomly placed parts
• Deep bin (4 pallet frames)
• Need for multiple grippers
Case Study: Picking Pipes
Case Study: Split Cone and Nuts
Requirements:
• Bin-pick two different products from two
bins
• Compact
• Robot has many other tasks outside the
bins
Case Study: Split Cone and Nuts
Final Words

True 3D Bin-Picking is available now!

Things to come:
• Even more generic
• Improved path planning
• More automated training of parts
• Handling a bigger variety of part types
Contact Information

René Dencker Eriksen


Chief Technology Officer
Scape Technologies
Kochsgade 31 C, 3. sal
DK-5000 Odense C
Denmark
Telephone: 45 70 25 31 13
email: rde@scapetechnologies.com
www.scapetechnologies.com
Robot Visual Servoing – Opportunities and
Challenges Ahead

Presented by:

Jane Shi
and
James Wells
General Motors Corporation
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jane Shi
Staff Researcher
GM R&D Center

Jane Shi
GM R&D Center
MC 480-106-359
30500 Mound Road
Warren, Michigan 48090
Phone: 586-986-0353
Fax: 586-986-0574
Email: jane.shi@gm.com

Currently, Jane Shi works at the General Motors R&D Center in Warren, Michigan as a staff
researcher, focusing on the fundamental challenges in achieving reliable, robust, and
capable autonomous and intelligent robotic systems for automotive assembly. Her
delivered research results range from analytic models and innovative methods to
data analysis and related practical tools that address a variety of automotive
manufacturing challenges in order to improve flexibility, efficiency, and reliability.
Dr. Shi joined the GM R&D Center in 2002. Prior to 2002, Dr. Shi’s work experience
includes FANUC Robotics America, Inc. (1994-2002) and NIST (1988-1989). Jane
earned her Ph.D. in robotics from Kansas State University in 1995. Dr. Shi is a member
of IEEE and the Robotics and Automation (RA) Society.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

James Wells
Senior Staff Research Engineer
GM R&D Manufacturing Systems Research

James Wells
GM R&D Manufacturing Systems Research
MC 480-106-359
30500 Mound Road
Warren, Michigan 48090
Phone: 810-602-9879
Fax: 586-986-0574
Email: james.w.wells@gm.com

Jim joined GM’s Manufacturing Engineering organization in 1979 and has been working
in the area of Robotics since 1982. Jim has held engineering and management
positions primarily responsible for robot application development and manufacturing
program support including simulation, paint operations, body assembly, robot
procurement and specifications. Jim joined the R&D MSR group in 2003 and is
currently Senior Staff Research Engineer working on developing advanced robotics with
low cost flexible tooling and equipment for vehicle assembly. Jim has served SME
Robotics International as Chairman for the RI Board of Advisors (1995) and is currently
on the board of the RIA (Robotic Industries Association). Jim holds a Bachelor’s degree
in Electrical Engineering from Rochester Institute of Technology and a Master’s degree
in Engineering from Purdue.
Presentation not available at time of production.
3D Robot Guidance for
Cosmetic Sealer Applications

Presented by:

Kevin Taylor
ISRA VISION SYSTEMS
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Kevin Taylor
Vice President
ISRA VISION SYSTEMS

Kevin Taylor
ISRA VISION SYSTEMS
3350 Pine Tree Road
Lansing, Michigan 48911
Phone: 517-887-8878
Fax: 517-887-8444
Email: ktaylor@isravision.com
Kevin has been with ISRA Vision since 1999. His first years with ISRA were solely in a
sales capacity. Several years ago he assumed the position of Vice President with
responsibility for the North American Business Unit. Prior to that, he spent 8 years
selling automation to the automotive industry.
3D Robot Guidance for Cosmetic
Sealer Applications
Kevin Taylor
Vice President
ISRA VISION
What is Cosmetic Sealer?

Cosmetic sealer is a sealer application where the appearance of the bead is
important, since it is painted and seen by the customer. Examples include:
• Roof Ditch
• Door Hem Flange
• Hood
• Trunk Lid
Why automate Cosmetic Sealer?

Cosmetic sealer applications are automated for the following reasons:
• Greatly Improved Quality of Appearance
• Consistency of Applied Sealer
• Reduction of Manpower
• Reduced Material Consumption
Why is 3D vision required for Cosmetic
Sealer applications?
Cosmetic sealer applications require VGR
for the following reasons:
• Non-repeatable Positioning of Car Body
• Non-repeatable Positioning of Part to be
Sealed
• Tight Tolerances of Application
The vision system provides:
• 3D Position of the Part to be Sealed
• Clearance Verification for Dispense Nozzle
• Part Position can be Detected with Robot Moving or Stopped at Multiple Acquisition Positions
Pictures courtesy of SCA Schucker
Video courtesy of Esys Corporation
• Application
– 3D Measurement of Part in Open or Closed Position
• Measurement Accuracy: ± 0.1 mm
• Measurements: Typically four (4) measurement points per part
• Measurement Time: Less than 1 second per measurement
– Verification of Bead Position (optional)
– Verification of Bead Geometry (optional)
– Verification of Bead Surface (optional)
• Multiple Configurations
– Stationary Sensors
• Minimum Cycle Time
• Inflexible for Multiple Styles
• Multiple Sensors Required
– Sensor on Robot: Multiple Measurements
• Maximum Flexibility
• Lowest Cost Solution
• Increased Cycle Time Due to Multiple Measurements
• Robot Moving or Robot Stopped
• Sensor Requirements
– High Accuracy
– Measurement Time of < 0.2 Seconds per Feature
– Robust Construction for High Reliability
– Compact Design
– Automated Calibration
– Possibility for 3D Measurement AND Inspection
– Possible to Detect Edges and Holes to Allow Opening of Closures
– Suitable for Robot Mounting
• Typical Requirements for Sensors for High Precision Robot
Guidance
– Highly Accurate to 0.05 mm
– Mobile or Stationary Mounting
– Measurement Features: Edges, Holes,
Planes, Free Form Features, Flushness & Gap
– Sensor Lengths: 300mm to 800 mm
• Measurement Principle
(Diagram: multi-line projector, area lighting, camera, laser plane, 3D point)
• Multi Line Sensors are Robust even with Disturbed
Features
• Software Requirements for Sealant Applications
– Combination of 3D Measurement and Inspection
– Software Platform is Identical to Other Accepted Robot Guidance Systems
– Improved Algorithms to Increase Robustness of Measurement
– Automated Calibration

Software Improved with:
• Bead Inspection
• Color Functionality

• Ideal Software
(Screenshot: menu bar, tool bar, indicator bar, action area, interactive area, information area, status bar)
• Ideal Software - Diagnostics
All indicators are green – system is ready for operation.
(Screenshot: Overview button; display of last measurement result, selected group, and task; display of current system messages)
• Ideal Software - Diagnostics
A red indicator signals the source of the problem; a red indicator also signals that the last measurement failed. In the overview image the failed feature is indicated, and a text line displays error and diagnostic messages.
• Software: Environmental light and color independent measurement
• Problem: Blooming
– Destruction of image information by overexposed pixels due to reflections in the imaging area
• Software: Environmental light and color independent measurement
• Solution: Color Control Functionality
Images at each shutter time:
Completed Image:
• Software: Environmental light and color independent measurement
• Solution: Color Control Functionality

Camera image of black and light car body → optimal contrast for all colors!
Application Solution
– 3D Measurement of Open Door
• Measurement Accuracy: ± 0.1 mm
• Mobile Sensor Mounted on Robot
• Measurements: Four (4) Features on Door
• Measurement Time: 0.2 Seconds for Measurement Plus Robot Movement

Measurement points for position detection


Measurement points for clearance check
• 1 Sensor on Robot Measuring 4 Door Features

Offset Vector
(x,y,z,Rx,Ry,Rz)
• Bead Inspection
– Bead Presence/Absence

Hem Flange – No Bead Hem Flange – Bead Present


• Bead Inspection
– Bead Geometry

Bead Too Wide and Shallow Bead too Narrow


• Bead Inspection
– Bead Position and Height

Bead Position 2 mm too low Bead Position 2 mm too high


Bead 0.7 mm too thick
• Bead Inspection
– Optional Possibility: Surface Inspection
• Bead Inspection
– Optional Possibility (In Development): Surface Inspection -
Scaling
Cosmetic Sealing
Contact Information

Kevin Taylor
Vice President
ISRA VISION
3350 Pine Tree Road
Lansing, Michigan 48911
USA
Telephone: 517-887-8878
email: ktaylor@isravision.com
www.isravision.com
Combining Machine Vision and Robotics to
Mimic Complex Human Tasks

Presented by:

Michael Muldoon
Averna Vision & Robotics
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Michael Muldoon
Business Solutions Engineer
AV&R Vision & Robotics Inc. (Averna Vision & Robotics)

Michael Muldoon
AV&R Vision & Robotics Inc.
(Averna Vision & Robotics)
269 Rue Prince
Montreal, Quebec H3C 2N4
Canada
Phone: 514-788-1420
Fax: 514-866-5830
Email: michael.muldoon@avr-vr.com

Mike studied at the University of Windsor, graduating with a Bachelor of Applied Science
in Electrical Engineering. Since then his main focus has been expanding machine
vision and robotics technologies for applications ranging from laser welding to
de-palletizing to deburring and surface inspection. He spent the first part of his career in
Windsor, Ontario, heavily involved in automotive production, where the volume and range
of parts was always the challenge. Now, in Montreal, Quebec, he specializes in
aerospace applications, which have different demands: low-volume, complex, and
high-precision production environments.
Combining Machine Vision and
Robotics to Mimic Complex Human
Tasks
Michael Muldoon, P.Eng
Business Solutions Engineer
AV&R Vision & Robotics
(Averna Vision & Robotics)
Presentation Agenda
• Automation Challenges
• Aerospace Industry
• Material Removal
• Surface Inspection
• Cad-to-path Strategies
• Performance Testing
• Case Study 1: Surface Inspection & Gauging System
• Case Study 2: Finishing & Inspection System
Automation Challenges
Some Areas Typically Left to Humans:

• High accuracy material removal/part finishing;

• Surface Inspections on complex, brilliant surfaces;

• Random Bin Picking;

• Assembly where there are tight tolerances and part variations;

• Low volume & high complexity tasks.

“Humans are wired for change and very easily adapt to


changing conditions, but are not very consistent”
Aerospace Industry Challenges
• Low volume production (batches) & frequent changeovers

• Tight tolerances and complex surfaces usually left to humans to finish & inspect
• Aerospace engine manufacturer’s parts are typically shiny
metal parts with complex surfaces and critical quality
constraints.
• Two areas that are drivers of change:
• Material Removal
• Ergonomics
• Cost savings
• Quality
• Surface Inspection
• Consistent quality
• Critical requirements
• Labor intensive
Material Removal
Typical Material Removal Requirements (Deburring)
• Break all sharp edges, remove all burrs
• Create a radius on edge
• Typical 0.005" - 0.025"
Control Strategies & Process Variation:
• Feed Control
• Consistent part location and size
• Inconsistent burr or flash size
• Pressure Control
• Inconsistent part location & size
• Consistent burr or flash size
• Tool/feedback compliance
• Active/passive
• Calibration
• Tool wear
• Part Variations & Location
Surface Inspection
Random Surface Defect Detection
•Resolution
•Typical min. defect size 0.001-0.015"

•Defect Classification
•NI Particle Analysis VI

Defect Examples:
• Dents
• Scratches
• Cracks
• Pits
• Ripples
• Die Marks
Inspection Algorithms

Raw Image Left 0.008" Right 0.016"


Inspection Algorithms

Raw Image Left 0.020" Right 0.020"


Cad-To-Path Strategies
Simulation Environment Vision Environment Real-World Production
(Station HMI)

Future → Real-time analysis & adjustment of robot path/recipe (i.e., cast parts, MRO facilities)
Performance Testing
Statistical Analysis Requires:
• Defined performance requirements
• Test Plan
• Samples!
• Ability to present useful information of data

        Po      Pe      Kappa
A1-A2   0.995   0.7485  0.98
A1-A3   0.985   0.7414  0.94
A2-A3   0.98    0.738   0.92

Overall Repeatability Kappa: 0.95 – Good Agreement
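The Kappa column above follows Cohen's formula, κ = (Po − Pe) / (1 − Pe), which measures inspector agreement beyond what chance alone would produce. A quick sketch reproducing the table's values:

```python
def cohens_kappa(p_observed, p_expected):
    """Cohen's kappa: agreement beyond chance, (Po - Pe) / (1 - Pe)."""
    return (p_observed - p_expected) / (1 - p_expected)

# (Po, Pe) pairs from the repeatability table
pairs = {"A1-A2": (0.995, 0.7485), "A1-A3": (0.985, 0.7414), "A2-A3": (0.98, 0.738)}
for name, (po, pe) in pairs.items():
    print(f"{name}: kappa = {cohens_kappa(po, pe):.2f}")
# A1-A2: kappa = 0.98
# A1-A3: kappa = 0.94
# A2-A3: kappa = 0.92
```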
Case Study 1: Surface Inspection & Gauging
System
Description:

•Low volume production (batches) & frequent changeovers

•Complex Surfaces usually left to human inspection


•High levels of precision (+/-0.001")
•Multiple inspection requirements:
•Dimensional gauging
•Surface Inspection
•OCR Inspection
Solution Overview
Description:
•Stand-alone system
•Approx. cycle time: 12 seconds per part
•100% Inspection with guaranteed results
• System Components:
•Fanuc LR-Mate 200iC
•LabVIEW 8.5
•NI-PCIe-1430 frame grabber
•NI-PCI-6514 I/O card
(Images: OCV serial number, root inspection, surface inspections, dimensional measurement)
Case Study 2: Part Finishing & Inspection System
• Inspect serial number for part tracking & traceability
• Deburr & create a radius on all finished edges of the root form
• Inspect all finishing operations
• Inspect for random surface defects
(scratches, nicks, pits etc)

Blade Root Form


Solution Overview
• Robot:
– Fanuc LR Mate 200iC
– 5 kg Payload
– R-J3 Controller
• Vision System:
– NI 1722 Smart Camera
• Software:
– LabVIEW 8.5
– Inspection sequence
– User Interfaces (viewer, alarm, calibration)
– NI Vision Acquisition Algorithms
(Diagram: HMI, NI Smart Camera, robot controller, PC, ring light, bar light, robot; part flow: Finish → Inspect → Final Quality)
Contact Information

Michael Muldoon, P.Eng


Business Solutions Engineer
AV&R Vision & Robotics Inc.
269 Rue Prince
Montreal, Quebec
Canada
Telephone: 514-788-1420, x532
email: michael.muldoon@avr-vr.com
www.avr-vr.com
Using 3D Laser Scanning for Robot Guidance

Presented by:

Brian Windsor
SICK, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Brian Windsor
Business Development Manager - Machine Vision
SICK, Inc.

Brian Windsor
SICK, Inc.
6900 West 110th Street
Minneapolis, Minnesota 55438
Phone: 810-923-1880
Fax: 248-997-1068
Email: brian.windsor@sick.com
Brian has been working at SICK, Inc. for 3 years and is currently a Business
Development Manager for SICK’s Machine Vision products. He has been involved in
technical sales and engineering of industrial sensor and machine vision products for the
past 15 years.
Using 3D Laser Scanning
for Robot Guidance
Brian Windsor
Business Development
SICK, Inc.
3D Imaging Technology
What is 3D Profiling?
Camera captures the laser light
resulting in a contour of the surface

Complete shape is built up of contour slices as the object moves through the laser plane
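The build-up described here — one contour slice per motion step, accumulated into a complete shape — can be sketched as stacking height profiles into a range image. A hypothetical illustration using NumPy (the profile source, width, and step count are made up for the example):

```python
import numpy as np

def stack_profiles(profile_source, n_profiles, profile_width):
    """Build a range image by stacking one laser-line height profile per
    motion step, as the object moves through the laser plane."""
    range_image = np.zeros((n_profiles, profile_width))
    for i in range(n_profiles):
        # each call returns one contour slice (height values across the line)
        range_image[i, :] = profile_source(i)
    return range_image

# Hypothetical source: a ramp whose height grows with each step
demo = stack_profiles(lambda i: np.full(640, float(i)),
                      n_profiles=100, profile_width=640)
print(demo.shape)  # (100, 640)
```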
3D Image Data

Intensity-based contrast

Height-based contrast
Height Resolution Calculation
(Camera/laser geometries: ordinary, reverse ordinary, specular, look-away)
Profile Capturing
Profiles can be captured based on time or encoder pulses

(Examples: inconsistent speed, slow speed, constant speed)

Scanning Methods
• Robot or other device moves camera, object is stationary
• Object is moving on a conveyor, camera is stationary
Occlusion
Occlusion
• Camera Occlusion - the height of an object can block the laser
creating areas of missing data

• Laser Occlusion - the laser cannot ‘bend’ to see vertical surfaces


Performance Factors
3D Scanning Advantages
Application Example
• Application
– Automated generation of
order pallets for bulk and
route delivery
• Concept
– Industrial robots on linear
slides for pallet building
– Laser guided vehicles for
internal pallet transportation
Palletizing Application
• This concept includes a high-speed CCD camera and
a laser line projector.
• One image represents one profile line of the search
area, where the grey value represents height
information.
• To acquire a whole image, a scan movement
orthogonal to the laser line has to be performed.
• In this 3D profile the different articles can be found.
• Since this approach needs no article-specific features,
there is no teach-in process required!
Palletizing Application

Camera and laser unit Scan movement


Palletizing Application

Fridge Packs Range Data Vision Result


Palletizing Application

20 oz Crates – Range Data – Vision Result
(Bottle groups can be separated; position of the trays can be found)
Palletizing Application

2L Crates – Grey Scale Picture – Vision Result
(Bottle groups can be separated; position of the trays can be found)
Palletizing Application
Random Bin Picking

Random bin picking of metal tubes using


3D triangulation camera to scan the bin
Belt Picking

• 3D triangulation camera is used to


scan piles of scissor blades
• The robot picks the blade on top
Contact Information

Brian Windsor
Business Development
SICK, Inc.
6900 West 110th St
Minneapolis, Minnesota 55438
USA
Telephone: 248-997-7618
email: brian.windsor@sick.com
www.sickusa.com
Vision Options for “Dual Arm” Robot Guidance

Presented by:

Greg Garmann
Motoman, Inc.
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Greg Garmann
Software & Controls Technology Leader
Motoman, Inc.

Greg Garmann
Motoman, Inc.
1050 Dorset Road
Troy, Ohio 45373
Phone: 937-440-2668
Fax: 937-440-2626
Email: greg.garmann@motoman.com

Computer Engineering degree from Wright State University, Dayton, Ohio. 21 years of
experience in automation.
Vision Options for “Dual Arm”
Robot Guidance
Greg Garmann
Software & Controls
Technology Leader
Motoman Inc.
Robot Technology
• New developments in robot technology
require new ways of working with vision
systems. The “human-like” flexibility of
movement with Motoman’s new dual arm
robots provides unique solutions for the
automation world. This presentation will
show options in handling vision
opportunities with dual arm robots.
Ultimate Flexibility
Gear Assembly
Air Conditioner Assembly
Air Conditioner Assembly
Spool Handling
Packing
Packing
Window Glass Sealing
Sunroof Assembly (Bolting)
Vision (Ageria) Solution
Vision (Ageria) Solution: Bolt Picking 1
Vision (Ageria) Solution: Bolt Picking 2
Parts Picking with Dual Arm
Contact Information

Greg Garmann
Software & Controls Technology Leader
Motoman Inc.
805 Liberty Lane
West Carrollton, OH 45449
USA
Telephone: 937-440-2668
email: greg.garmann@Motoman.com
www.Motoman.com
Distance, Pitch & Yaw from a 2D Image

Presented by:

Steve Prehn
FANUC Robotics America
International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.

Steve Prehn
FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309
Phone: 248-276-4065
Email: steven.prehn@fanucrobotics.com

Steve Prehn has worked in the machine vision market for over 20 years. In addition to
implementing over 200 vision systems, he has acted as product manager for VisionBlox
software at Integral Vision, and CorrectPlace and CorrectPrint products at ESI. He is
now product manager at FANUC, applying his knowledge of machine vision to extend
the reach of iRVision into the material handling market. He has a Bachelor of Science
in Electrical Engineering from DeVry Institute in Columbus, Ohio.
Distance, Pitch & Yaw from a 2D Image
Understanding the Dynamics of the
Robotic / Vision Coordinate Interface

Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.
Image to Robot Relationship

In two-dimensional applications, the XY plane of the user frame


specified here should be parallel to the target workpiece plane.
How do you compensate when this is not the case?
Vision To Robot Transformations
Considerations
• Camera mounting style
– Fixed position or Robot mounted camera
• Part Presentation issues
– In which axes is the part likely to move?
• X, Y, Rotation, Z, Pitch and Yaw
– Is the part consistent or is its presentation consistent?
– Is it possible to correlate position from different
perspectives?
– Can structured light be used to help identify location?
2D Single Camera issue
Camera Image

• Height change creates a subtle apparent size change.
• Are you sure the part size is not different – creating the same effect?
Distance Calculation from an
Accurate Focal Length
• Knowns:
– Calibrated focal length of lens
– Camera array size
• If part size is known, calculate the
distance of the part from the camera
(Diagram: calculate height from known width)
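The geometry behind this slide is the pinhole relation Z = f · W / w: distance equals calibrated focal length times the known part width, divided by the part's apparent width on the sensor. A minimal sketch (the numbers are hypothetical, chosen only to illustrate the formula):

```python
def distance_from_size(focal_length_px, real_width, image_width_px):
    """Pinhole model: Z = f * W / w. With a calibrated focal length (in
    pixels) and a known part width, the part's apparent width in the
    image gives its distance from the camera."""
    return focal_length_px * real_width / image_width_px

# Hypothetical numbers: 1000 px focal length, 50 mm part seen as 100 px wide
print(distance_from_size(1000, 50.0, 100))  # 500.0 (mm)
```

Note the ambiguity the slide warns about: a larger part at the calibrated distance produces exactly the same apparent width as the expected part closer to the camera.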
Consistent part size
• Find parts at two known heights.
1) Calculate scale change and
correlate this to the height
difference. (Delta to delta
determines lens magnification)
2) The part size at this trained
distance is then known
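Under the pinhole model, apparent size is inversely proportional to distance, so the two-height training described above pins down the constant s · Z = k; any later apparent size then maps back to a height. A hedged sketch of that idea (function names and numbers are hypothetical):

```python
def calibrate_size_to_distance(size1_px, z1, size2_px, z2):
    """From one part seen at two known distances, fit the pinhole relation
    s * Z = k (apparent size inversely proportional to distance)."""
    k1, k2 = size1_px * z1, size2_px * z2
    return (k1 + k2) / 2  # average the two samples to damp measurement noise

def distance_from_scale(k, size_px):
    """Invert the fitted relation: Z = k / s."""
    return k / size_px

# Hypothetical: part appears 200 px wide at 500 mm and 100 px wide at 1000 mm
k = calibrate_size_to_distance(200, 500, 100, 1000)
print(distance_from_scale(k, 125))  # 800.0 (mm)
```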
Multi-plane Calibration

Find the Calibration


Grid at Two Levels.
The camera will be
calibrated at every
height between the
levels
Stereo Triangulation Method
• Locate multiple points and calculate z offset from two images
• Height is found from the relationship between points within the two images
• Clear edges and points will reduce confusion
(Images: Camera 1 image, Camera 2 image)
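The depth recovered from the two images follows the standard stereo relation Z = f · B / d, where B is the camera baseline and d is the disparity of a matched point between the images. A minimal sketch assuming rectified cameras (all numbers hypothetical):

```python
def depth_from_disparity(focal_length_px, baseline, x_left_px, x_right_px):
    """Stereo triangulation: Z = f * B / d, where the disparity d is the
    horizontal shift of the same point between the two camera images."""
    disparity = x_left_px - x_right_px
    return focal_length_px * baseline / disparity

# Hypothetical rig: f = 800 px, 120 mm baseline, point at x=420 px (left)
# and x=380 px (right), i.e. 40 px disparity
print(depth_from_disparity(800, 120.0, 420, 380))  # 2400.0 (mm)
```

This also shows why the slide stresses clear edges: if the two cameras match slightly different points on a round part, the disparity (and hence Z) is wrong.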
Stereo Triangulation Method
(Images: Camera 1 image, Camera 2 image)
• On round parts, transformations may not be applied to exactly the same point – creating the possibility of error.
A 2D Change of Perspective
Camera Image

• As part orientation changes


in pitch and yaw, surface
points converge or diverge.
Structured Light
2D Camera
Image
Projected Line
from a Laser

* Images Courtesy of Sick - IVP


How Does 3DL Work?
Structured-light projector + CCD camera:
• Laser ON → 3D processing: detection of distance and pose by structured light
• Laser OFF → 2D processing: detection of position and rotation of the object from the 2D image
• Composition: Yaw, Pitch and Z combined with X, Y and Roll → 3D data with position and orientation (X, Y, Z, Yaw, Pitch and Roll)
Image View:
Normal and Parallel
Image View:
Z Movement
Image View:
Extreme Yaw
Image View:
Extreme Pitch
Image View:
Normal and Parallel
3DL Vision Results
3DL Vision Results
Contact Information

Steve Prehn
Senior Product Manager - Vision
FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309
USA
Telephone: 248-276-4065
email:
steven.prehn@fanucrobotics.com
www.fanucrobotics.com
VGR Panel Discussion
