South African Journal of Industrial Engineering

On-line version ISSN 2224-7890
Print version ISSN 1012-277X

S. Afr. J. Ind. Eng. vol.19 n.1 Pretoria  2008

 

Knowledge-based robot vision system for automated part handling

 

 

J. Wang I; T.I. van Niekerk II; D.G. Hattingh II; T. Hua II

I School of Mechanical and Automotive Engineering, Hefei University of Technology, China. yangwang@mail.hf.ah.cn
II Faculty of Engineering, the Built Environment and Information Technology, Nelson Mandela Metropolitan University, South Africa. theo.vanniekerk@nmmu.ac.za

 

 


ABSTRACT

This paper discusses an algorithm that incorporates a knowledge-based vision system into an industrial robot system for intelligent part handling. A continuous fuzzy controller was employed to extract boundary information in a computationally efficient way. The developed algorithm for on-line part recognition using fuzzy logic is shown to be an effective means of extracting the geometric features of objects. The proposed edge vector representation method provides sufficient geometric information and facilitates geometric reconstruction of the object for grasp planning. Furthermore, a part-handling model was created by extracting the grasp features from the geometric features.


OPSOMMING

This article describes a knowledge-based vision system algorithm that is incorporated into an industrial robot system to achieve intelligent part handling. A continuous fuzzy controller was used to determine object information by means of a computationally efficient method. The developed algorithm for on-line part recognition uses fuzzy logic and is shown to be an effective method for determining the geometric information of objects. The proposed edge vector method provides sufficient information and makes geometric reconstruction of the object possible for grasp planning. Furthermore, a part-handling model was developed by deriving the grasp features from the geometric properties.


 

 

Full text available only in PDF format
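Since the full text is available only as a PDF, the sketch below is not taken from the paper; it is a minimal Python illustration, under assumed conventions, of the two ideas named in the abstract: grading boundary pixels with a simple fuzzy membership function on gradient magnitude, and condensing an ordered boundary into straight edge-vector segments that a grasp planner could use. The function names, thresholds, and data formats are hypothetical stand-ins for the paper's continuous fuzzy controller and edge vector representation.

# Illustrative sketch only, not the authors' implementation. It assumes a
# grayscale image held in a NumPy array and shows (a) how a simple fuzzy
# membership rule on gradient magnitude could grade "edgeness", and (b) how an
# ordered boundary could be condensed into straight edge-vector segments.
# All function names, thresholds, and data formats here are hypothetical.
import numpy as np

def fuzzy_edge_membership(image, low=10.0, high=60.0):
    """Map gradient magnitude to a [0, 1] edge-membership degree using a
    piecewise-linear (ramp) membership function."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # 0 below `low`, 1 above `high`, linear in between.
    return np.clip((magnitude - low) / (high - low), 0.0, 1.0)

def boundary_to_edge_vectors(points, angle_tol_deg=10.0):
    """Compress an ordered list of (x, y) boundary points into straight
    segments (start, end, length, orientation in degrees); one possible
    reading of an edge-vector representation for grasp planning.
    Angle wraparound at +/-180 degrees is not handled in this sketch."""
    segments = []
    start = points[0]
    prev_angle = None
    for p_prev, p_next in zip(points[:-1], points[1:]):
        d = np.subtract(p_next, p_prev)
        angle = np.degrees(np.arctan2(d[1], d[0]))
        if prev_angle is not None and abs(angle - prev_angle) > angle_tol_deg:
            length = float(np.linalg.norm(np.subtract(p_prev, start)))
            segments.append((start, p_prev, length, prev_angle))
            start = p_prev
        prev_angle = angle
    length = float(np.linalg.norm(np.subtract(points[-1], start)))
    segments.append((start, points[-1], length, prev_angle))
    return segments

if __name__ == "__main__":
    # Synthetic test image: a bright square on a dark background.
    img = np.zeros((64, 64))
    img[20:44, 20:44] = 255.0
    edgeness = fuzzy_edge_membership(img)
    print("pixels with edge membership > 0.5:", int((edgeness > 0.5).sum()))

    # Hand-made rectangular boundary, ordered corner to corner.
    boundary = [(20, 20), (20, 43), (43, 43), (43, 20), (20, 20)]
    for segment in boundary_to_edge_vectors(boundary):
        print(segment)

The trapezoidal ramp used here is only the simplest possible stand-in for a fuzzy edge rule; a full continuous fuzzy controller of the kind the abstract describes would presumably use a rule base and defuzzification rather than a single clipped linear mapping.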

 

 

