
The integration of visual and tactile sensing for the definition of regions within a robot workcell

dc.contributor.author: De Meter, Edward Christopher
dc.contributor.department: Industrial Engineering and Operations Research
dc.date.accessioned: 2021-10-26T20:10:06Z
dc.date.available: 2021-10-26T20:10:06Z
dc.date.issued: 1986
dc.description.abstract: Vision systems are widely used in robot workcells for sensory feedback. The resolution of a vision system is usually good enough to locate an object so that it can be grasped, but not good enough to accurately locate an insertion hole. Tactile probes are used to accurately locate objects; however, they require a database containing the approximate location of an object in order to be used effectively. This thesis presents the development of a robot workcell that uses a vision system and a tactile probe to identify, locate, and orient two types of circuit board fixtures. The vision system approximately locates the corner points of each fixture in the robot workcell. The tactile system uses the database created by the vision system to conduct a tactile search for each fixture and to accurately define the coordinates of each corner point. After a fixture is accurately located, a region (sub-coordinate system) is defined about the fixture. The location of each insertion hole within a fixture is defined relative to the region, and the robot subsequently inserts the tactile probe into each hole. The vision system developed can define any two-dimensional object and can locate the corner points of any straight-edged object whose adjacent sides have an included angle greater than 90 degrees. The tactile system is self-calibrating and has a repeatability of 0.009 inches. A probe insertion error analysis was conducted on the system; the average probe insertion error was determined to be 0.0337 inches. In addition, it was determined that probe insertion error increases with the distance between a hole and the origin of its defining region, and that the major source of probe insertion error is the robot language's (AML/E Version 4.0) inability to accurately define points within a region. (An illustrative sketch of the region definition described here follows the metadata listing below.)
dc.description.degree: M.S.
dc.format.extent: 2 volumes (xi, 385 pages)
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/10919/106091
dc.language.iso: en
dc.publisher: Virginia Polytechnic Institute and State University
dc.relation.isformatof: OCLC# 15167199
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject.lcc: LD5655.V855 1986.D459
dc.subject.lcsh: Robots, Industrial
dc.subject.lcsh: Vision -- Research
dc.title: The integration of visual and tactile sensing for the definition of regions within a robot workcell
dc.type: Thesis
dc.type.dcmitype: Text
thesis.degree.discipline: Industrial Engineering and Operations Research
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: M.S.
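
The abstract describes defining a region, i.e. a sub-coordinate system, about each fixture from its tactilely refined corner points, and locating each insertion hole relative to that region. The thesis implements this in AML/E Version 4.0; the Python sketch below is only a hypothetical illustration of the underlying geometry, assuming a region can be modeled as a planar frame whose origin is one corner and whose x-axis points toward an adjacent corner. The function names and example coordinates are invented for illustration and are not taken from the thesis.

import math

def define_region(corner_a, corner_b):
    """Define a 2-D sub-coordinate system (region) from two fixture corner
    points measured in workcell coordinates: corner_a becomes the region
    origin, and the direction from corner_a to corner_b becomes the region
    x-axis."""
    ax, ay = corner_a
    bx, by = corner_b
    theta = math.atan2(by - ay, bx - ax)  # orientation of the region frame in the workcell frame
    return (ax, ay, theta)

def region_to_workcell(region, hole):
    """Map a hole location expressed in region coordinates into workcell coordinates."""
    ox, oy, theta = region
    hx, hy = hole
    c, s = math.cos(theta), math.sin(theta)
    # Rotate the region-frame offset into the workcell frame, then translate by the region origin.
    return (ox + c * hx - s * hy,
            oy + s * hx + c * hy)

# Hypothetical example: a fixture whose refined corners lie at (10.0, 5.0) and
# (14.0, 5.0) inches, with an insertion hole 1.5 in along the fixture edge and
# 0.75 in perpendicular to it (region coordinates).
region = define_region((10.0, 5.0), (14.0, 5.0))
print(region_to_workcell(region, (1.5, 0.75)))  # -> (11.5, 5.75)

Under this simple model, a small error in the region's measured orientation displaces a hole by roughly r times that angular error, where r is the hole's distance from the region origin, which is consistent with the abstract's observation that probe insertion error grows with the distance between a hole and the origin of its defining region.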

Files

Original bundle
Name: LD5655.V855_1986.D459.pdf
Size: 14.8 MB
Format: Adobe Portable Document Format
