Robotic Learning of Haptic Skills From Expert Demonstration for Contact-Rich Manufacturing Tasks

dc.contributor.author Hamdan, Sara
dc.contributor.author Aydın, Yusuf
dc.contributor.author Oztop, Erhan
dc.contributor.author Basdogan, Cagatay
dc.date.accessioned 2024-12-05T18:22:54Z
dc.date.available 2024-12-05T18:22:54Z
dc.date.issued 2024
dc.description.abstract We propose a learning from demonstration (LfD) approach that utilizes an interaction (admittance) controller and two force sensors for the robot to learn the force applied by an expert from demonstrations in contact-rich tasks such as robotic polishing. Our goal is to equip the robot with the haptic expertise of an expert by using a machine learning (ML) approach, while providing the flexibility for the user to intervene in the task at any point when necessary through the interaction controller. The utilization of two force sensors, a pivotal concept in this study, allows us to gather the environmental data needed to train our system to accommodate workpieces with diverse material and surface properties and to maintain the contact of the polisher with their surfaces. In the demonstration phase of our approach, where an expert guides the robot to perform a polishing task, we record the force applied by the human (Fh) and the interaction force (Fint) via two separate force sensors along the polishing trajectory followed by the expert to extract information about the environment (Fenv = Fh - Fint). An admittance controller, which takes the interaction force as its input, outputs a reference velocity to be tracked by the internal motion controller (PID) of the robot to regulate the interactions between the polisher and the surface of a workpiece. A multilayer perceptron (MLP) model is trained to learn the human force profile from the Cartesian position and velocity of the polisher, the environmental force (Fenv), and the friction coefficient between the polisher and the surface. During the deployment phase, in which the robot executes the task autonomously, the human force estimated by our system (F̂h) is utilized to balance the reaction forces coming from the environment, and the force (F̂h - Fenv) that needs to be input to the admittance controller is calculated to generate a reference velocity trajectory for the robot to follow. We designed three use-case scenarios to demonstrate the benefits of the proposed system. The presented use cases highlight the capability of the proposed pHRI system to learn from human expertise and adjust its force based on material and surface variations during automated operations, while still accommodating manual interventions as needed.
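
As an illustration of the control scheme described in the abstract, the sketch below shows one way the deployment-phase loop could be realized: a learned model produces an estimate F̂h of the human force, the difference F̂h - Fenv is fed to a single-axis admittance law, and the resulting reference velocity is handed to the robot's internal motion controller. The paper does not include source code; this Python sketch is a minimal, hypothetical illustration, and the admittance parameters (virtual mass m, damping b), the DummyMLP placeholder, and all function names are assumptions, not the authors' implementation.

# Minimal, hypothetical sketch of the deployment-phase control loop
# described in the abstract; not the authors' implementation.

import numpy as np


def admittance_step(v_ref, f_input, m=2.0, b=20.0, dt=0.001):
    """One step of a 1-DoF admittance law m*dv/dt + b*v = f_input.
    Returns the updated reference velocity tracked by the robot's PID."""
    dv = (f_input - b * v_ref) / m
    return v_ref + dv * dt


class DummyMLP:
    """Placeholder for the trained MLP that maps
    (position, velocity, F_env, friction coefficient) -> estimated human force."""
    def predict(self, features):
        return np.full(len(features), 15.0)  # e.g. a constant 15 N pressing force


def deployment_step(model, x, v, f_h_meas, f_int_meas, mu, v_ref, dt=0.001):
    """One control cycle of the autonomous phase.
    f_h_meas is the human force sensor reading (zero unless the user intervenes);
    f_int_meas is the interaction force sensor reading."""
    f_env = f_h_meas - f_int_meas                      # F_env = F_h - F_int
    f_h_hat = model.predict(np.array([[x, v, f_env, mu]]))[0]
    f_input = f_h_hat - f_env                          # F_h_hat - F_env drives the admittance controller
    return admittance_step(v_ref, f_input, dt=dt)


if __name__ == "__main__":
    model, v_ref = DummyMLP(), 0.0
    for _ in range(5):
        v_ref = deployment_step(model, x=0.10, v=v_ref,
                                f_h_meas=0.0, f_int_meas=12.0,
                                mu=0.3, v_ref=v_ref)
        print(f"reference velocity: {v_ref:.5f} m/s")
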
dc.identifier.doi 10.1109/CASE59546.2024.10711473
dc.identifier.isbn 9798350358513
dc.identifier.isbn 9798350358520
dc.identifier.issn 2161-8070
dc.identifier.scopus 2-s2.0-85208254414
dc.identifier.uri https://hdl.handle.net/20.500.11779/2438
dc.language.iso en
dc.publisher IEEE
dc.relation.ispartof IEEE 20th International Conference on Automation Science and Engineering (CASE) -- AUG 28-SEP 01, 2024 -- Bari, ITALY
dc.relation.ispartofseries IEEE International Conference on Automation Science and Engineering
dc.rights info:eu-repo/semantics/closedAccess
dc.subject Physical Human-Robot Interaction (PHRI)
dc.subject Haptic Skill Transfer
dc.subject Autonomous Polishing
dc.subject Admittance Control
dc.subject Real-Time Interaction
dc.subject Machine Learning (ML)
dc.subject Contact-Rich Tasks
dc.title Robotic Learning of Haptic Skills From Expert Demonstration for Contact-Rich Manufacturing Tasks
dc.type Conference Object
dspace.entity.type Publication
gdc.author.institutional Aydın, Yusuf
gdc.bip.impulseclass C5
gdc.bip.influenceclass C5
gdc.bip.popularityclass C5
gdc.coar.access metadata only access
gdc.coar.type text::conference output
gdc.description.department Faculty of Engineering, Department of Electrical and Electronics Engineering
gdc.description.endpage 2341
gdc.description.publicationcategory Conference Item - International - Institutional Faculty Member
gdc.description.scopusquality Q3
gdc.description.startpage 2334
gdc.description.woscitationindex Conference Proceedings Citation Index - Science
gdc.description.wosquality N/A
gdc.identifier.openalex W4403678691
gdc.identifier.wos WOS:001361783102002
gdc.index.type WoS
gdc.index.type Scopus
gdc.oaire.diamondjournal false
gdc.oaire.impulse 0.0
gdc.oaire.influence 2.5942106E-9
gdc.oaire.isgreen false
gdc.oaire.popularity 2.9478422E-9
gdc.oaire.publicfunded false
gdc.openalex.collaboration National
gdc.openalex.fwci 0.0
gdc.openalex.normalizedpercentile 0.27
gdc.opencitations.count 0
gdc.plumx.mendeley 6
gdc.plumx.scopuscites 0
gdc.publishedmonth September
gdc.scopus.citedcount 0
gdc.virtual.author Aydın, Yusuf
gdc.wos.citedcount 0
gdc.wos.publishedmonth September
gdc.yokperiod YÖK - 2024-25
relation.isAuthorOfPublication b328b4ec-9950-43e2-82b7-f1930e43afe0
relation.isAuthorOfPublication.latestForDiscovery b328b4ec-9950-43e2-82b7-f1930e43afe0
relation.isOrgUnitOfPublication de19334f-6a5b-4f7b-9410-9433c48d1e5a
relation.isOrgUnitOfPublication 0d54cd31-4133-46d5-b5cc-280b2c077ac3
relation.isOrgUnitOfPublication a6e60d5c-b0c7-474a-b49b-284dc710c078
relation.isOrgUnitOfPublication.latestForDiscovery de19334f-6a5b-4f7b-9410-9433c48d1e5a

Files

Original bundle

Name: 099.pdf
Size: 2.88 MB
Format: Adobe Portable Document Format