Dr Ayse Kucukyilmaz (ayse.kucukyilmaz@nottingham.ac.uk)
Associate Professor
An emerging research problem in the field of assistive robotics is the design of methodologies that allow robots to provide human-like assistance to users. Within the rehabilitation domain in particular, a grand challenge is to program a robot to mimic the operation of an occupational therapist, intervening with the user when necessary so as to improve the therapeutic power of the assistive robotic system. We propose a method to estimate assistance policies from expert demonstrations, in order to present human-like intervention during navigation in a powered wheelchair setup. For this purpose, we constructed a setting in which a human offers assistance to the user over a haptic shared control system. The robot learns from human assistance demonstrations while the user is actively driving the wheelchair in an unconstrained environment. We train a Gaussian process regression model to learn assistance commands given the past and current actions of the user and the state of the environment. The results indicate that the model can estimate human assistance after only a single demonstration, i.e. in one shot, so that the robot can help the user by selecting the appropriate assistance in a human-like fashion.
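The core idea in the abstract, learning a mapping from (past and current user actions, environment state) to an expert's assistance command with Gaussian process regression, can be illustrated with a minimal sketch. This is not the authors' implementation: the feature layout, the synthetic demonstration data, and the scikit-learn kernel choice are all assumptions made purely for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic single "demonstration": each row is a hypothetical feature vector
# [previous user action, current user action, obstacle distance], and the
# target is the expert's haptic assistance command at that instant.
X_demo = rng.uniform(-1.0, 1.0, size=(50, 3))
y_demo = 0.5 * X_demo[:, 1] - 0.3 * X_demo[:, 2] + 0.01 * rng.standard_normal(50)

# RBF kernel for smooth policies plus a small white-noise term for jitter
# in the expert's demonstrated commands.
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-4),
    normalize_y=True,
)
gpr.fit(X_demo, y_demo)

# At run time, query the learned policy for an assistance command; the
# predictive standard deviation indicates how confident the model is.
x_query = np.array([[0.2, 0.4, 0.8]])
assist_mean, assist_std = gpr.predict(x_query, return_std=True)
```

Because GPR is non-parametric and interpolates the demonstration directly, it can produce sensible predictions from a single recorded session, which is consistent with the one-shot claim in the abstract.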
Kucukyilmaz, A., & Demiris, Y. (2015, August). One-shot assistance estimation from expert demonstrations for a shared control wheelchair system. Presented at 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2015), Kobe, Japan
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2015) |
| Start Date | Aug 31, 2015 |
| End Date | Sep 4, 2015 |
| Acceptance Date | Apr 28, 2015 |
| Online Publication Date | Nov 23, 2015 |
| Publication Date | Aug 31, 2015 |
| Deposit Date | Feb 26, 2020 |
| Publicly Available Date | Feb 18, 2021 |
| DOI | https://doi.org/10.1109/roman.2015.7333600 |
| Public URL | https://nottingham-repository.worktribe.com/output/4040452 |
| Publisher URL | https://ieeexplore.ieee.org/document/7333600 |
| Additional Information | © 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
Full text: 2015-Kucukyilmaz2015OneShot (PDF, 3.6 MB)
Related outputs:

- LABERT: A Combination of Local Aggregation and Self-Supervised Speech Representation Learning for Detecting Informative Hidden Units in Low-Resource ASR Systems (2023), Presentation / Conference Contribution
- Somabotics Toolkit for Rapid Prototyping Human-Robot Interaction Experiences using Wearable Haptics (2023), Presentation / Conference Contribution
- In-the-Wild Failures in a Long-Term HRI Deployment (2023), Presentation / Conference Contribution
- ScoutWav: Two-Step Fine-Tuning on Self-Supervised Automatic Speech Recognition for Low-Resource Environments (2022), Presentation / Conference Contribution