Research Repository

Outputs (3)

Somabotics Toolkit for Rapid Prototyping Human-Robot Interaction Experiences using Wearable Haptics (2023)
Presentation / Conference Contribution
Zhou, F., Price, D., Pacchierotti, C., & Kucukyilmaz, A. (2023, July). Somabotics Toolkit for Rapid Prototyping Human-Robot Interaction Experiences using Wearable Haptics. Poster presented at IEEE World Haptics Conference, Delft, Netherlands

This work-in-progress paper presents a prototyping toolkit developed to design haptic interaction experiences. With developments in wearable and sensor technologies, new opportunities arise every day to create rich haptic interaction experiences actin...

Push-to-See: Learning Non-Prehensile Manipulation to Enhance Instance Segmentation via Deep Q-Learning (2022)
Presentation / Conference Contribution
Serhan, B., Pandya, H., Kucukyilmaz, A., & Neumann, G. (2022, May). Push-to-See: Learning Non-Prehensile Manipulation to Enhance Instance Segmentation via Deep Q-Learning. Presented at IEEE International Conference on Robotics and Automation (ICRA 2022), Philadelphia, USA

Efficient robotic manipulation of objects for sorting and searching often relies upon how well the objects are perceived and the available grasp poses. The challenge arises when the objects are irregular, have similar visual features (e.g., textureless...

Intelligent control of exoskeletons through a novel learning-from-demonstration method (2020)
Presentation / Conference Contribution
Ugur, E., Samur, E., Ugurlu, B., Erol Barkana, D., Kucukyilmaz, A., & Bebek, O. (2020, September). Intelligent control of exoskeletons through a novel learning-from-demonstration method. Poster presented at Cybathlon Symposium 2020, Zurich, Switzerland

We present a novel concept that enables the intelligent and adaptive control of exoskeletons by exploiting our state-of-the-art learning from demonstration (LfD) method, namely Conditional Neural Movement Primitives (CNMPs) [1], on our integrate...