Research Repository

All Outputs (30)

Nottingham Robotic Mobility Assistant (NoRMA): An Affordable DIY Robotic Wheelchair Platform
Presentation / Conference Contribution
Brand, L. C., & Kucukyilmaz, A. (2022, August). Nottingham Robotic Mobility Assistant (NoRMA): An Affordable DIY Robotic Wheelchair Platform. Presented at UKRAS22: The 5th UK Robotics and Autonomous Systems Conference, Aberystwyth, UK

A significant portion of the population requires a wheelchair to improve mobility, independence, and dignity, but not all users are able to use a traditional manual one. Powered wheelchairs offer a more effortless experience but still present difficul...

Push-to-See: Learning Non-Prehensile Manipulation to Enhance Instance Segmentation via Deep Q-Learning
Presentation / Conference Contribution
Serhan, B., Pandya, H., Kucukyilmaz, A., & Neumann, G. (2022, May). Push-to-See: Learning Non-Prehensile Manipulation to Enhance Instance Segmentation via Deep Q-Learning. Presented at IEEE International Conference on Robotics and Automation (ICRA 2022), Philadelphia, USA

Efficient robotic manipulation of objects for sorting and searching often relies upon how well the objects are perceived and the available grasp poses. The challenge arises when the objects are irregular, have similar visual features (e.g., textureless...
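As a rough sketch of the idea named in the title, and not the authors' architecture or reward, a deep Q-network could score candidate push actions on a cluttered scene and be rewarded by the resulting improvement in instance-segmentation quality; the observation size, action discretisation, and network below are assumptions for illustration only.

    import torch
    import torch.nn as nn

    NUM_PUSH_ACTIONS = 16  # assumed discretisation of push directions/locations

    class PushQNet(nn.Module):
        """Toy Q-network mapping a flattened depth/height-map observation to one
        Q-value per candidate push action (not the architecture from the paper)."""
        def __init__(self, obs_dim=64 * 64, n_actions=NUM_PUSH_ACTIONS):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(obs_dim, 256), nn.ReLU(),
                nn.Linear(256, n_actions),
            )

        def forward(self, obs):
            return self.net(obs)

    def push_reward(seg_score_before, seg_score_after):
        # Reward a push by how much it improved instance-segmentation quality
        # (e.g. mean mask confidence) on the cluttered scene.
        return seg_score_after - seg_score_before

    # Greedy selection of a push for one observation
    qnet = PushQNet()
    obs = torch.rand(1, 64 * 64)  # stand-in for a flattened scene height map
    action = qnet(obs).argmax(dim=1).item()
    print("selected push action:", action)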

Intent-Aware Predictive Haptic Guidance and its Application to Shared Control Teleoperation
Presentation / Conference Contribution
Ly, K. T., Poozhiyil, M., Pandya, H., Neumann, G., & Kucukyilmaz, A. (2021, August). Intent-Aware Predictive Haptic Guidance and its Application to Shared Control Teleoperation. Presented at 2021 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021, Vancouver, Canada (Online)

This paper presents a haptic shared control paradigm that modulates the level of robotic guidance, based on predictions of human motion intentions. The proposed method incorporates robot trajectories learned from human demonstrations and dynamically...
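As an illustration of the general idea rather than the paper's actual controller, guidance of this kind can be rendered as a virtual spring pulling the operator's tool towards the next waypoint of a predicted trajectory, with its strength scaled by the intent predictor's confidence; the gains and waypoints below are made up for the example.

    import numpy as np

    def guidance_force(tool_pos, predicted_waypoint, intent_confidence,
                       stiffness=50.0, max_force=5.0):
        """Pull the operator's tool towards the next waypoint of a predicted
        trajectory with a virtual spring, scaled by the intent predictor's
        confidence (0 = no guidance, 1 = full guidance). Gains are illustrative."""
        error = np.asarray(predicted_waypoint) - np.asarray(tool_pos)  # metres
        force = stiffness * error                                      # newtons
        force *= np.clip(intent_confidence, 0.0, 1.0)                  # modulate level
        norm = np.linalg.norm(force)
        if norm > max_force:                                           # saturate for safety
            force *= max_force / norm
        return force

    # Low confidence -> weak pull, high confidence -> strong pull
    print(guidance_force([0.10, 0.00, 0.20], [0.12, 0.05, 0.20], 0.2))
    print(guidance_force([0.10, 0.00, 0.20], [0.12, 0.05, 0.20], 0.9))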

The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration
Presentation / Conference Contribution
Issak, I., & Kucukyilmaz, A. (2020, April). The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration. Presented at UKRAS20 Conference: “Robots into the real world”, Lincoln, UK

One of the challenges in collaborative human-robot object transfer is the robot’s ability to infer the interaction state and adapt to it in real time. During joint object transfer, humans communicate about the interaction states through multip...

VR-Fit: Walking-in-Place Locomotion with Real Time Step Detection for VR-Enabled Exercise
Presentation / Conference Contribution
Sari, S., & Kucukyilmaz, A. (2019, August). VR-Fit: Walking-in-Place Locomotion with Real Time Step Detection for VR-Enabled Exercise. Presented at 16th International Conference on Mobile Web and Intelligent Information Systems, Istanbul, Turkey

With recent advances in mobile and wearable technologies, virtual reality (VR) has found many applications in daily use. Today, a mobile device can be converted into a low-cost immersive VR kit thanks to the availability of do-it-yourself viewers in the...
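The paper's step detector is not reproduced here; a minimal sketch of threshold-based step detection on the accelerometer magnitude, with illustrative (not the paper's) threshold and refractory values, could look like this.

    import numpy as np

    def detect_steps(accel, fs=50.0, threshold=1.3, refractory_s=0.3):
        """Count walking-in-place steps from tri-axial accelerometer samples (in g).
        A step is registered on an upward crossing of the magnitude threshold,
        with a refractory window to avoid double-counting bounces."""
        mag = np.linalg.norm(accel, axis=1)       # magnitude of each (x, y, z) sample
        refractory = int(refractory_s * fs)
        step_indices, last = [], -refractory
        for i in range(1, len(mag)):
            crossed = mag[i - 1] < threshold <= mag[i]
            if crossed and (i - last) >= refractory:
                step_indices.append(i)
                last = i
        return step_indices

    # Synthetic signal bouncing at ~2 steps per second for 5 seconds
    t = np.arange(0, 5, 1 / 50.0)
    z = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)
    accel = np.column_stack([np.zeros_like(t), np.zeros_like(t), z])
    print("steps detected:", len(detect_steps(accel)))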

Haptic negotiation and role exchange for collaboration in virtual environments
Presentation / Conference Contribution
Oguz, S. O., Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2010, March). Haptic negotiation and role exchange for collaboration in virtual environments. Presented at 2010 IEEE Haptics Symposium, Waltham, MA, USA

We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates t...

Role allocation through haptics in physical human-robot interaction
Presentation / Conference Contribution
Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2013, April). Role allocation through haptics in physical human-robot interaction. Presented at 2013 21st Signal Processing and Communications Applications Conference (SIU), Haspolat, Turkey

One-shot assistance estimation from expert demonstrations for a shared control wheelchair system
Presentation / Conference Contribution
Kucukyilmaz, A., & Demiris, Y. (2015, August). One-shot assistance estimation from expert demonstrations for a shared control wheelchair system. Presented at 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2015), Kobe, Japan

An emerging research problem in the field of assistive robotics is the design of methodologies that allow robots to provide human-like assistance to the users. Especially within the rehabilitation domain, a grand challenge is to program a robot to mi...

Conveying intentions through haptics in human-computer collaboration
Presentation / Conference Contribution
Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2011, June). Conveying intentions through haptics in human-computer collaboration. Presented at IEEE World Haptics Conference, Istanbul, Turkey

Haptics has been used as a natural way for humans to communicate with computers in collaborative virtual environments. Human-computer collaboration is typically achieved by sharing control of the task between a human and a computer operator. An impor...
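Shared control of this kind is often written as a weighted blend of the human's and the computer's commands; the convex combination below is a generic sketch under that assumption, not necessarily the scheme used in the paper.

    import numpy as np

    def shared_control(u_human, u_computer, alpha):
        """Blend human and computer commands with a dominance factor alpha in [0, 1]:
        alpha = 1 gives the human full authority, alpha = 0 hands control to the
        computer. A generic convex combination, not necessarily the paper's scheme."""
        alpha = float(np.clip(alpha, 0.0, 1.0))
        return alpha * np.asarray(u_human) + (1.0 - alpha) * np.asarray(u_computer)

    # The computer nudges the shared command towards its own plan
    print(shared_control(u_human=[1.0, 0.0], u_computer=[0.0, 1.0], alpha=0.7))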

ScoutWav: Two-Step Fine-Tuning on Self-Supervised Automatic Speech Recognition for Low-Resource Environments
Presentation / Conference Contribution
Fatehi, K., Torres, M. T., & Kucukyilmaz, A. (2022, September). ScoutWav: Two-Step Fine-Tuning on Self-Supervised Automatic Speech Recognition for Low-Resource Environments. Presented at Interspeech 2022, Incheon, Korea

Recent improvements in Automatic Speech Recognition (ASR) systems have produced extraordinary results. However, there are specific domains where training data can be either limited or not representative enough, which are known as Low-Resource Environments (...
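As a loose sketch of a two-step fine-tuning recipe on a self-supervised model, using the Hugging Face wav2vec 2.0 implementation as a stand-in, the outline below freezes the convolutional feature encoder in a first pass and unfreezes it for the low-resource target pass; the checkpoint, data, and schedule are assumptions, not the paper's setup.

    from transformers import Wav2Vec2ForCTC

    # Load a self-supervised pre-trained model with a (randomly initialised) CTC head.
    # "facebook/wav2vec2-base" is an example checkpoint, not necessarily the one used.
    model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base")

    # Step 1: fine-tune on a larger, related corpus while keeping the convolutional
    # feature encoder frozen (parameter names follow the Hugging Face implementation).
    for name, p in model.named_parameters():
        if "feature_extractor" in name:
            p.requires_grad = False
    # ... run CTC training on the intermediate corpus here ...

    # Step 2: unfreeze everything and continue fine-tuning on the small
    # low-resource target set, typically with a lower learning rate.
    for p in model.parameters():
        p.requires_grad = True
    # ... run CTC training on the low-resource data here ...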