Research Repository

Dr AYSE KUCUKYILMAZ's Outputs (37)

The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration (2020)
Presentation / Conference Contribution
Issak, I., & Kucukyilmaz, A. (2020, April). The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration. Presented at UKRAS20 Conference: “Robots into the real world”, Lincoln, UK

One of the challenges in collaborative human-robot object transfer is the robot’s ability to infer about the interaction state and adapt to it in real time. During joint object transfer humans communicate about the interaction states through multip...

Haptic-Guided Teleoperation of a 7-DoF Collaborative Robot Arm with an Identical Twin Master (2020)
Journal Article
Singh, J., Srinivasan, A. R., Neumann, G., & Kucukyilmaz, A. (2020). Haptic-Guided Teleoperation of a 7-DoF Collaborative Robot Arm with an Identical Twin Master. IEEE Transactions on Haptics, 13(1), 246-252. https://doi.org/10.1109/toh.2020.2971485

In this study, we describe two techniques to enable haptic-guided teleoperation using 7-DoF cobot arms as master and slave devices. A shortcoming of using cobots as master-slave systems is the lack of force feedback at the master side. However, recen...

VR-Fit: Walking-in-Place Locomotion with Real Time Step Detection for VR-Enabled Exercise (2019)
Presentation / Conference Contribution
Sari, S., & Kucukyilmaz, A. (2019, August). VR-Fit: Walking-in-Place Locomotion with Real Time Step Detection for VR-Enabled Exercise. Presented at 16th International Conference on Mobile Web and Intelligent Information Systems, Istanbul, Turkey

With recent advances in mobile and wearable technologies, virtual reality (VR) found many applications in daily use. Today, a mobile device can be converted into a low-cost immersive VR kit thanks to the availability of do-it-yourself viewers in the...

Learning Shared Control by Demonstration for Personalized Wheelchair Assistance (2018)
Journal Article
Kucukyilmaz, A., & Demiris, Y. (2018). Learning Shared Control by Demonstration for Personalized Wheelchair Assistance. IEEE Transactions on Haptics, 11(3), 431-442. https://doi.org/10.1109/TOH.2018.2804911

An emerging research problem in assistive robotics is the design of methodologies that allow robots to provide personalized assistance to users. For this purpose, we present a method to learn shared control policies from demonstrations offered by a h...

One-shot assistance estimation from expert demonstrations for a shared control wheelchair system (2015)
Presentation / Conference Contribution
Kucukyilmaz, A., & Demiris, Y. (2015, August). One-shot assistance estimation from expert demonstrations for a shared control wheelchair system. Presented at 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2015), Kobe, Japan

An emerging research problem in the field of assistive robotics is the design of methodologies that allow robots to provide human-like assistance to the users. Especially within the rehabilitation domain, a grand challenge is to program a robot to mi...

Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation (2014)
Journal Article
Madan, C. E., Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2015). Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation. IEEE Transactions on Haptics, 8(1), 54-66. https://doi.org/10.1109/toh.2014.2384049

The development of robots that can physically cooperate with humans has attracted interest in the last decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused o...

Haptic Role Allocation and Intention Negotiation in Human-Robot Collaboration (2013)
Thesis
Kucukyilmaz, A. (2013). Haptic Role Allocation and Intention Negotiation in Human-Robot Collaboration. (Thesis). https://nottingham-repository.worktribe.com/output/4040519

This dissertation aims to present a perspective to build more natural shared control systems for physical human-robot cooperation. As the tasks become more complex and more dynamic, many shared control schemes fail to meet the expectation of an effor...

Role allocation through haptics in physical human-robot interaction (2013)
Presentation / Conference Contribution
Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2013, April). Role allocation through haptics in physical human-robot interaction. Presented at 2013 21st Signal Processing and Communications Applications Conference (SIU), Haspolat, Turkey

The role of roles: Physical cooperation between humans and robots (2012)
Journal Article
Mörtl, A., Lawitzky, M., Kucukyilmaz, A., Sezgin, M., Basdogan, C., & Hirche, S. (2012). The role of roles: Physical cooperation between humans and robots. International Journal of Robotics Research, 31(13), 1656-1674. https://doi.org/10.1177/0278364912455366

Since the strict separation of working spaces of humans and robots has experienced a softening due to recent robotics research achievements, close interaction of humans and robots comes rapidly into reach. In this context, physical human–robot intera...

Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces (2012)
Journal Article
Oguz, S. O., Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2012). Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces. IEEE Transactions on Haptics, 5(3), 274-284. https://doi.org/10.1109/toh.2012.37

An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this has been...

Conveying intentions through haptics in human-computer collaboration (2011)
Presentation / Conference Contribution
Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2011, June). Conveying intentions through haptics in human-computer collaboration. Presented at IEEE World Haptics Conference, Istanbul, Turkey

Haptics has been used as a natural way for humans to communicate with computers in collaborative virtual environments. Human-computer collaboration is typically achieved by sharing control of the task between a human and a computer operator. An impor...

Haptic negotiation and role exchange for collaboration in virtual environments (2010)
Presentation / Conference Contribution
Oguz, S. O., Kucukyilmaz, A., Sezgin, T. M., & Basdogan, C. (2010, March). Haptic negotiation and role exchange for collaboration in virtual environments. Presented at 2010 IEEE Haptics Symposium, Waltham, MA, USA

We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates t...

Modelling and Animation of Brittle Fracture in Three Dimensions (2007)
Thesis
Kucukyilmaz, A. (2007). Modelling and Animation of Brittle Fracture in Three Dimensions. (Thesis). https://nottingham-repository.worktribe.com/output/4040521

This thesis describes a system for simulating fracture in brittle objects. The system combines rigid body simulation methods with a constraint-based model to animate fracturing of arbitrary polyhedral shaped objects under impact. The objects are repr...

An Animation System for Fracturing of Rigid Objects (2005)
Book Chapter
Kucukyilmaz, A., & Ozguc, B. (2005). An Animation System for Fracturing of Rigid Objects. In Computer and Information Sciences - ISCIS 2005: 20th International Symposium, Istanbul, Turkey, October 26-28, 2005. Proceedings (688-697). Springer Berlin Heidelberg. https://doi.org/10.1007/11569596_71

This paper describes a system for the animation of fracturing brittle objects. The system combines rigid body simulation methods with a constraint-based model to animate fracturing of arbitrary polyhedral shaped objects under impact. The objects are...