Show simple item record

dc.contributor.author  Neves Zenha, R  en_US
dc.date.accessioned  2024-08-05T14:35:43Z
dc.identifier.uri  https://qmro.qmul.ac.uk/xmlui/handle/123456789/98628
dc.description.abstract  There is a growing interest in leveraging tactile sensing and data-driven methods to generate more robust robot grasps for effective autonomous tasks (e.g. object pick-and-place). In this context, slip detection is a fundamental skill: it detects when a grasped object moves within the gripper so that corrective actions can be applied to hold the object static. Robotic slip detection requires two components: a sensor that captures signals related to the physical interaction between a gripper and an object, and a model that identifies whether such data corresponds to a slip event. This thesis uses a novel magnetic-based tactile sensor, the uSkin sensor, whose measurements of distributed normal and shear forces are leveraged to develop slip-detection methods for real-time robot-object manipulation. In the context of autonomous robotic grasping, current slip-detection methods lack the generalisation capabilities to cope with a generic set of tactile interactions. Effective slip classifiers have been proposed in the literature, combining different mechanical sensing principles (e.g. vibration, force, contact-region sensing) with data-driven learning strategies (e.g. machine learning and deep learning methods). However, due to the practical constraints of collecting slip-related tactile data, current solutions struggle to cope with the large variability of gripper-object interactions in typical autonomous grasping systems (e.g. different grasp poses, slip intensities). Previous work simplifies the data collection process but critically limits the variety of interactions for which data is collected (e.g. considering at most six grasp poses). This thesis investigates the applicability, to autonomous grasping systems, of several data-driven slip-detection models trained with newly proposed data collection processes that are both efficient and representative of the variability expected in autonomous robotic grasping.
First, a vision-based autonomous data collection protocol is proposed. Classical machine learning models are trained on the collected data: first with data from one or more objects (achieving up to 61.5% F1-score in generalisation), and later within a multi-stage object detection and slip detection pipeline that detects slips during autonomous grasps (up to 52% F1-score in generalisation). Then, building on the key challenges observed with these approaches, a controlled, object- and sensor-independent tactile data collection protocol is proposed to gather tactile data representative of realistic variability. Experimentally, this protocol is shown to be faster, more efficient, and more reproducible than those proposed in the literature. Models trained on such data also generalise better to unseen conditions, enabling robust robotic pick-and-place in real-world settings.  en_US
dc.language.iso  en  en_US
dc.title  Robust Tactile Slip Detection, Using the uSkin Sensor Applied to Autonomous Robotic Grasping  en_US
pubs.notes  Not known  en_US
rioxxterms.funder  Default funder  en_US
rioxxterms.identifier.project  Default project  en_US
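The abstract frames slip detection as a binary classification problem over tactile force signals, evaluated with the F1-score. As a purely illustrative sketch (not the thesis's actual model or data), the idea can be shown with a toy threshold-based detector that flags a slip when the frame-to-frame change in a hypothetical shear-force reading exceeds a threshold, scored against hand-made ground-truth labels:

```python
# Illustrative sketch only: a toy slip detector and F1-score computation.
# The force trace, threshold, and labels below are all invented for
# demonstration; the thesis uses learned models on real uSkin sensor data.

def detect_slips(shear_forces, threshold=0.5):
    """Label each frame 1 (slip) if the shear force changes sharply."""
    labels = [0]  # first frame has no previous reading to compare against
    for prev, curr in zip(shear_forces, shear_forces[1:]):
        labels.append(1 if abs(curr - prev) > threshold else 0)
    return labels

def f1_score(y_true, y_pred):
    """Binary F1 = 2*TP / (2*TP + FP + FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

# Hypothetical shear-force trace: stable grasp, then a sudden slip event.
forces = [0.10, 0.12, 0.11, 0.95, 1.80, 1.85]
true_labels = [0, 0, 0, 1, 1, 1]   # invented ground truth
pred = detect_slips(forces)
print(pred)                                    # [0, 0, 0, 1, 1, 0]
print(round(f1_score(true_labels, pred), 2))   # 0.8
```

A fixed threshold like this is exactly what the abstract argues against: it does not generalise across grasp poses or slip intensities, which is why the thesis trains data-driven classifiers on protocols designed to capture that variability.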


This item appears in the following Collection(s)

  • Theses [4248]
    Theses Awarded by Queen Mary University of London