A Low Cost Teleoperation Framework Using Vision-based Tactile Sensors for Rigid and Soft Object Manipulation

Martina Lippi*, Michael C. Welle*, Maciej K. Wozniak, Andrea Gasparri, Danica Kragic

Haptic feedback is essential for humans to successfully perform complex and delicate manipulation tasks. The recent rise of tactile sensors has enabled robots to leverage the sense of touch and drastically expand their capabilities. However, many tasks still require human intervention or guidance. For this reason, we present a teleoperation framework designed to provide haptic feedback to human operators based on the data from camera-based tactile sensors mounted on the robot gripper. Partial autonomy is introduced to prevent slippage of grasped objects during task execution. Notably, we rely exclusively on low-cost, off-the-shelf hardware to realize an affordable solution. We demonstrate the versatility of the framework on nine objects, ranging from rigid to soft and fragile ones, with three different operators on real hardware.

*Contributed equally

Method
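The framework couples the operator's motion commands with haptic feedback computed from the camera-based tactile images, and adds a slip-triggered grip adjustment as partial autonomy. The sketch below only illustrates that idea under simple assumptions (an image-difference contact estimate, a linear vibration mapping, and a threshold slip cue); the function names, thresholds, and gain are placeholders and do not reflect the released implementation.

```python
import numpy as np

def contact_intensity(frame, reference, threshold=15):
    """Fraction of pixels deviating from the no-contact reference image.

    A simple proxy for contact area on a camera-based tactile sensor;
    the actual framework may use a different measure.
    """
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return float((diff.max(axis=-1) > threshold).mean())

def haptic_amplitude(contact, gain=1.0):
    """Map contact intensity to a controller vibration amplitude in [0, 1]."""
    return float(np.clip(gain * contact, 0.0, 1.0))

def slip_cue(prev_frame, frame, threshold=15, slip_fraction=0.05):
    """Crude slip cue: a large frame-to-frame change in the tactile image
    suggests the grasped object is moving and the grip should tighten."""
    change = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((change.max(axis=-1) > threshold).mean()) > slip_fraction

if __name__ == "__main__":
    # Synthetic sanity check with a fake 240x320 tactile image.
    ref = np.zeros((240, 320, 3), dtype=np.uint8)   # no-contact reference
    touched = ref.copy()
    touched[100:140, 150:200] = 60                  # synthetic indentation
    c = contact_intensity(touched, ref)
    print(f"contact={c:.3f}, amplitude={haptic_amplitude(c, gain=5.0):.3f}")
```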

Our method in action. For each of the nine test objects, execution videos are shown together with views of the object and the corresponding tactile data:

  • Plastic gel bottle
  • Plastic cup
  • AUX connector
  • Tetra Pak box
  • Pistachio nut
  • Lime
  • Plum
  • Table-grape berry
  • Tomato

Instructions for setup

Instructions for setting up the teleoperation framework with DIGIT tactile sensors and an Oculus Quest 2 controller are provided below:

Setup instructions
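As a quick sanity check that the tactile sensors are reachable before launching the full framework, the sketch below streams a frame from a DIGIT sensor with the digit-interface Python package (pip install digit-interface). The serial number "D20001" and the sensor name are placeholders; use the serial printed on your device or the one returned by DigitHandler.list_digits(). The exact steps may differ from the linked setup instructions.

```python
# Minimal DIGIT connectivity check using the digit-interface package.
from digit_interface import Digit, DigitHandler

# Enumerate the DIGIT sensors currently connected over USB.
print(DigitHandler.list_digits())

# "D20001" is a placeholder serial number.
digit = Digit("D20001", "Left gripper finger")
digit.connect()
digit.set_intensity(Digit.LIGHTING_MAX)  # turn the internal LEDs fully on

frame = digit.get_frame()                # single RGB tactile image (numpy array)
print(frame.shape)

digit.disconnect()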

Code

Code available at the following repository:

Contact

  • Martina Lippi; martina.lippi(at)uniroma3.it; Roma Tre University, Italy
  • Michael C. Welle; mwelle(at)kth.se; KTH Royal Institute of Technology, Sweden
  • Maciej K. Wozniak; maciejw(at)kth.se; KTH Royal Institute of Technology, Sweden
  • Andrea Gasparri; andrea.gasparri(at)uniroma3.it; Roma Tre University, Italy
  • Danica Kragic; dani(at)kth.se; KTH Royal Institute of Technology, Sweden