Robots and the Sense of Touch
Roberto Calandra
Facebook AI Research
Re-Work Deep Reinforcement Learning Summit - 21 June 2019
State-of-the-art in Robotics
From YouTube: https://www.youtube.com/watch?v=g0TaYhjpOfo
Learning to Interact with the World
- Interactions with the world happen through forces (Newton's laws of motion)
- Current robots are almost "blind" w.r.t. forces
- Can we use tactile sensors to measure contact forces (as humans do)?
- As humans, we create complex models of the world
- These models are useful to predict the effects of our actions
- Can robots learn similar models from complex multi-modal inputs?
- Can these learned models be used for control?
The Importance of Touch (in Robots)
The Importance of Touch (in Humans)
From the lab of Dr. Roland Johansson, Dept. of Physiology, Umeå University, Sweden
Challenges and Goal
- Opportunity: Tactile sensors make it possible to accurately measure contact forces, enabling real feedback control during manipulation.

- Goal: Improve current robot manipulation capabilities by integrating raw tactile sensors (through machine learning).
- Problem: Integrating tactile sensors in the control scheme is challenging, and requires a lot of engineering and expertise.
Previous Literature
[Allen et al. 1999]
[Chebotar et al. 2016]
[Bekiroglu et al. 2011]
[Sommer and Billard 2016]
[Schill et al. 2012]

GelSight Sensor
[Yuan, W.; Dong, S. & Adelson, E. H. GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force. Sensors, 2017]
- Optical tactile sensor
- Highly sensitive
- High resolution (1280x960 @ 30 Hz)
- Grid structure
Examples of GelSight Measurements

Self-supervised Data Collection
- 7-DOF Sawyer arm, Weiss WSG-50 Parallel gripper, and one GelSight on each finger
- Two RGB-D cameras, one in front of and one above the workspace
- (Almost) fully autonomous data collection:
- Estimate the object position using depth, and perform a random grasp of the object.
- Labels are generated automatically by checking for the presence of contacts after the attempted lift.
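The collection procedure above can be sketched as a simple loop. This is a minimal illustration, not the actual pipeline: every callback is a hypothetical stand-in for the real robot interface.

```python
def collect_grasp_samples(n_grasps, estimate_object_position,
                          execute_random_grasp, attempt_lift, contact_detected):
    """Self-supervised collection loop (sketch; all callbacks are
    hypothetical stand-ins for the real robot interface).

    A grasp is labeled successful when the tactile sensors still report
    contact after the attempted lift, so no human labeling is needed.
    """
    dataset = []
    for _ in range(n_grasps):
        position = estimate_object_position()          # from the depth camera
        observation = execute_random_grasp(position)   # RGB + GelSight images
        attempt_lift()
        success = contact_detected()                   # touch persists => success
        dataset.append((observation, success))
    return dataset
```

The key design choice is that the success signal comes from the tactile sensors themselves, which is what makes the collection (almost) fully autonomous.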
Examples of Training Objects
Collected 6,450 grasps of over 60 training objects in ~2 weeks.

Visuo-tactile Learned Model

Grasp Success on Unseen Objects
83.8% grasp success on 22 unseen objects
(using vision alone yields a 56.6% success rate)
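The idea of the visuo-tactile model can be illustrated with a toy late-fusion read-out. This is only a sketch of the fusion principle, not the paper's actual network: in the real system both feature extractors are deep convolutional networks trained end-to-end on the self-supervised grasp labels.

```python
import numpy as np

def grasp_success_probability(vision_feat, touch_feat, w, b=0.0):
    """Toy late-fusion read-out (illustrative, not the paper's network):
    concatenate per-modality feature vectors and apply a logistic
    read-out to get a grasp-success probability."""
    fused = np.concatenate([vision_feat, touch_feat])  # joint visuo-tactile features
    return 1.0 / (1.0 + np.exp(-(w @ fused + b)))      # sigmoid => probability
```

The reported numbers (83.8% with vision+touch vs. 56.6% with vision only) are exactly the kind of gap such a fused model is meant to close: touch carries contact information that vision cannot see.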
Gentle Grasping
- Since our model considers forces, we can select grasps that are effective but gentle
- We can halve the amount of force used with no significant loss in grasp success
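One way to realize this trade-off is to score candidate grasps with the learned model and pick the lowest-force grasp that is still predicted to succeed. A minimal sketch, where the candidate format and the confidence threshold are illustrative assumptions:

```python
def select_gentle_grasp(candidates, success_threshold=0.9):
    """Pick the gentlest grasp that is still predicted to work (sketch;
    the tuple format and threshold are illustrative assumptions).

    candidates: list of (grasp, predicted_success, gripping_force)
    tuples produced by scoring sampled grasps with the learned model.
    """
    viable = [c for c in candidates if c[1] >= success_threshold]
    if not viable:  # nothing confident enough: fall back to the most likely grasp
        return max(candidates, key=lambda c: c[1])[0]
    return min(viable, key=lambda c: c[2])[0]  # lowest force among viable grasps
```

Because the model predicts success as a function of the applied force, the threshold lets us trade gripping force against confidence, which is how the force can be halved without significantly hurting the success rate.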


Lessons Learned and Future Directions
- Engineering is super-important!
- No post-lift-off control
  - e.g., slip prediction, dynamics during the loading phase
- Parallel gripper is very limiting
  - multi-finger hands
- Single object
  - cluttered environments
- Only consider macro-actions
  - moving towards continuous control
Tian, S.; Ebert, F.; Jayaraman, D.; Mudigonda, M.; Finn, C.; Calandra, R. & Levine, S.
Manipulation by Feel: Touch-Based Control with Deep Predictive Models
IEEE International Conference on Robotics and Automation (ICRA), 2019
Human Collaborators
Calandra, R.; Owens, A.; Jayaraman, D.; Yuan, W.; Lin, J.; Malik, J.; Adelson, E. H. & Levine, S.
More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch
IEEE Robotics and Automation Letters (RA-L), 2018, 3, 3300-3307

Andrew Owens
Dinesh Jayaraman
Wenzhen Yuan
Justin Lin
Jitendra Malik
Sergey Levine
Edward H. Adelson
Summary
- Touch is a key sensor modality for interacting with the world
- Integrating high-resolution tactile sensors in traditional robot controllers is challenging
- Deep Reinforcement Learning can help!
- We presented a system that learns through self-supervised learning to grasp and re-grasp
- This system generalizes to new objects with a high success rate

Vision:
Towards active feedback control from multi-modal sensing
Thank you for your attention
References
- Calandra, R.; Owens, A.; Jayaraman, D.; Yuan, W.; Lin, J.; Malik, J.; Adelson, E. H. & Levine, S. More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch. IEEE Robotics and Automation Letters (RA-L), 2018, 3, 3300-3307
- Allen, P. K.; Miller, A. T.; Oh, P. Y. & Leibowitz, B. S. Integration of vision, force and tactile sensing for grasping. Int. J. Intelligent Machines, 1999, 4, 129-149
- Chebotar, Y.; Hausman, K.; Su, Z.; Sukhatme, G. S. & Schaal, S. Self-supervised regrasping using spatio-temporal tactile features and reinforcement learning. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016
- Schill, J.; Laaksonen, J.; Przybylski, M.; Kyrki, V.; Asfour, T. & Dillmann, R. Learning continuous grasp stability for a humanoid robot hand based on tactile sensing. BioRob, 2012
- Bekiroglu, Y.; Laaksonen, J.; Jorgensen, J. A.; Kyrki, V. & Kragic, D. Assessing grasp stability based on learning and haptic data. IEEE Transactions on Robotics, 2011, 27
- Sommer, N. & Billard, A. Multi-contact haptic exploration and grasping with tactile sensors. Robotics and Autonomous Systems, 2016, 85, 48-61
Failure Case
Understanding the Learned Model
Additional Slide: Tactile MPC
Tian, S.; Ebert, F.; Jayaraman, D.; Mudigonda, M.; Finn, C.; Calandra, R. & Levine, S.
Manipulation by Feel: Touch-Based Control with Deep Predictive Models
IEEE International Conference on Robotics and Automation (ICRA), 2019
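The tactile MPC idea can be sketched as sampling-based planning (random shooting) over a learned tactile dynamics model. This is only an illustration of the general scheme, not the paper's exact planner; `dynamics_model` and `cost` are hypothetical stand-ins for the learned predictive model and a distance to the desired tactile reading.

```python
import numpy as np

def tactile_mpc_action(state, dynamics_model, cost,
                       n_samples=64, horizon=5, seed=0):
    """One planning step of random-shooting MPC (sketch; `dynamics_model`
    and `cost` are hypothetical stand-ins for the learned predictive
    model and the tactile goal cost). Sample random action sequences,
    roll them out through the model, and keep the cheapest one."""
    rng = np.random.default_rng(seed)
    best_cost, best_action = np.inf, None
    for _ in range(n_samples):
        actions = rng.uniform(-1.0, 1.0, size=(horizon,))  # 1-D actions for brevity
        s, total = state, 0.0
        for a in actions:                 # roll the sequence out through the model
            s = dynamics_model(s, a)
            total += cost(s)
        if total < best_cost:
            best_cost, best_action = total, actions[0]
    return best_action                    # execute the first action, then replan
```

Replanning at every step from the latest tactile observation is what turns the learned predictive model into closed-loop, continuous control rather than open-loop macro-actions.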
Additional Slide: Full Grasping Results

Robots and the Sense of Touch
Humans make extensive use of touch. However, integrating the sense of touch in robot control has traditionally proved to be a difficult task. In this talk, I will discuss how machine learning can help to provide robots with the sense of touch, and the benefits of doing so.