Digitizing Touch
and its Importance in Robotics

Roberto Calandra

IROS 2025 - 22 August 2025

LASR Lab

The Importance of Touch

From the lab of Dr. Roland Johansson, Dept. of Physiology, Umeå University, Sweden

The Importance of Touch (in Humans)

  • Rodney Brooks - Why Today’s Humanoids Won’t Learn Dexterity
    https://rodneybrooks.com/why-todays-humanoids-wont-learn-dexterity/
    26 September 2025

  • [...] "humanoid robots will need a sense of touch, and a level of touch sensing that no one has yet built in the lab in order for them to do tasks like the one above which is of the same order of difficulty that millions of workers do all day everyday" [...]

  • [Bold by Roberto Calandra]

The Importance of Touch (in Robots)

The Next Breakthrough will be in Touch

  • Audio (~1890)
  • Vision (~1990)
  • Touch (~2025)

The Grand Vision of Digitizing Touch

+ Applications

+ Community

Hardware

Vision-based Tactile Sensors (VBTS)

[Hillis, W. D. A High-Resolution Imaging Touch Sensor The International Journal of Robotics Research, 1982, 1, 33-44 ]

[Tanie, K.; Komoriya, K.; Kaneko, M.; Tachi, S. & Fujikawa, A. A high resolution tactile sensor Proc. of 4th Int. Conf. on Robot Vision and Sensory Controls, 1984, 251-260]

[Begej, S. Planar and finger-shaped optical tactile sensors for robotic applications IEEE Journal on Robotics and Automation, 1988, 4, 472-484]
[Kamiyama, K.; Kajimoto, H.; Kawakami, N. & Tachi, S. Evaluation of a vision-based tactile sensor IEEE International Conference on Robotics and Automation (ICRA), 2004, 2, 1542-1547 ]

[Johnson, M. K. & Adelson, E. H. Retrographic sensing for the measurement of surface texture and shape Computer Vision and Pattern Recognition (CVPR), 2009, 1070-1077]

[Abad, A. C. & Ranasinghe, A. Visuotactile Sensors With Emphasis on GelSight Sensor: A Review IEEE Sensors Journal, 2020, 20, 7628-7638]

Image from:
[Yuan, W.; Dong, S. & Adelson, E. H. GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force Sensors, Multidisciplinary Digital Publishing Institute, 2017]

Democratizing Touch with DIGIT

Lambeta, M.; Chou, P.-W.; Tian, S.; Yang, B.; Maloon, B.; Most, V. R.; Stroud, D.; Santos, R.; Byagowi, A.; Kammerer, G.; Jayaraman, D. & Calandra, R.
DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation
IEEE Robotics and Automation Letters (RA-L), 2020, 5, 3838-3845

Examples of DIGIT Measurements

Lambeta, M.; Chou, P.-W.; Tian, S.; Yang, B.; Maloon, B.; Most, V. R.; Stroud, D.; Santos, R.; Byagowi, A.; Kammerer, G.; Jayaraman, D. & Calandra, R.
DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation
IEEE Robotics and Automation Letters (RA-L), 2020, 5, 3838-3845

Comparison to other Commercial Products

             BioTac        DIGIT
Cost         ~15,000 $     ~15 $*
Resolution   29 taxels     307,200 taxels

  • 1000x cheaper
  • 1000x higher resolution
  • Mounted on multi-finger hands
  • Open-source

* component cost for 1000 units, not including labor
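The round numbers on this slide can be sanity-checked with quick arithmetic (assuming, as the DIGIT paper does, a 640x480 camera image with one "taxel" per pixel); the resolution gap is actually more than 10,000x, so 1000x is a conservative claim:

```python
# Back-of-the-envelope check of the BioTac vs. DIGIT comparison.
# Numbers from the slide; DIGIT resolution assumed to be a 640x480
# camera image, i.e. one "taxel" per pixel.

biotac_cost_usd = 15_000
digit_cost_usd = 15          # component cost at 1000 units, excl. labor

biotac_taxels = 29
digit_taxels = 640 * 480     # 307,200 pixels

cost_ratio = biotac_cost_usd / digit_cost_usd
resolution_ratio = digit_taxels / biotac_taxels

print(f"DIGIT is {cost_ratio:.0f}x cheaper")             # 1000x
print(f"DIGIT has {resolution_ratio:.0f}x more taxels")  # over 10,000x
```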

Tactile SLAM

Suresh, S.; Qi, H.; Wu, T.; Fan, T.; Pineda, L.; Lambeta, M.; Malik, J.; Kalakrishnan, M.; Calandra, R.; Kaess, M.; Ortiz, J. & Mukadam, M.
Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation 
Science Robotics 2024 https://arxiv.org/abs/2312.13469

DIGIT's Limitations

  • Sampling Rate
    (Cameras are relatively slow)
  • Data Bandwidth
    (A hand can generate hundreds of Mb/s)
  • Bulky
    (Focal distance and electronics)
  • No Multimodality
    (Human skin has very heterogeneous receptors)
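The bandwidth concern is easy to make concrete. A sketch assuming a QVGA (320x240) RGB stream at 60 FPS per fingertip; the exact resolution and frame rate vary by sensor configuration, so treat the numbers as illustrative:

```python
# Back-of-the-envelope raw data rate of one vision-based tactile sensor.
# Assumption (illustrative): QVGA (320x240) RGB at 60 FPS, uncompressed.

width, height, channels = 320, 240, 3
bits_per_channel = 8
fps = 60

bits_per_frame = width * height * channels * bits_per_channel
mbit_per_s = bits_per_frame * fps / 1e6

print(f"One fingertip: ~{mbit_per_s:.0f} Mb/s")
print(f"Five-fingered hand: ~{5 * mbit_per_s:.0f} Mb/s")  # hundreds of Mb/s
```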

Evetac: An Event-Based VBTS

Funk, N.; Helmut, E.; Chalvatzaki, G.; Calandra, R. & Peters, J.
Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation 
IEEE Transactions on Robotics (T-RO), 2024 https://arxiv.org/abs/2312.01236

  • Event-based Camera
  • Up to 1 kHz, to better capture dynamic touch
  • Modular and using off-the-shelf components
    (Inspired by the DigiTac by Nathan Lepora)
  • Not the first event-based Tactile Sensor
    (See the pioneering work by Benjamin Ward-Cherrier)

Data Bandwidth

12% at peak, 1.7% over the entire trajectory

Funk, N.; Helmut, E.; Chalvatzaki, G.; Calandra, R. & Peters, J.
Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation 
IEEE Transactions on Robotics (T-RO), 2024 https://arxiv.org/abs/2312.01236
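The reason event-based readout saves bandwidth is that only pixels whose intensity changes beyond a threshold emit events, and during contact most of the gel image stays static. A synthetic sketch (array sizes and threshold are illustrative, not Evetac's actual parameters):

```python
import numpy as np

# Toy event generation: compare two consecutive tactile frames and
# count the pixels whose intensity change exceeds a threshold.
rng = np.random.default_rng(0)
h, w = 240, 320
threshold = 10  # intensity change needed to trigger an event

prev = rng.integers(0, 256, size=(h, w)).astype(np.int16)
# Simulate a small indentation: only a 40x40 patch changes appreciably.
curr = prev.copy()
curr[100:140, 150:190] += 50

events = np.abs(curr - prev) > threshold  # boolean event map
fraction_active = events.mean()

print(f"Pixels emitting events: {fraction_active:.1%}")  # 2.1% of the frame
```

Only the contact patch produces events; the rest of the frame contributes no data at all, which is where the order-of-magnitude bandwidth savings come from.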

DIGIT Pinky

Di, J.; Dugonjic, Z.; Fu, W.; Wu, T.; Mercado, R.; Sawyer, K.; Most, V. R.; Kammerer, G.; Speidel, S.; Fan, R. E.; Sonn, G.; Cutkosky, M. R.; Lambeta, M. & Calandra, R.
Using Fiber Optic Bundles to Miniaturize Vision-Based Tactile Sensors
IEEE Transactions on Robotics (T-RO), 2024 https://arxiv.org/abs/2403.05500

Cancer Detection in Prostate Tissue

Di, J.; Dugonjic, Z.; Fu, W.; Wu, T.; Mercado, R.; Sawyer, K.; Most, V. R.; Kammerer, G.; Speidel, S.; Fan, R. E.; Sonn, G.; Cutkosky, M. R.; Lambeta, M. & Calandra, R.
Using Fiber Optic Bundles to Miniaturize Vision-Based Tactile Sensors
IEEE Transactions on Robotics (T-RO), 2024 https://arxiv.org/abs/2403.05500

Cancer

No Cancer

Digit360: A Superhuman Multi-modal Fingertip

Lambeta M.; Wu T.; Sengül A.; Most V. R.; Black N.; Sawyer K.; Mercado R.; Qi H.; Sohn A.; Taylor B.; Tydingco N.; Kammerer G.; Stroud D.; Khatha J.; Jenkins K.; Most K.; Stein N.; Chavira R.; Craven-Bartle T.; Sanchez E.; Ding Y.; Malik J. & Calandra R.
Digitizing Touch with an Artificial Multimodal Fingertip
arXiv preprint, 2024. https://arxiv.org/abs/2411.02479

Resolution

~100x
better than humans

Lambeta M.; Wu T.; Sengül A.; Most V. R.; Black N.; Sawyer K.; Mercado R.; Qi H.; Sohn A.; Taylor B.; Tydingco N.; Kammerer G.; Stroud D.; Khatha J.; Jenkins K.; Most K.; Stein N.; Chavira R.; Craven-Bartle T.; Sanchez E.; Ding Y.; Malik J. & Calandra R.
Digitizing Touch with an Artificial Multimodal Fingertip
arXiv preprint, 2024. https://arxiv.org/abs/2411.02479

Multi-modality

Lambeta M.; Wu T.; Sengül A.; Most V. R.; Black N.; Sawyer K.; Mercado R.; Qi H.; Sohn A.; Taylor B.; Tydingco N.; Kammerer G.; Stroud D.; Khatha J.; Jenkins K.; Most K.; Stein N.; Chavira R.; Craven-Bartle T.; Sanchez E.; Ding Y.; Malik J. & Calandra R.
Digitizing Touch with an Artificial Multimodal Fingertip
arXiv preprint, 2024. https://arxiv.org/abs/2411.02479

Touch Processing
(i.e., AI for Touch)

Creating a New Science of Touch Processing

Many open questions:

  • What are good features for touch?
  • Do we need sensor standardization?
  • What representations do we want/need for touch?
  • What sensory information do we even want/need for touch?
  • What are the useful structures in computational models for touch?
  • What are the different tasks that can benefit from touch?
  • What are meaningful benchmarks for touch processing?
  • and more...

There is very limited literature on the computational processing of touch sensing.

It will take decades of research to answer all of these questions!
(and to reach the same maturity as the communities working on other sensing modalities)

PyTouch: A Machine Learning Library
for Touch Processing

Goal: Create the equivalent of OpenCV for Touch

Lambeta, M.; Xu, H.; Xu, J.; Chou, P.-W.; Wang, S.; Darrell, T. & Calandra, R.
PyTouch: A Machine Learning Library for Touch Processing
IEEE International Conference on Robotics and Automation (ICRA), 2021. https://arxiv.org/abs/2105.12791
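To give a flavor of the kind of primitive such a library provides, here is a baseline contact detector that thresholds the difference between the current tactile frame and a no-contact reference frame. This is an illustrative sketch only; the function name and parameters are hypothetical and are not PyTouch's actual API:

```python
import numpy as np

def detect_contact(frame, reference, threshold=12.0, min_area=50):
    """Baseline contact detector for a vision-based tactile sensor.

    Compares the current tactile image against a no-contact reference
    frame and reports contact if enough pixels differ. (Hypothetical
    helper for illustration, not part of the PyTouch API.)
    """
    diff = np.abs(frame.astype(np.float32) - reference.astype(np.float32))
    changed = diff.max(axis=-1) > threshold  # per-pixel change, any channel
    return bool(changed.sum() >= min_area)

# Usage with synthetic frames: a flat reference and a "pressed" frame.
ref = np.full((240, 320, 3), 128, dtype=np.uint8)
pressed = ref.copy()
pressed[100:140, 150:190] = 180              # simulated gel deformation

print(detect_contact(ref, ref))       # False
print(detect_contact(pressed, ref))   # True
```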

What Representations do we Need for Touch?

Lambeta, M.; Xu, H.; Xu, J.; Chou, P.-W.; Wang, S.; Darrell, T. & Calandra, R.
PyTouch: A Machine Learning Library for Touch Processing
IEEE International Conference on Robotics and Automation (ICRA), 2021

Kerr, J.; Huang, H.; Wilcox, A.; Hoque, R.; Ichnowski, J.; Calandra, R. & Goldberg, K.
Self-Supervised Visuo-Tactile Pretraining to Locate and Follow Garment Features
Robotics: Science and Systems (RSS) 2023, Online: https://arxiv.org/pdf/2209.13042
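A common route to such representations is self-supervised contrastive alignment of paired visual and tactile observations. A generic InfoNCE sketch (in NumPy, for illustration; this is not the exact objective of the papers above):

```python
import numpy as np

def info_nce_loss(z_vision, z_touch, temperature=0.1):
    """Contrastive (InfoNCE) loss aligning paired visuo-tactile embeddings.

    Row i of z_vision and row i of z_touch come from the same contact;
    all other rows act as negatives. Generic sketch, not the exact
    objective of any specific paper.
    """
    # L2-normalize embeddings so the dot product is cosine similarity.
    zv = z_vision / np.linalg.norm(z_vision, axis=1, keepdims=True)
    zt = z_touch / np.linalg.norm(z_touch, axis=1, keepdims=True)
    logits = zv @ zt.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal as the correct pairing.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 32))
perfectly_aligned = info_nce_loss(z, z)  # paired embeddings match exactly
misaligned = info_nce_loss(z, rng.normal(size=(8, 32)))
print(perfectly_aligned < misaligned)  # aligned pairs give lower loss
```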

Transfer Across Sensors

Transfer across Tasks

Touch and Language

Fu, L.; Datta, G.; Huang, H.; Panitch, W. C.-H.; Drake, J.; Ortiz, J.; Mukadam, M.; Lambeta, M.; Calandra, R. & Goldberg, K.
A Touch, Vision, and Language Dataset for Multimodal Alignment
ICML, 2024. https://arxiv.org/abs/2402.13232

Applications

Key Applications

Robotics

Metaverse
(AR/VR)

E-commerce

Medical

Why is Touch Important for Robotics?

  • The main sensing modality in robots is vision (and similar modalities)
  • Vision provides only weak cues about contact and forces
    (and suffers from occlusions)
  • Touch, instead, directly measures forces and the exact positions where they are applied
  • Accurately measuring contacts and forces is crucial for interacting with the world and performing robust contact-aware control!
  • A growing body of evidence quantifies the benefits of using touch!
  • Here are a few examples...

Learning to Grasp and Regrasp using Vision and Touch

Calandra, R.; Owens, A.; Jayaraman, D.; Yuan, W.; Lin, J.; Malik, J.; Adelson, E. H. & Levine, S.
More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch
IEEE Robotics and Automation Letters (RA-L), 2018, 3, 3300-3307

Trained with 6450 grasps from over 60 training objects

Grasp Success on Unseen Objects

83.8% grasp success on 22 unseen objects
(using only vision yields 56.6% success rate)

Learning Gentle Grasping Using Vision, Sound, and Touch

Nakahara, K. & Calandra, R.
Learning Gentle Grasping Using Vision, Sound, and Touch
IROS, 2025

Lighting a Match with a Robot

Funk, N.; Chen, C.; Schneider, T.; Chalvatzaki, G.; Calandra, R. & Peters, J.
On the Importance of Tactile Sensing for Imitation Learning: A Case Study on Robotic Match Lighting
arXiv preprint, 2025. https://arxiv.org/abs/2504.13618

From 20% (vision only) to 80% (vision+touch) success rate

Learning General In-Hand Rotation

Qi, H.; Yi, B.; Suresh, S.; Lambeta, M.; Ma, Y.; Calandra, R. & Malik, J.
General In-Hand Object Rotation with Vision and Touch
Conference on Robot Learning (CoRL), 2023. https://arxiv.org/abs/2309.09979

Teleoperation with Haptic Feedback

Fritsche, L.; Unverzagt, F.; Peters, J. & Calandra, R.
First-Person Tele-Operation of a Humanoid Robot
IEEE-RAS International Conference on Humanoid Robots (HUMANOIDS), 2015

To Conclude

Human Collaborators

LASR Lab @ TU Dresden

Supported by

Overview

  • Touch is a key sensing modality for humans and robots
  • Touch sensing is entering a new "digital revolution" that will make it ubiquitous and enable new applications
    • We already have hardware with capabilities exceeding those of humans
    • Touch processing is an exciting new field of AI dedicated to making sense of touch
  • Growing evidence that touch sensing helps robots achieve better performance and be more robust
    • Critical to move manipulation from "lab performance" to "real-world performance"

Thank you!

Some of our work on touch sensing

  • Wang, S.; Lambeta, M.; Chou, L. & Calandra, R.
    TACTO: A Fast, Flexible and Open-source Simulator for High-Resolution Vision-based Tactile Sensors
    IEEE Robotics and Automation Letters (RA-L), 2022, 7, 3930-3937, Online: https://arxiv.org/abs/2012.08456

  • Smith, E. J.; Meger, D.; Pineda, L.; Calandra, R.; Malik, J.; Romero, A. & Drozdzal, M.
    Active 3D Shape Reconstruction from Vision and Touch
    Advances in Neural Information Processing Systems (NeurIPS), 2021
  • Lambeta, M.; Xu, H.; Xu, J.; Chou, P.-W.; Wang, S.; Darrell, T. & Calandra, R.
    PyTouch: A Machine Learning Library for Touch Processing
    IEEE International Conference on Robotics and Automation (ICRA), 2021
  • Smith, E. J.; Calandra, R.; Romero, A.; Gkioxari, G.; Meger, D.; Malik, J. & Drozdzal, M.
    3D Shape Reconstruction from Vision and Touch
    Advances in Neural Information Processing Systems (NeurIPS), 2020
  • Lambeta, M.; Chou, P.-W.; Tian, S.; Yang, B.; Maloon, B.; Most, V. R.; Stroud, D.; Santos, R.; Byagowi, A.; Kammerer, G.; Jayaraman, D. & Calandra, R.
    DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation
    IEEE Robotics and Automation Letters (RA-L), 2020, 5, 3838-3845
  • Padmanabha, A.; Ebert, F.; Tian, S.; Calandra, R.; Finn, C. & Levine, S.
    OmniTact: A Multi-Directional High-Resolution Touch Sensor
    IEEE International Conference on Robotics and Automation (ICRA), 2020, 618-624
  • Lin, J.; Calandra, R. & Levine, S.
    Learning to Identify Object Instances by Touch: Tactile Recognition via Multimodal Matching
    IEEE International Conference on Robotics and Automation (ICRA), 2019, 3644-3650
  • Tian, S.; Ebert, F.; Jayaraman, D.; Mudigonda, M.; Finn, C.; Calandra, R. & Levine, S.
    Manipulation by Feel: Touch-Based Control with Deep Predictive Models
    IEEE International Conference on Robotics and Automation (ICRA), 2019, 818-824
  • Calandra, R.; Owens, A.; Jayaraman, D.; Yuan, W.; Lin, J.; Malik, J.; Adelson, E. H. & Levine, S.
    More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch
    IEEE Robotics and Automation Letters (RA-L), 2018, 3, 3300-3307
  • Calandra, R.; Owens, A.; Upadhyaya, M.; Yuan, W.; Lin, J.; Adelson, E. H. & Levine, S.
    The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes?
Conference on Robot Learning (CoRL), 2017, 314-323
  • Yi, Z.; Calandra, R.; Veiga, F. F.; van Hoof, H.; Hermans, T.; Zhang, Y. & Peters, J.
    Active Tactile Object Exploration with Gaussian Processes
    IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016, 4925-4930
  • Calandra, R.; Ivaldi, S.; Deisenroth, M. P.; Rueckert, E. & Peters, J.
    Learning Inverse Dynamics Models with Contacts
    IEEE International Conference on Robotics and Automation (ICRA), 2015, 3186-3191
  • Calandra, R.; Ivaldi, S.; Deisenroth, M. P. & Peters, J.
    Learning Torque Control in Presence of Contacts using Tactile Sensing from Robot Skin
    IEEE-RAS International Conference on Humanoid Robots (HUMANOIDS), 2015, 690-695

IROS 2025 Keynote

By Roberto Calandra

Touch is a crucial sensor modality in both humans and robots. Recent advances in tactile sensing hardware have resulted -- for the first time -- in the availability of mass-produced, high-resolution, inexpensive, and reliable tactile sensors. In this talk, I will argue for the importance of creating a new computational field of "Touch processing" dedicated to the processing and understanding of touch, similarly to what computer vision is for vision. This new field will present significant challenges both in terms of research and engineering. Finally, I will present some applications of touch in robotics and discuss other future applications.
