Opinions expressed are my own
📍Pittsburgh, USA 🔗 akashsharma02.github.io
5/6
4/6
Decorrelate signals by tokenizing a 1 s window of tactile data.
Condition the encoder on the robot hand configuration by providing sensor positions as input.
3/6
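A minimal sketch of the tokenization described above: one token per taxel, built from its 1 s signal window and concatenated with the taxel's 3D position so the encoder is conditioned on the hand configuration. All shapes, the sample rate, and the taxel count are illustrative assumptions, not Sparsh-skin's actual configuration.

```python
import numpy as np

# Assumed, illustrative values: 1 s of magnetometer data at 100 Hz,
# 48 taxels with 3 axes each.
SAMPLE_RATE = 100            # samples per second (assumed)
NUM_TAXELS = 48              # taxels on the hand (assumed)
WINDOW = SAMPLE_RATE * 1     # the 1 s window from the thread

rng = np.random.default_rng(0)
window = rng.normal(size=(WINDOW, NUM_TAXELS, 3))    # (time, taxel, xyz) signals
positions = rng.uniform(size=(NUM_TAXELS, 3))        # 3D taxel positions (assumed input)

def tokenize(window, positions):
    """One token per taxel: its flattened 1 s signal concatenated with
    the taxel's position, conditioning the encoder on hand configuration."""
    per_taxel = window.transpose(1, 0, 2).reshape(NUM_TAXELS, -1)  # (taxel, time*3)
    return np.concatenate([per_taxel, positions], axis=1)

tokens = tokenize(window, positions)
print(tokens.shape)  # (48, 303): 100*3 signal dims + 3 position dims per token
```

Flattening across time per taxel (rather than per timestep) is what decorrelates the highly redundant per-timestep readings into per-sensor tokens.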
It improves tactile task performance by over 56% compared to end-to-end methods and by over 41% compared to prior work.
It is trained via self-supervision on Xela sensor data, so no labeled data is needed!
2/6
Can 1 model improve many tactile tasks?
🌟Introducing Sparsh-skin: tinyurl.com/y935wz5c
1/6
On cross-sensor transfer, e.g. textile classification: decoders trained for GelSight transfer to DIGIT with only 10 samples using Sparsh (9%→61%) vs. end-to-end learning (3%→10%).
5/6
With Sparsh, we extend self-supervised learning (SSL) to the tactile domain, showing that SSL features enable sample-efficient learning of downstream tasks.
2/6
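Sample-efficient downstream learning with frozen SSL features typically means fitting only a small probe on a handful of labeled examples. A hedged sketch with a closed-form ridge-regression linear probe; the random features stand in for the pretrained Sparsh encoder's outputs, and every dimension and count here is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen SSL features of a few labeled tactile samples
# (in practice these would come from the pretrained Sparsh encoder).
feat_dim, n_train, n_classes = 64, 10, 3     # all assumed values
X = rng.normal(size=(n_train, feat_dim))     # frozen features
y = rng.integers(0, n_classes, size=n_train) # labels
Y = np.eye(n_classes)[y]                     # one-hot targets

# Linear probe: ridge-regularized least squares, closed form.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(feat_dim), X.T @ Y)

preds = (X @ W).argmax(axis=1)
print((preds == y).mean())  # training accuracy of the probe
```

Because the encoder stays frozen, only `feat_dim * n_classes` parameters are fit, which is why a few labeled samples can suffice.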
More on 🎯Sparsh 🧵⬇
1/6