Decorrelate signals by tokenizing a 1-second window of tactile data.
Condition the encoder on the robot hand configuration by feeding sensor positions as input (sketch below).
3/6
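A minimal PyTorch sketch of what that tokenization and conditioning could look like, assuming 16 Xela pads, 3-axis taxels flattened to 72 values per pad, and 100 Hz sampling (so a 1 s window is 100 frames). The shapes and projection layers are illustrative assumptions, not the released Sparsh-skin architecture.

```python
import torch
import torch.nn as nn

class TactileTokenizer(nn.Module):
    def __init__(self, pad_dim=72, window=100, embed_dim=256):
        super().__init__()
        # One token per pad: the pad's whole 1 s window is flattened and projected,
        # which decorrelates the highly redundant per-frame readings.
        self.to_token = nn.Linear(pad_dim * window, embed_dim)
        # The (x, y, z) position of each pad on the hand conditions the encoder
        # on the robot hand configuration.
        self.pos_embed = nn.Linear(3, embed_dim)

    def forward(self, signals, sensor_xyz):
        # signals: (batch, num_pads, window, pad_dim); sensor_xyz: (batch, num_pads, 3)
        b, p, t, d = signals.shape
        tokens = self.to_token(signals.reshape(b, p, t * d))  # (batch, num_pads, embed_dim)
        return tokens + self.pos_embed(sensor_xyz)            # position-conditioned tokens

tokenizer = TactileTokenizer()
window = torch.randn(2, 16, 100, 72)       # 1 s of tactile data from 16 pads
positions = torch.randn(2, 16, 3)          # pad positions for the current hand configuration
print(tokenizer(window, positions).shape)  # torch.Size([2, 16, 256])
```

Packing each pad's full 1 s window into a single token is what mixes away the frame-to-frame redundancy, and the per-pad position embedding is what ties those tokens to the hand configuration.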
It improves tactile tasks by over 56% relative to end-to-end methods and by over 41% relative to prior work.
It is trained via self-supervision on Xela sensor data, so no labeled data is needed! (Generic sketch below.)
2/6
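The tweet doesn't spell out the SSL objective, so the sketch below is only a generic masked-prediction loss over tactile tokens to illustrate the "no labels needed" point; the mask ratio, predictor, and encoder sizes are all assumptions, not the actual Sparsh-skin recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=8, batch_first=True),
    num_layers=4,
)
predictor = nn.Linear(256, 256)
mask_token = nn.Parameter(torch.zeros(256))

def ssl_step(tokens):
    # tokens: (batch, num_tokens, 256) tactile tokens; note that no labels appear anywhere.
    b, n, d = tokens.shape
    mask = torch.rand(b, n) < 0.5                                         # hide half the tokens
    corrupted = torch.where(mask.unsqueeze(-1), mask_token.expand(b, n, d), tokens)
    pred = predictor(encoder(corrupted))
    return F.mse_loss(pred[mask], tokens[mask].detach())                  # reconstruct what was hidden

loss = ssl_step(torch.randn(2, 16, 256))
loss.backward()
print(f"masked-prediction loss: {loss.item():.3f}")
```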
Can 1 model improve many tactile tasks?
🌟Introducing Sparsh-skin: tinyurl.com/y935wz5c
1/6
I'm grateful to my committee and everyone who supported me.
My proposed thesis, "Self-supervised perception for tactile dexterity," will explore ways to improve dexterous manipulation using tactile representations.
a6700 w/ 17-70mm Tamron lens
On cross-sensor transfer, e.g. textile classification: decoders trained for GelSight transfer to DIGIT with only 10 samples using Sparsh (9%→61%) vs. end-to-end learning (3%→10%). Recipe sketched below.
5/6
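A hedged sketch of that few-shot transfer recipe: freeze the pretrained backbone, keep only a small textile head (in practice initialized from the GelSight-trained head), and fine-tune that head on 10 labeled DIGIT samples. The stand-in backbone, class count, and dummy tensors below are placeholders, not the released Sparsh checkpoints or data loaders.

```python
import torch
import torch.nn as nn

# Stand-in for the frozen, pretrained Sparsh encoder.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 768))
backbone.requires_grad_(False)

# Small textile-classification head; in practice it would start from the
# GelSight-trained weights rather than a random init.
head = nn.Linear(768, 20)

# 10 labeled DIGIT samples (dummy tensors here): enough because only `head` is trained.
images = torch.randn(10, 3, 224, 224)
labels = torch.randint(0, 20, (10,))

opt = torch.optim.AdamW(head.parameters(), lr=1e-3)
for _ in range(50):                                  # a few passes over the 10 samples
    with torch.no_grad():
        feats = backbone(images)                     # frozen tactile representation
    loss = nn.functional.cross_entropy(head(feats), labels)
    opt.zero_grad(); loss.backward(); opt.step()
print(f"few-shot transfer loss: {loss.item():.3f}")
```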
We find that Sparsh outperforms task- and sensor-specific models by an average of 95.1%.
4/6
Sparsh is general-purpose and shows strong performance across many tasks. One could decode Sparsh representations into normal and shear force fields to train downstream policies (sketch below).
3/6
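One way that decoding step could look, as a minimal sketch: a small head maps pooled Sparsh features to per-cell shear and normal force fields that a downstream policy could consume. The feature dimension and grid size are assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

class ForceFieldDecoder(nn.Module):
    def __init__(self, feat_dim=768, grid=16):
        super().__init__()
        self.grid = grid
        # 3 channels per grid cell: (shear_x, shear_y, normal)
        self.proj = nn.Linear(feat_dim, 3 * grid * grid)

    def forward(self, feats):
        # feats: (batch, feat_dim) pooled tactile representation
        fields = self.proj(feats).view(-1, 3, self.grid, self.grid)
        shear, normal = fields[:, :2], fields[:, 2:]   # split shear (2 ch) and normal (1 ch)
        return shear, normal

decoder = ForceFieldDecoder()
shear, normal = decoder(torch.randn(4, 768))
print(shear.shape, normal.shape)  # torch.Size([4, 2, 16, 16]) torch.Size([4, 1, 16, 16])
```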
With Sparsh, we extend self-supervised learning (SSL) to the tactile domain, showing that SSL features enable sample-efficient learning of downstream tasks (probe sketch below).
2/6
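A minimal sketch of the probing pattern behind that claim: the SSL encoder stays frozen and only a tiny head is trained per downstream task, which is why few labels suffice. The attention-pooling probe design, the dummy token features, and all sizes are illustrative assumptions, not necessarily the heads used in the paper.

```python
import torch
import torch.nn as nn

# Token-level features from a frozen SSL encoder (dummy tensors stand in for the encoder output).
feats = torch.randn(8, 16, 768)          # (batch, tokens, dim); no gradients needed for the encoder
labels = torch.randint(0, 6, (8,))       # e.g. a 6-way downstream classification task

class AttentivePoolingProbe(nn.Module):
    """Tiny trainable head: a learned query attends over frozen tokens, then classifies."""
    def __init__(self, dim=768, num_classes=6):
        super().__init__()
        self.query = nn.Parameter(torch.randn(1, 1, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)
        self.fc = nn.Linear(dim, num_classes)

    def forward(self, tokens):
        q = self.query.expand(tokens.size(0), -1, -1)
        pooled, _ = self.attn(q, tokens, tokens)      # (batch, 1, dim)
        return self.fc(pooled.squeeze(1))

probe = AttentivePoolingProbe()
loss = nn.functional.cross_entropy(probe(feats), labels)
loss.backward()                                       # gradients only touch the probe
print(f"trainable probe params: {sum(p.numel() for p in probe.parameters()):,}")
```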
More on 🎯Sparsh 🧵⬇
1/6