Viswanath (Vish) Sivakumar
@seasonedschemer.bsky.social
ML and neural interfaces research, Meta Reality Labs
Would recommend LLM-Training-Puzzles (which is about distributed training rather than about LLMs) as a companion for anyone reading the excellent Ultra-Scale Playbook by Hugging Face.
April 14, 2025 at 3:04 PM
Boichik on Fillmore is new and good!
December 6, 2024 at 4:59 PM
9/9
Gratitude to the entire CTRL-Labs team and partners at Reality Labs, Meta.
Special shout out to @jseely.bsky.social, Michael Mandel, Sean Bittner, Alexandre Gramfort, Adam Berenzweig, Patrick Kaifosh, and TR Reardon!
December 5, 2024 at 9:55 PM
8/
And that's not all!
Here's another large dataset we just released - emg2pose - focused on hand pose estimation using sEMG. emg2pose spans 193 participants, 370 hours, and >50 behavioral categories, with hand motion capture ground truth.
arxiv.org/abs/2412.02725
github.com/facebookrese...
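For a feel of the task, here's a minimal evaluation sketch in PyTorch: scoring a pose model by mean absolute joint-angle error over held-out sessions. The HDF5 layout ("emg" and "joint_angles" keys) and the window size are illustrative assumptions, not the repo's actual schema; see the GitHub link for the real loaders.

# Minimal sketch: score an sEMG -> hand pose model by mean absolute
# joint-angle error. The "emg" / "joint_angles" HDF5 keys are
# hypothetical; consult the emg2pose repo for the real schema.
import h5py
import torch

def mean_abs_joint_error(model, session_paths, window=400):
    model.eval()
    errors = []
    with torch.no_grad():
        for path in session_paths:
            with h5py.File(path, "r") as f:
                emg = torch.as_tensor(f["emg"][:], dtype=torch.float32)
                angles = torch.as_tensor(f["joint_angles"][:], dtype=torch.float32)
            # Non-overlapping windows over the session.
            for start in range(0, len(emg) - window + 1, window):
                x = emg[start : start + window].unsqueeze(0)       # (1, T, C)
                pred = model(x)                                    # (1, T, J) per-frame angles
                target = angles[start : start + window].unsqueeze(0)
                errors.append((pred - target).abs().mean())
    return torch.stack(errors).mean().item()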
December 5, 2024 at 9:55 PM
7/
Our baseline model, built using standard techniques from the speech recognition literature, shows that with some personalization on top of a model pretrained on 100 subjects, we can enable quite accurate typing with sEMG, eventually without a physical keyboard.
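As a rough picture of that recipe, here's a minimal personalization sketch using CTC loss, the same objective speech recognition models use; pretrained_encoder and user_loader are hypothetical placeholders, not the released training code.

# Minimal sketch: fine-tune a pretrained sEMG-to-characters model on one
# user's data with CTC loss, as in speech recognition. The encoder and
# loader are hypothetical placeholders.
import torch
import torch.nn.functional as F

def personalize(pretrained_encoder, user_loader, steps=1000, lr=1e-4):
    optim = torch.optim.AdamW(pretrained_encoder.parameters(), lr=lr)
    pretrained_encoder.train()
    for step, (emg, targets, input_lens, target_lens) in enumerate(user_loader):
        log_probs = pretrained_encoder(emg)   # (T, N, num_chars) log-probabilities
        loss = F.ctc_loss(log_probs, targets, input_lens, target_lens)
        optim.zero_grad()
        loss.backward()
        optim.step()
        if step >= steps:
            break
    return pretrained_encoder

The point of the recipe: once a cross-user pretrained model exists, only a small amount of per-user data is needed on top.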
December 5, 2024 at 9:55 PM
6/
Towards that goal, we now release emg2qwerty - a wrist sEMG dataset collected while touch typing on a QWERTY keyboard. With 108 subjects, 1,135 sessions, 346 hours, and 5.2 million keystrokes, this is quite large by neuroscience standards.
arxiv.org/abs/2410.20081
github.com/facebookrese...
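For a sense of what models consume from data like this, here's a sketch of the kind of frontend used in speech-style baselines: raw multi-channel EMG to log-spectrogram features, feeding the CTC model sketched under 7/ above. The STFT sizes are illustrative assumptions, not the repo's documented defaults.

# Sketch: raw multi-channel sEMG (T, C) -> log-spectrogram features,
# a common frontend for CTC-style models. n_fft and hop are
# illustrative, not the repo's defaults.
import torch

def log_spectrogram(emg, n_fft=64, hop=16):
    spec = torch.stft(
        emg.T,                              # per-channel STFT: (C, T) in
        n_fft=n_fft,
        hop_length=hop,
        window=torch.hann_window(n_fft),
        return_complex=True,
    )                                       # (C, freq, frames) out
    power = spec.abs() ** 2
    return torch.log1p(power).permute(2, 0, 1)   # (frames, C, freq)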
December 5, 2024 at 9:55 PM
5/
But can we further push the boundaries of what is possible with sEMG?
We imagine many other applications for sEMG, including the ability to manipulate objects in AR or write full messages as quickly as—or faster than—typing on a keyboard, with very little effort.
December 5, 2024 at 9:55 PM
4/
And at Connect 2024, we showed how one can use an EMG wristband with Orion—our AR glasses product prototype—for seamless and comfortable control of digital content: swiping, clicking, and scrolling while keeping your arm resting at your side.
Introducing Orion, Our First AR Glasses | Meta Quest Blog
Today at Connect, Mark Zuckerberg unveiled Orion—our first pair of true AR glasses, previously codenamed Project Nazare.
www.meta.com
December 5, 2024 at 9:55 PM
3/
Earlier this year, we unveiled a wristband device that can be worn seamlessly, senses muscle activations in the wrist and hand non-invasively via surface EMG, and achieves out-of-the-box generalization across individuals for multiple tasks, including handwriting!
A generic noninvasive neuromotor interface for human-computer interaction
Since the advent of computing, humans have sought computer input technologies that are expressive, intuitive, and universal. While diverse modalities have been developed, including keyboards, mice, an...
www.biorxiv.org
December 5, 2024 at 9:55 PM
2/
The papers - emg2qwerty and emg2pose - were accepted to the NeurIPS 2024 Datasets and Benchmarks Track.
If you’re in Vancouver next week, drop by our posters on Fri Dec 13 at 11am PST or at the Meta booth. We have some cool demos to show too!
neurips.cc/virtual/2024...
nips.cc/virtual/2024...
NeurIPS Poster: emg2qwerty: A Large Dataset with Baselines for Touch Typing using Surface Electromyography (NeurIPS 2024)
neurips.cc
December 5, 2024 at 9:55 PM
Resonates a lot having lived in both places. I moved back to NYC as of last month, and personally, it’s been a steep QoL upgrade. But of course, SF will continue to be a fixture given I’m in tech.
December 2, 2024 at 6:08 PM