associate professor @ ETH Zürich. sensing, interaction & perception lab https://siplab.org. computational VR/AR input, egocentric perception, predictive mobile health, HCI
EgoSim: An Egocentric Multi-view Simulator for Body-worn Cameras.
@neuripsconf.bsky.social '24 Datasets & Benchmarks
more: siplab.org/projects/Ego...
PDF: static.siplab.org/papers/neuri...
code & dataset: github.com/eth-siplab/E...
Our baseline method, a simple multi-view video pose transformer, directly regresses full-body 3D poses from 6 egocentric videos with solid accuracy, demonstrating the effectiveness of EgoSim's data generation.
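For intuition, here is a minimal PyTorch sketch of what such a multi-view video pose transformer could look like. The toy backbone, layer sizes, learned view embedding, and mean-pooling over views are illustrative assumptions, not the paper's exact architecture:

```python
# Minimal sketch of a multi-view video pose transformer (illustrative only;
# the backbone, fusion scheme, and sizes are assumptions, not EgoSim's model).
import torch
import torch.nn as nn

class MultiViewPoseTransformer(nn.Module):
    def __init__(self, n_views=6, feat_dim=256, n_joints=24):
        super().__init__()
        # Shared per-frame image encoder (a toy conv stack; a real model
        # would likely use a pretrained backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        # Learned embedding telling the transformer which camera a token came from.
        self.view_embed = nn.Parameter(torch.zeros(n_views, feat_dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=8, batch_first=True)
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=4)
        # Regress 3D joint positions from the fused tokens.
        self.head = nn.Linear(feat_dim, n_joints * 3)

    def forward(self, videos):
        # videos: (batch, n_views, time, 3, H, W)
        b, v, t, c, h, w = videos.shape
        feats = self.backbone(videos.reshape(b * v * t, c, h, w))
        tokens = feats.reshape(b, v, t, -1) + self.view_embed[None, :, None, :]
        fused = self.fusion(tokens.reshape(b, v * t, -1))
        # One pose per time step, pooled over the 6 views.
        pose = self.head(fused.reshape(b, v, t, -1).mean(dim=1))
        return pose.reshape(b, t, -1, 3)  # (batch, time, joints, xyz)
```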
EgoSim takes real mocap data (e.g., from AMASS) and synthesizes multi-modal egocentric videos.
Plus: MultiEgoView, a real dataset of footage from 6 body-worn GoPro cameras with ground-truth 3D poses, covering several activities from 13 participants.
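As a rough illustration of the simulation idea above (all names here are hypothetical placeholders, not EgoSim's actual API): mocap joint trajectories place body-mounted virtual cameras, whose per-frame poses would then drive a renderer for RGB, depth, and other modalities.

```python
# Toy numpy sketch: real mocap drives body-worn virtual cameras.
# Function names, joint indices, and the offset are illustrative assumptions.
import numpy as np

def camera_pose_from_joint(joint_xyz, offset=np.array([0.0, 0.05, 0.0])):
    # Body-worn camera rigidly offset from its mounting joint
    # (orientation and camera shake omitted for brevity).
    return joint_xyz + offset

def camera_trajectories(mocap, mounts):
    # mocap: (time, joints, 3) AMASS-style joint positions
    # mounts: indices of the joints carrying cameras
    poses = np.stack([camera_pose_from_joint(mocap[:, j]) for j in mounts], axis=1)
    # A real simulator would render multi-modal views from each pose; here we
    # just return the per-frame camera trajectories that would feed a renderer.
    return poses  # (time, n_cameras, 3)

mocap = np.random.rand(120, 24, 3)                    # 120 frames, 24 joints
cams = camera_trajectories(mocap, mounts=[15, 20, 21, 0, 4, 5])
print(cams.shape)                                     # (120, 6, 3)
```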
J. Jiang, P. Streli, M. Meier, C. Holz.
EgoPoser: Robust Real-Time Egocentric Pose Estimation from Sparse and Intermittent Observations Everywhere.
ECCV 2024 @eccv.bsky.social
arXiv: arxiv.org/abs/2308.06493
code: github.com/eth-siplab/E...
project page: siplab.org/projects/Ego...
Our SlowFast feature fusion samples the input signals both sparsely and densely, extending the temporal input context at no extra computational cost (toy sketch below).
#ECCV2024
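For intuition, a toy PyTorch sketch of the sparse-plus-dense sampling idea; the function name, stride, and window lengths are assumptions for illustration, not EgoPoser's actual implementation. A long window sampled sparsely plus a short recent window sampled densely yields the same token count as a dense window of half the length, so temporal context grows without extra compute:

```python
import torch

def slowfast_sample(signal, slow_stride=4, fast_len=10):
    # signal: (batch, time, channels) stream of e.g. headset/controller poses
    slow = signal[:, ::slow_stride, :]       # sparse: long-range context
    fast = signal[:, -fast_len:, :]          # dense: most recent frames
    return torch.cat([slow, fast], dim=1)    # fused token sequence

x = torch.randn(2, 40, 18)                   # 40 frames of 6-DoF input signals
tokens = slowfast_sample(x)
print(tokens.shape)                          # torch.Size([2, 20, 18]):
# 20 tokens spanning all 40 frames, the same count as a dense 20-frame window.
```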
EgoPoser supports diverse body shapes & remains robust even when users move in large environments
@jiaxijiang.bsky.social @paulstreli.bsky.social