Christian Holz
christianholz.bsky.social

associate professor @ ETH Zürich. sensing, interaction & perception lab https://siplab.org. computational VR/AR input, egocentric perception, predictive mobile health, HCI


We generated 119 hours of footage using EgoSim across 4 high-fidelity virtual environments.

Our baseline method, a simple multi-view video pose transformer, directly regresses full-body 3D poses from 6 egocentric videos with solid accuracy, demonstrating the effectiveness of EgoSim's data generation.

Releasing EgoSim, a simulator for body-worn cameras. #NeurIPS2024

EgoSim takes real mocap data (e.g., from AMASS) and synthesizes multi-modal egocentric videos.

Plus: MultiEgoView, a real-world dataset of footage from 6 body-worn GoPro cameras with ground-truth 3D poses during several activities (13 participants).

reference:
J. Jiang, P. Streli, M. Meier, C. Holz.
EgoPoser: Robust Real-Time Egocentric Pose Estimation from Sparse and Intermittent Observations Everywhere.
ECCV 2024 @eccv.bsky.social

arXiv:
arxiv.org/abs/2308.06493

code:
github.com/eth-siplab/E...

project page:
siplab.org/projects/Ego...

EgoPoser proposes spatio-temporal input decomposition to robustly estimate position-invariant poses. We also explicitly model the headset's field of view to account for tracking dropouts.
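A minimal sketch of the field-of-view idea, not EgoPoser's actual implementation: when a hand position (expressed in the headset frame) leaves a viewing cone, its tracking input would drop out and can be masked for the model. The half-angle and coordinate convention here are illustrative assumptions.

```python
import numpy as np

def in_fov(hand_pos_head_frame, half_angle_deg=55.0):
    """Return True if a hand position lies inside a symmetric viewing cone.

    Assumes the position is given in the headset's coordinate frame with
    +z pointing forward; the half-angle is a placeholder value, not the
    one used by any specific headset or by EgoPoser.
    """
    p = np.asarray(hand_pos_head_frame, dtype=float)
    forward = np.array([0.0, 0.0, 1.0])
    # Cosine of the angle between the hand direction and the view axis.
    cos_angle = p @ forward / (np.linalg.norm(p) + 1e-9)
    return bool(cos_angle >= np.cos(np.radians(half_angle_deg)))

print(in_fov([0.1, 0.0, 0.5]))   # hand roughly in front -> True
print(in_fov([0.6, 0.0, -0.2]))  # hand behind the head plane -> False
```

Inputs failing this check would be replaced by a mask token (or similar) so the pose estimator learns to handle intermittent hand observations.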

Our SlowFast feature fusion samples input signals sparsely & densely to extend the input context with no computational overhead.
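A minimal sketch of the sparse-and-dense sampling idea, assuming a windowed input sequence; function and parameter names are illustrative, not EgoPoser's API.

```python
import numpy as np

def slowfast_sample(signal, dense_len=40, sparse_stride=5):
    """Fuse a densely sampled recent window with sparsely sampled history.

    The most recent `dense_len` frames are kept at full rate (fast path),
    while older frames are subsampled every `sparse_stride` frames (slow
    path). The fused input thus covers the full context with far fewer
    frames than dense sampling of the whole sequence would require.
    """
    dense = signal[-dense_len:]                    # recent frames, full rate
    sparse = signal[:-dense_len][::sparse_stride]  # older frames, subsampled
    return np.concatenate([sparse, dense], axis=0)

seq = np.arange(200).reshape(200, 1)  # 200 frames, 1 feature channel
fused = slowfast_sample(seq)
print(fused.shape)  # (72, 1): 32 sparse + 40 dense frames span all 200
```

Because the fused sequence stays short, the downstream transformer sees a longer temporal context at the same per-step cost.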

EgoPoser is a novel method for accurate full-body pose estimation—purely from egocentric capture of head & hand poses (even when out of view)
#ECCV2024

EgoPoser supports diverse body shapes & remains robust even when users move in large environments

@jiaxijiang.bsky.social @paulstreli.bsky.social