Wenzel Jakob
@wjakob.bsky.social
Associate professor leading EPFL's Realistic Graphics Lab. My research involves inverse graphics, material appearance modeling, and physically based rendering.
Wasn’t that something… Flocke (German for “flake”) says hi!
August 15, 2025 at 4:45 AM
To reconstruct their interior, we:
1️⃣ Localize annual rings on the cube faces
2️⃣ Optimize a procedural growth field that assigns an age to every 3D point (when that wood formed during the tree's life)
3️⃣ Synthesize detailed textures via a procedural model or a neural cellular automaton
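To make step 2️⃣ concrete, here is a deliberately simplified stand-in: a growth field with a straight vertical pith and a constant ring density, fit to ring annotations by least squares. Every name and number below is illustrative; the paper's actual field model is far richer than this.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for step 2: fit a procedural growth field age(p) to annual-ring
# annotations. Here the field is a straight vertical pith at (px, py) with a
# constant ring density k, so age(p) = k * distance(p, pith).

rng = np.random.default_rng(0)
ring_pts = rng.random((200, 3))           # placeholder ring annotations (3D points)
ring_age = rng.integers(1, 20, size=200)  # placeholder ring indices (years)

def age(p, px, py, k):
    """Age assigned to 3D points: ring density times radial distance from the pith."""
    return k * np.hypot(p[:, 0] - px, p[:, 1] - py)

def loss(theta):
    return np.mean((age(ring_pts, *theta) - ring_age) ** 2)

# Fit pith position and ring density to the face annotations
res = minimize(loss, x0=np.array([0.5, 0.5, 20.0]), method='Nelder-Mead')
px, py, k = res.x
```

Once fit, evaluating age() at any interior point yields the input that step 3️⃣ turns into a detailed texture.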
August 8, 2025 at 11:53 AM
The Mokume dataset consists of 190 physical wood cubes from 17 species, each documented with:

- High-res photos of all 6 faces
- Annual ring annotations
- Photos of slanted cuts for validation
- CT scans revealing the true interior structure (for future use)
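For concreteness, one might represent each documented cube as a small record like the sketch below; the field names are hypothetical and do not reflect the dataset's actual on-disk layout.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MokumeSample:
    """Hypothetical per-cube record mirroring the list above
    (illustrative names, not the dataset's actual schema)."""
    species: str                          # one of the 17 species
    face_photos: list[np.ndarray]         # high-res photos of all 6 faces
    ring_annotations: list[np.ndarray]    # annual-ring curves annotated per face
    slanted_cut_photos: list[np.ndarray]  # validation photos of slanted cuts
    ct_scan: np.ndarray                   # CT volume of the true interior
```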
August 8, 2025 at 11:53 AM
Wood textures are everywhere in graphics, but realistic texturing requires knowing what wood looks like throughout its volume, not just on the surfaces.
The patterns depend on tree species, growth conditions, and where and how the wood was cut from the tree.
August 8, 2025 at 11:53 AM
How can one reconstruct the complete 3D interior of a wood block using only photos of its surfaces? 🪵
At SIGGRAPH'25 (Thursday!), Maria Larsson will present *Mokume*: a dataset of 190 diverse wood samples and a pipeline that solves this inverse texturing challenge. 🧵👇
August 8, 2025 at 11:53 AM
My lab will be recruiting at all levels: PhD students, postdocs, and a research engineer (worldwide for PhD/postdoc, EU candidates only for the engineering position). If you're at SIGGRAPH and interested in any of these, I'd love to talk to you.
August 8, 2025 at 8:09 AM
Check out our paper for more details at rgl.epfl.ch/publications...
August 7, 2025 at 12:21 PM
This is a joint work with @ziyizh.bsky.social, @njroussel.bsky.social, Thomas Müller, @tizian.bsky.social, @merlin.ninja, and Fabrice Rousselle.
August 7, 2025 at 12:21 PM
Our method minimizes the expected loss, whereas NeRF optimizes the loss of the expectation.
It generalizes deterministic surface evolution methods (e.g., NvDiffrec) and elegantly handles discontinuities. Future applications include physically based rendering and tomography.
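In symbols (our notation, not taken from the paper): let t index candidate surface points along a ray, c_θ(t) the color predicted there, and c_ref the reference pixel color. The two objectives differ only in where the expectation sits:

```latex
% Expected loss (this method) vs. loss of the expectation (NeRF),
% with the expectation over ray samples t weighted as in alpha blending
\min_\theta \; \mathbb{E}_t\!\left[\,\ell\big(c_\theta(t),\, c_{\mathrm{ref}}\big)\right]
\qquad\text{vs.}\qquad
\min_\theta \; \ell\big(\,\mathbb{E}_t[\,c_\theta(t)\,],\; c_{\mathrm{ref}}\big)
```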
August 7, 2025 at 12:21 PM
Instead of blending colors along rays and supervising the resulting images, we project the training images into the scene to supervise the radiance field.
Each point along a ray is treated as a surface candidate, independently optimized to match that ray's reference color.
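A minimal PyTorch-style sketch of the contrast, assuming generic names (rgb: per-sample colors along one ray, weights: the usual transmittance-based alpha-blending weights, c_ref: the reference pixel color). This illustrates the idea only; it is not the paper's code:

```python
import torch

def nerf_objective(rgb, weights, c_ref):
    # NeRF: alpha-blend the samples into one pixel color first, then
    # compare that expectation to the reference (loss of the expectation)
    c_pred = (weights[:, None] * rgb).sum(dim=0)
    return ((c_pred - c_ref) ** 2).sum()

def per_candidate_objective(rgb, weights, c_ref):
    # Here: every sample is a surface candidate matched to the reference
    # color on its own; weights only average per-point losses (expected loss)
    per_point = ((rgb - c_ref) ** 2).sum(dim=-1)
    return (weights * per_point).sum()

# Example: 64 samples along one ray
rgb = torch.rand(64, 3, requires_grad=True)
weights = torch.softmax(torch.rand(64), dim=0)
c_ref = torch.tensor([0.2, 0.5, 0.7])
print(nerf_objective(rgb, weights, c_ref),
      per_candidate_objective(rgb, weights, c_ref))
```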
August 7, 2025 at 12:21 PM
By changing just a few lines of code, we can adapt existing NeRF frameworks for surface reconstruction.
This patch shows the necessary changes to Instant NGP, which was originally designed for volume reconstruction.
August 7, 2025 at 12:21 PM
Methods like NeRF and Gaussian Splats model the world as radioactive fog, rendered using alpha blending. This produces great results… but are volumes the only way to get there? 🤔 Our new SIGGRAPH'25 paper directly reconstructs surfaces without heuristics or regularizers.
August 7, 2025 at 12:21 PM
Dr.Jit+Mitsuba just added support for fused neural networks, hash grids, and function freezing to eliminate tracing overheads. This significantly accelerates optimization & real-time workloads and enables custom Instant NGP and neural material/radiosity/path guiding projects. What will you do with it?
August 7, 2025 at 11:15 AM
(Above: a 64-wide MLP fit as a 2D neural field to The Great Wave off Kanagawa.) Putting MLPs inside rendering code has been one of the most common feature requests, and I am excited that there is finally a solution. For details on cooperative vectors, see this page: drjit.readthedocs.io/en/latest/co....
June 1, 2025 at 2:04 AM
The latest development version of Dr.Jit now provides built-in support for evaluating and training MLPs (including fusing them into rendering workloads). They compile to efficient Tensor Core operations via NVIDIA's Cooperative Vector extension. Details: drjit.readthedocs.io/en/latest/nn...
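A condensed sketch in the spirit of the linked documentation: building a small half-precision MLP with drjit.nn and evaluating it through cooperative vectors. Treat the exact names and signatures below as approximate recollections of the docs; defer to drjit.readthedocs.io for the authoritative API.

```python
import drjit as dr
import drjit.nn as nn
from drjit.cuda.ad import TensorXf16, Float16, Array2f, Array3f

# Small MLP mapping a 2D coordinate to RGB (cf. the Great Wave neural field demo)
net = nn.Sequential(
    nn.Cast(Float16),       # cooperative vectors operate in half precision
    nn.Linear(-1, 64),      # -1 = infer the input width
    nn.LeakyReLU(),
    nn.Linear(64, 3),
)
net = net.alloc(TensorXf16, 2)                  # instantiate for 2 inputs
weights, net = nn.pack(net, layout='training')  # pack weights for training

pos = Array2f(0.25, 0.75)                       # query coordinate
rgb = Array3f(net(nn.CoopVec(pos.x, pos.y)))    # fused Tensor Core evaluation
```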
June 1, 2025 at 2:04 AM
Variants: the switch to nanobind has made the Mitsuba binaries smaller, and we used that opportunity to address a frequent user request by shipping additional pre-built variants, including polarized monochromatic/spectral variants for the scalar and differentiable CUDA/LLVM backends.
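Selecting one of the shipped variants goes through Mitsuba's usual variant mechanism. The specific variant name below is an assumption following the established naming scheme; mi.variants() lists what a given build actually contains.

```python
import mitsuba as mi

print(mi.variants())                          # enumerate the pre-built variants
mi.set_variant('cuda_ad_spectral_polarized')  # assumed name: differentiable CUDA,
                                              # spectral + polarized rendering
```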
November 26, 2024 at 3:09 PM
Ease of use: we documented every single operation in Dr.Jit and added lots of introductory material as well. Both Mitsuba & Dr.Jit now provide detailed type stubs, which enable static type checking and bring rich code completion to editors like Visual Studio Code.
November 26, 2024 at 3:09 PM
Control flow: a key feature of Dr.Jit has always been that it can record control flow and fuse it into efficient GPU kernels, but the Python syntax to do this used to be awkward. Starting with Dr.Jit 1.0, you can decorate functions with "@dr.syntax" and use regular Python syntax.
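A short example in the style of the Dr.Jit documentation: with @dr.syntax, ordinary Python while/if statements over traced types are recorded and fused into a single kernel. Type names follow Dr.Jit's LLVM backend; consider the snippet a sketch.

```python
import drjit as dr
from drjit.llvm import UInt32

@dr.syntax
def collatz(n: UInt32):
    steps = UInt32(0)
    while n != 1:          # recorded as a traced loop, not a Python loop
        if n & 1 == 0:     # traced branch: both sides become kernel code
            n //= 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

print(collatz(UInt32(9, 27, 97)))  # per-element iteration counts
```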
November 26, 2024 at 3:09 PM
The Python bindings were redesigned from scratch using a new library (pybind11→nanobind), fixing long-standing performance issues. Code generation of various operations was improved as well. We see 10-20× faster tracing in Dr.Jit and 1.2-2× faster wall-clock (inverse) rendering.
November 26, 2024 at 3:09 PM
Following over 1.5 years of hard work (w/ @njroussel.bsky.social & @rtabbara.bsky.social), we just released a brand-new version of Dr.Jit (v1.0), my lab's differentiable rendering compiler, along with an updated Mitsuba (v3.6). The list of changes is insanely long; here is what we're most excited about 🧵
November 26, 2024 at 3:09 PM
The more Germans rest, the more they get done 👌
November 25, 2024 at 4:49 AM
My team RGL is looking for new PhD students starting in 2025. If you are excited about topics like inverse rendering, compilers for graphics, and physically based modeling, then please join us at EPFL. Info about the lab & admission process: rgl.epfl.ch/pages/jobs (Deadline: Dec 15)
November 22, 2024 at 3:14 PM