Leonardo Ferreira Guilhoto (he/him/his)
@leonardofg.bsky.social
Applied Mathematics and Computational Science PhD candidate @ UPenn, advised by Prof. Paris Perdikaris. I research applications of deep learning to science and engineering, with a focus on Uncertainty Quantification, Operator Learning and PINNs!
Now that I have that presentation ready, I might do a quick recording of it to post online later. Be on the lookout for that if the paper piqued your interest 🙂
(6/6)
March 9, 2025 at 5:09 PM
2) Proposing Neon, a neural network architecture for operator learning that has built-in uncertainty quantification by using Epinets. When compared to GPs and deep ensembles, Neon achieved the best performance, at times requiring ~40x fewer trainable parameters than DeepONet ensembles.
(5/6)
March 9, 2025 at 5:09 PM
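For readers who haven't met Epinets (Osband et al.'s "Epistemic Neural Networks") before, here is a minimal numpy sketch of the general idea only, not the actual NEON architecture: layer sizes, shapes, and initialization are illustrative assumptions, and NEON's operator-learning backbone is specified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    """Tiny MLP forward pass; params is a list of (W, b) pairs."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init(sizes, rng):
    """Random (W, b) pairs for the given layer sizes."""
    return [(rng.normal(0, 0.3, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

# Illustrative dimensions only (not NEON's).
d_in, d_feat, d_out, d_index = 4, 16, 1, 8
base = init([d_in, d_feat, d_out], rng)          # base network
epi = init([d_feat + d_index, 16, d_out], rng)   # small epinet

def forward(x, z):
    """Epinet-style prediction: base output plus a small network
    that also sees a random 'epistemic index' z ~ N(0, I).
    Varying z at a fixed input produces an ensemble-like spread
    of predictions from a single trained model."""
    h = np.tanh(x @ base[0][0] + base[0][1])  # shared features
    y_base = h @ base[1][0] + base[1][1]
    y_epi = mlp(epi, np.concatenate([h, z]))
    return y_base + y_epi

# Sampling the index z gives a cheap epistemic-uncertainty estimate.
x = rng.normal(size=d_in)
samples = np.array([forward(x, rng.normal(size=d_index))
                    for _ in range(100)])
mean, spread = samples.mean(), samples.std()
```

The appeal over deep ensembles is that only one base network is trained, with a small index-conditioned head providing the spread, which is consistent with the parameter savings mentioned above.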
1) Formulating the Leaky Expected Improvement (L-EI) acquisition function for Bayesian optimization, which is provably similar to traditional EI, but significantly easier to optimize via gradient-based methods.
(4/6)
March 9, 2025 at 5:09 PM
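For the curious, here is a rough numpy/scipy sketch of what a "leaky" EI could look like, assuming the name follows the leaky-ReLU analogy (the paper's exact definition may differ). Replacing the hard max(., 0) in EI's utility with a leaky version keeps the acquisition surface from flattening to zero far from the incumbent, which is exactly where gradient-based optimizers stall on plain EI:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Standard EI (minimization): E[max(f_best - f(x), 0)]
    under a Gaussian posterior f(x) ~ N(mu, sigma^2)."""
    z = (f_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def leaky_expected_improvement(mu, sigma, f_best, alpha=0.01):
    """Hypothetical leaky variant: use the utility
    max(d, 0) + alpha * min(d, 0) instead of max(d, 0).
    Since that utility equals (1 - alpha) * max(d, 0) + alpha * d,
    its expectation is (1 - alpha) * EI + alpha * (f_best - mu),
    which keeps a nonzero slope even where plain EI (and its
    gradient) vanishes numerically."""
    ei = expected_improvement(mu, sigma, f_best)
    return (1 - alpha) * ei + alpha * (f_best - mu)

# Far from the incumbent, plain EI is numerically ~0 and flat,
# while the leaky version still carries gradient information.
ei_far = expected_improvement(10.0, 1.0, 0.0)         # ~0
lei_far = leaky_expected_improvement(10.0, 1.0, 0.0)  # ~ -0.1
```

As alpha goes to 0 the leaky variant recovers standard EI, which matches the "provably similar" claim under this reading.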
I also gave a talk on my recent paper “Composite Bayesian Optimization in function spaces using NEON—Neural Epistemic Operator Networks”. You can read the full paper here: www.nature.com/articles/s41...
The paper’s main contributions are:
(3/6)
March 9, 2025 at 5:09 PM
We had 8 wonderful talks from researchers both from industry and academia. Topics ranged from Bayesian Optimization, data-driven solutions to PDEs, climate forecasting, ecological modeling, and more! Thanks to all the speakers for accepting the invitation and traveling to present their work!
(2/6)
March 9, 2025 at 5:09 PM
You can watch the recorded livestream, including all the talks, here (mine starts around the 3:39:00 mark):
www.youtube.com/live/dya_05f...
(4/4)
#MachineLearning #UncertaintyQuantification #ScienceCommunication #SciML #AI4Science #TED #PhDStudent
March 2, 2025 at 4:16 PM
Making technical math research accessible to a broad audience wasn’t easy, but it was incredibly rewarding. Huge thanks to everyone who attended and to my fellow speakers for their great talks!
(3/4)
March 2, 2025 at 4:16 PM
In my talk, "How to Know What You Don't Know", I spoke about Uncertainty Quantification—how knowing what we don’t know makes machine learning models more trustworthy and less overconfident, especially in science and engineering.
(2/4)
March 2, 2025 at 4:16 PM
There'll be a panel of judges, but there's also an "Audience Choice" prize, which you'll be able to vote for during the talks (in-person or virtual), so I'd appreciate any help on that end 👀👀
(3/3)
February 19, 2025 at 3:26 PM
The title of my talk is "How To Know What You Don't Know", and I'll be talking about Uncertainty Quantification (UQ) and how important it is in the current era of AI models being deployed in the real world.
(2/3)
February 19, 2025 at 3:26 PM
I recently gave a talk on the latter of these papers (ActNet - Deep Learning Alternatives of the Kolmogorov Superposition Theorem), which you can check out here:
youtu.be/pQKayyvNo5E
I'll be giving a talk on the NEON paper soon at the SIAM CSE conference. Be on the lookout for a video on it soon!
February 14, 2025 at 3:53 PM
To learn more about my recent research, feel free to check out my two most recent papers!
- Neural Epistemic Operator Networks (NEON): www.nature.com/articles/s41...
- ActNet - Deep Learning Alternatives of the Kolmogorov Superposition Theorem (ICLR 2025 Spotlight Paper): arxiv.org/abs/2410.01990
February 14, 2025 at 3:39 PM
- Within SciML, my main focus is on a) Uncertainty Quantification (UQ) - i.e.: models that know what they don't know; and b) building accurate and efficient surrogate models via Operator Learning and Physics Informed Neural Networks (PINNs) - i.e.: models that integrate data & scientific knowledge.
February 14, 2025 at 3:39 PM
- I'm a PhD candidate in Applied Math & Computational Science at the University of Pennsylvania.
- I am originally from São Paulo, Brazil! 🇧🇷
- I work on developing deep learning methods for science and engineering applications, an area often called Scientific Machine Learning (SciML), or AI4Science.
February 14, 2025 at 3:39 PM