Ashesh Chattopadhyay
@ashesh6810.bsky.social
Scientific ML, ML theory, ML for climate, fluids, dynamical systems. Asst. Prof of Applied Math at UCSC. https://sites.google.com/view/ashesh6810/home
📄 Paper:
Lupin-Jimenez et al. (2025)
"Simultaneous Emulation and Downscaling..."
doi.org/10.1029/2025JH000851

💾 Code & data:
zenodo.org/record/14607130

We’d love to hear from collaborators in ocean ML, emulation, and climate AI 🌊🤝
Simultaneous Emulation and Downscaling With Physically Consistent Deep Learning‐Based Regional Ocean Emulators
August 20, 2025 at 4:28 PM
🚀 Why it’s exciting:
✅ Physically grounded
✅ 10x–1000x faster than ROMS
✅ Enables regional “digital twins”
✅ Sets up for coupled ocean–atmosphere emulation
✅ Works across different reanalysis sources

AI meets ocean science.
August 20, 2025 at 4:28 PM
📊 Results:

Beats interpolation

Matches or outperforms ROMS in short-term accuracy

Stays stable & realistic over 10 years

Captures mean state and eddy variability

Preserves spectral energy across scales

No exploding gradients here 💥
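For the curious, the "preserves spectral energy across scales" check can be sketched in a few lines of numpy. This is an illustrative diagnostic, not the paper's code; the function name and the square periodic grid are our assumptions:

```python
import numpy as np

def radial_energy_spectrum(u, v):
    """Radially averaged kinetic energy spectrum of 2D velocity fields
    (u, v) on a square periodic grid. Returns (wavenumbers, energy)."""
    n = u.shape[0]
    uh = np.fft.fft2(u) / n**2
    vh = np.fft.fft2(v) / n**2
    ke = 0.5 * (np.abs(uh)**2 + np.abs(vh)**2)   # spectral KE density

    k = np.fft.fftfreq(n, d=1.0 / n)             # integer wavenumbers
    kx, ky = np.meshgrid(k, k)
    shell = np.rint(np.hypot(kx, ky)).astype(int)

    kmax = n // 2
    energy = np.bincount(shell.ravel(), weights=ke.ravel(),
                         minlength=kmax)[:kmax]
    return np.arange(kmax), energy
```

Comparing this curve for emulated vs. reference fields is how one can verify that fine scales are not being smoothed away.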
August 20, 2025 at 4:28 PM
🧠 What’s different:

We don't just super-resolve existing data.
We downscale from an emulator that predicts ocean dynamics.

Plus: our downscaler learns to correct both model bias and physical mismatch (GLORYS → CNAPS). That’s new.
August 20, 2025 at 4:28 PM
⚙️ Our framework (FCDS):

An FNO emulator predicts daily sea-surface height (SSH), surface currents (SSU, SSV), and surface kinetic energy (SSKE) at 8 km

A UNet + PatchGAN-VAE downscales to 4 km & corrects bias

Spectral loss + online fine-tuning ensures physical consistency

Together: speed, structure, and stability.
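The pipeline's structure (autoregressive coarse emulation, then per-state downscaling) can be sketched as plain Python. The names and signatures here are hypothetical stand-ins: in the paper, the emulator role is played by the FNO and the downscaler role by the UNet + PatchGAN-VAE.

```python
def rollout(emulator, downscaler, state0, n_days):
    """Run the coarse emulator autoregressively, then downscale each state.

    emulator:   maps an 8 km state to the next day's 8 km state (FNO role)
    downscaler: maps an 8 km state to a bias-corrected 4 km state (UNet role)
    """
    coarse = [state0]
    for _ in range(n_days):
        coarse.append(emulator(coarse[-1]))   # 8 km: day t -> day t+1
    return [downscaler(s) for s in coarse]    # 8 km -> 4 km, each day
```

With toy stand-ins (e.g. `emulator = lambda s: s + 1`), the shape of the loop is easy to check before plugging in trained networks.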
August 20, 2025 at 4:28 PM
🌍 Why this matters:

Regional ocean domains like the Gulf of Mexico are hard to model—complex coastlines, eddies, the Loop Current, chaotic boundary forcing.

Physics models = accurate but slow.
ML = fast, but unstable after a few weeks. We wanted the best of both.
August 20, 2025 at 4:28 PM
Led by @baskinengineering.bsky.social PhD students Niloofar & Lenny with Tianning Wu & Roy He @ncstate.bsky.social

If you're working on GenAI for Earth systems, let’s connect — curious to hear your thoughts!
#GenAI #ClimateAI #OceanML #FNO #DDPM #DataAssimilation
July 10, 2025 at 1:39 AM
Our method is:
⚡️ One-shot
🌀 Physics-consistent
🌐 Scalable

It captures high-wavenumber, fine-scale structures other ML baselines miss. Spectral diagnostics & vorticity metrics confirm this. (4/5)
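A minimal version of the vorticity diagnostic mentioned above, assuming a doubly periodic grid and centered finite differences (our sketch, not the paper's implementation):

```python
import numpy as np

def relative_vorticity(u, v, dx=1.0):
    """Relative vorticity zeta = dv/dx - du/dy on a doubly periodic grid,
    via centered finite differences (axis 0 = y, axis 1 = x)."""
    dvdx = (np.roll(v, -1, axis=1) - np.roll(v, 1, axis=1)) / (2.0 * dx)
    dudy = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / (2.0 * dx)
    return dvdx - dudy
```

Comparing vorticity statistics of reconstructed vs. reference fields is a quick way to see whether fine-scale structure is real or smoothed out.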
July 10, 2025 at 1:39 AM
🧠 The framework combines:
• FNO (Fourier Neural Operator)
• DDPM (Denoising Diffusion Probabilistic Model)

✅ Reconstructs high-resolution states from 1%–0.1% data
✅ Works on synthetic turbulence, GLORYS reanalysis & real satellite altimetry
✅ No forward solver required (3/5)
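To get a feel for how little data "1%–0.1%" is, here is a toy sketch of building a sparse observation mask for conditioning. Function name and NaN convention are ours; the actual observing geometry in the paper (Lagrangian, flow-following) is more structured than this random mask:

```python
import numpy as np

def sparse_observations(field, frac=0.01, seed=0):
    """Randomly keep only `frac` of the grid points (e.g. 1%-0.1%),
    mimicking sparse conditioning data; NaN marks unobserved points."""
    rng = np.random.default_rng(seed)
    mask = rng.random(field.shape) < frac
    return np.where(mask, field, np.nan), mask
```

On a 100 x 100 grid, `frac=0.01` leaves roughly 100 observed points to reconstruct 10,000 state values from.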
July 10, 2025 at 1:39 AM
Ocean observations are often sparse, noisy, and Lagrangian (they move with the flow).
This makes reconstructing fine-scale ocean dynamics like eddies and fronts very hard — especially for forecasting.
We tackle this using a diffusion model conditioned on a neural operator. (2/5)
July 10, 2025 at 1:39 AM
A key takeaway is that both the a priori and a posteriori performance of ML-based parameterizations (stability, accuracy, etc.) can be derived from insights embedded in the spectral representation of neural networks. Take a look at some of our older work if interested. academic.oup.com/pnasnexus/ar...
Explaining the physics of transfer learning in data-driven turbulence modeling
April 23, 2025 at 8:21 PM
We find an interesting distribution of Gabor filters and low-pass filters before and after fine-tuning, and predictable spectral dynamics of the hidden layers during both the training and fine-tuning phases.
April 23, 2025 at 8:21 PM
The key idea lies in analyzing the network in spectral space during training, inference, and fine-tuning. Interestingly, more often than not, generalizing to a new system means generalizing to a new shape of the Fourier spectrum, and that is a key a priori indicator of model performance.
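One simple instance of this spectral viewpoint: looking at where a convolutional filter's Fourier transform peaks tells you whether it acts as a low-pass filter or a band-pass (Gabor-like) filter. A toy sketch, with function name, padding size, and classification rule as our assumptions:

```python
import numpy as np

def filter_spectrum_peak(kernel, n=32):
    """Radial wavenumber of the peak of a conv kernel's zero-padded
    Fourier transform: 0 means low-pass; > 0 means band-pass
    (e.g. Gabor-like filters selecting a preferred scale)."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(kernel, s=(n, n))))
    ky, kx = np.indices(spec.shape)
    r = np.hypot(ky - n // 2, kx - n // 2)
    return r.ravel()[np.argmax(spec.ravel())]
```

An averaging kernel peaks at wavenumber 0 (low-pass), while an edge-detecting kernel with zero mean peaks away from the origin.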
April 23, 2025 at 8:21 PM
We have released the code and the framework as part of the preprint. Do check it out if you are interested or work in this space. The framework is adaptable to other areas of geophysics and, more generally, to Earth system modeling beyond just the ocean and atmosphere.
January 10, 2025 at 6:26 AM
The 8 km emulated ocean is then downscaled to a 4 km reanalysis product with a generative model. The coupled emulator + downscaling framework is long-term stable, reproduces an accurate kinetic energy spectrum, and has the right mean and variability over decadal time scales.
January 10, 2025 at 6:26 AM
One of the key ideas in this work is that, instead of downscaling costly reanalysis products or forecasts, we built an ocean emulator at 8 km over the Gulf of Mexico that is long-term stable, does not drift, and remains physically consistent.
January 10, 2025 at 6:26 AM