---
license: apache-2.0
---

# Earth-2 Checkpoints: FourCastNet 3

## Description:

FourCastNet 3 advances global weather modeling by implementing a scalable, geometric
machine learning (ML) approach to probabilistic ensemble forecasting. The approach is
designed to respect spherical geometry and to accurately model the spatially
correlated probabilistic nature of the problem, resulting in stable spectra and
realistic dynamics across multiple scales. FourCastNet 3 delivers forecasting accuracy
that surpasses leading conventional ensemble models and rivals the best diffusion-based
methods, while producing forecasts 8 to 60 times faster than these approaches. In
contrast to other ML approaches, FourCastNet 3 demonstrates excellent probabilistic
calibration and retains realistic spectra, even at extended lead times of up to 60 days.
All of these advances are realized using a purely convolutional neural network
architecture specifically tailored for spherical geometry. Scalable and efficient
large-scale training on 1024 GPUs and more is enabled by a novel training paradigm for
combined model- and data-parallelism, inspired by domain decomposition methods in
classical numerical models. Additionally, FourCastNet 3 enables rapid inference on a
single GPU, producing a 60-day global forecast at 0.25°, 6-hourly resolution in under
4 minutes. Its computational efficiency, medium-range probabilistic skill, spectral
fidelity, and rollout stability at subseasonal timescales make it a strong candidate
for improving meteorological forecasting and early warning systems through large
ensemble predictions.



This model is ready for commercial/non-commercial use.

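A quick back-of-the-envelope check of the inference figure above: a 60-day rollout at a 6-hour step corresponds to 240 autoregressive steps, so an under-4-minute forecast implies roughly one second per step.

```python
# Number of 6-hourly autoregressive steps in a 60-day rollout, and the implied
# per-step time budget if the full rollout completes in 4 minutes.
forecast_days = 60
step_hours = 6

n_steps = forecast_days * 24 // step_hours
per_step_budget_s = 4 * 60 / n_steps

print(n_steps)            # 240
print(per_step_budget_s)  # 1.0
```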
### License/Terms of Use:

[Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0)

### Deployment Geography:

Global

### Use Case:

Industry, academic, and government research teams interested in medium-range and
subseasonal-to-seasonal weather forecasting and climate modeling.

### Release Date:

NGC 07/18/2025

## Reference:

**Papers**:

- [FourCastNet 3: A geometric approach to probabilistic machine-learning weather forecasting at scale](https://arxiv.org/abs/2507.12144v2)
- [Neural Operators with Localized Integral and Differential Kernels](https://arxiv.org/abs/2402.16845)
- [Huge Ensembles Part I: Design of Ensemble Weather Forecasts using Spherical Fourier Neural Operators](https://arxiv.org/abs/2408.03100)
- [Huge Ensembles Part II: Properties of a Huge Ensemble of Hindcasts Generated with Spherical Fourier Neural Operators](https://arxiv.org/abs/2408.01581)
- [Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere](https://arxiv.org/abs/2306.03838)

**Code**:

- [Makani](https://github.com/NVIDIA/makani)
- [PhysicsNeMo](https://github.com/NVIDIA/physicsnemo)
- [Earth2Studio](https://github.com/NVIDIA/earth2studio)
- [torch-harmonics](https://github.com/NVIDIA/torch-harmonics)

## Model Architecture:

**Architecture Type:** Spherical Neural Operator. A fully convolutional architecture
based on group convolutions defined on the sphere, leveraging both local and global
convolutions. For details regarding the architecture, refer to the
[FourCastNet 3 paper](https://arxiv.org/abs/2507.12144v2). <br>

**Network Architecture:** N/A <br>

**Number of model parameters:** 710,867,670

**Model datatype:** We recommend running the model with automatic mixed precision
(AMP) in bf16; the inputs and outputs are typically float32.

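A minimal sketch of the recommended precision setup. The `Conv2d` below is a stand-in for illustration only (the real FourCastNet 3 checkpoint is loaded through Makani or Earth2Studio), and the grid is coarsened so the snippet runs quickly:

```python
import torch

# Stand-in module for illustration only; the actual checkpoint is loaded via
# Makani or Earth2Studio. The grid is coarsened from 721 x 1440 to 91 x 180 so
# the snippet runs quickly; the channel count (72) matches the card.
model = torch.nn.Conv2d(72, 72, kernel_size=3, padding=1)

x = torch.randn(1, 72, 91, 180)  # float32 input, as recommended

# Forward pass under AMP with bfloat16; cast back so the outputs stay float32.
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)
y = y.float()

print(y.dtype, tuple(y.shape))  # torch.float32 (1, 72, 91, 180)
```

On a GPU, set `device_type="cuda"`; AMP then runs eligible layers in bfloat16 while preserving the float32 input/output contract described above.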
## Input:

**Input Type:**

- Tensor (72 surface and pressure-level variables)

**Input Format:** PyTorch Tensor <br>
**Input Parameters:**

- Six Dimensional (6D) (batch, time, lead time, variable, latitude, longitude) <br>

**Other Properties Related to Input:**

- Input equirectangular latitude/longitude grid: 0.25 degree, 721 x 1440
- Input state weather variables: `u10m`, `v10m`, `u100m`, `v100m`, `t2m`, `msl`,
  `tcwv`, `u50`, `u100`, `u150`, `u200`, `u250`, `u300`, `u400`, `u500`, `u600`, `u700`,
  `u850`, `u925`, `u1000`, `v50`, `v100`, `v150`, `v200`, `v250`, `v300`, `v400`, `v500`,
  `v600`, `v700`, `v850`, `v925`, `v1000`, `z50`, `z100`, `z150`, `z200`, `z250`, `z300`,
  `z400`, `z500`, `z600`, `z700`, `z850`, `z925`, `z1000`, `t50`, `t100`, `t150`, `t200`,
  `t250`, `t300`, `t400`, `t500`, `t600`, `t700`, `t850`, `t925`, `t1000`, `q50`, `q100`,
  `q150`, `q200`, `q250`, `q300`, `q400`, `q500`, `q600`, `q700`, `q850`, `q925`, `q1000`
- Time: datetime64

For variable name information, review the Lexicon at [Earth2Studio](https://github.com/NVIDIA/earth2studio).

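The 72 channels decompose as 7 surface variables plus 5 pressure-level fields (`u`, `v`, `z`, `t`, `q`) on 13 levels. A sketch that rebuilds the channel list in the order given above and allocates a matching dummy input; confirm the exact ordering for a given checkpoint against the Earth2Studio Lexicon:

```python
import torch

# Surface variables (7) and pressure-level fields on 13 levels (5 x 13 = 65),
# in the order listed on this card: 7 + 65 = 72 channels.
surface = ["u10m", "v10m", "u100m", "v100m", "t2m", "msl", "tcwv"]
levels = [50, 100, 150, 200, 250, 300, 400, 500, 600, 700, 850, 925, 1000]
fields = ["u", "v", "z", "t", "q"]
variables = surface + [f"{f}{lev}" for f in fields for lev in levels]

# Dummy 6D input on the 0.25-degree grid:
# (batch, time, lead time, variable, latitude, longitude)
x = torch.zeros(1, 1, 1, len(variables), 721, 1440, dtype=torch.float32)

print(len(variables))  # 72
print(tuple(x.shape))  # (1, 1, 1, 72, 721, 1440)
```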
## Output:

**Output Type:** Tensor (72 surface and pressure-level variables) <br>
**Output Format:** PyTorch Tensor <br>
**Output Parameters:** Six Dimensional (6D) (batch, time, lead time, variable,
latitude, longitude) <br>
**Other Properties Related to Output:**

- Output latitude/longitude grid: 0.25 degree, 721 x 1440, same as input.
- Output state weather variables: same as the input variables.

Our AI models are designed and/or optimized to run on NVIDIA GPU-accelerated systems.
By leveraging NVIDIA’s hardware (e.g., GPU cores) and software frameworks (e.g., CUDA
libraries), the model achieves faster training and inference times compared to
CPU-only solutions.

## Software Integration:

**Runtime Engine:** PyTorch <br>
**Supported Hardware Microarchitecture Compatibility:** <br>

- NVIDIA Ampere <br>
- NVIDIA Hopper <br>
- NVIDIA Turing <br>

**Supported Operating System:**

- Linux <br>

## Model Version:

**Model Version:** v1 <br>

## Training, Testing, and Evaluation Datasets:

**Total size (in number of data points):** 110,960 <br>
**Total number of datasets:** 1 <br>
**Dataset partition:** training 95%, testing 2.5%, validation 2.5% <br>

## Training Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 1980-2015. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Testing Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 2016-2017. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Evaluation Dataset:

**Link:** [ERA5](https://cds.climate.copernicus.eu/) <br>

**Data Collection Method by dataset** <br>

- Automatic/Sensors <br>

**Labeling Method by dataset** <br>

- Automatic/Sensors <br>

**Properties:**
ERA5 data for the period 2018-2019. ERA5 provides hourly estimates of various
atmospheric, land, and oceanic climate variables. The data covers the Earth on a 30 km
grid and resolves the atmosphere at 137 levels. <br>

## Inference:

**Acceleration Engine:** Makani, PyTorch <br>
**Test Hardware:**

- A100 <br>
- H100 <br>
- L40S <br>

## Ethical Considerations:

NVIDIA believes Trustworthy AI is a shared responsibility and we have established
policies and practices to enable development for a wide array of AI applications.
When downloaded or used in accordance with our terms of service, developers should
work with their internal model team to ensure this model meets requirements for the
relevant industry and use case and addresses unforeseen product misuse.

For more detailed information on ethical considerations for this model, please see the
Model Card++ Explainability, Bias, Safety & Security, and Privacy Subcards.

Please report model quality, risk, security vulnerabilities or NVIDIA AI Concerns [here](https://www.nvidia.com/en-us/support/submit-security-vulnerability/).